Finally, after over a decade of personal research and >3 years of full-time work, I am so proud to show my ultimate passion project: virtualized geometry tech we call Nanite coming in
#UE5
and a demo using it running real-time on PS5.
As some have noticed Nanite software rasterizes most of the triangles we draw! We have 2 different specialized software rasterizers running in async compute. Fast enough to fill the screen with micropolys. More cool info on
#UE5
, Nanite, and Lumen:
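For a rough idea of what software-rasterizing micropolys involves, here is an illustrative sketch only (plain Python with made-up names, not UE5 code, and nothing like the actual compute-shader implementation): an edge-function rasterizer that brute-forces the tiny bounding box of each triangle, which is cheap precisely because micropoly boxes cover only a few pixels.

```python
def edge(ax, ay, bx, by, px, py):
    # Signed area test: positive when point (px, py) is on the inside of edge a->b.
    return (px - ax) * (by - ay) - (py - ay) * (bx - ax)

def raster_tri(v0, v1, v2, put_pixel):
    # Vertices are (x, y, z) in screen space. For micropolys the bounding
    # box is only a handful of pixels, so testing every pixel in it is
    # cheaper than full hardware-style triangle setup.
    x0 = int(min(v0[0], v1[0], v2[0]))
    x1 = int(max(v0[0], v1[0], v2[0])) + 1
    y0 = int(min(v0[1], v1[1], v2[1]))
    y1 = int(max(v0[1], v1[1], v2[1])) + 1
    area = edge(v0[0], v0[1], v1[0], v1[1], v2[0], v2[1])
    if area <= 0:
        return  # backfacing or degenerate triangle
    for y in range(y0, y1):
        for x in range(x0, x1):
            px, py = x + 0.5, y + 0.5  # sample at pixel centers
            w0 = edge(v1[0], v1[1], v2[0], v2[1], px, py)
            w1 = edge(v2[0], v2[1], v0[0], v0[1], px, py)
            w2 = edge(v0[0], v0[1], v1[0], v1[1], px, py)
            if w0 >= 0 and w1 >= 0 and w2 >= 0:
                # Barycentric interpolation of depth across the triangle.
                z = (w0 * v0[2] + w1 * v1[2] + w2 * v2[2]) / area
                put_pixel(x, y, z)
```

A real GPU implementation would additionally do atomic depth writes into a visibility buffer, but the per-triangle inner loop is this simple in spirit.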
@siggraph
Advances in Real-Time Rendering in Games course lineup here: - an awesome program this year, with themes of efficient upscaling, handling dense geometry, RTGI, and lighting large-scale worlds, to name a few.
I learned this week that I have been promoted to Engineering Fellow here at Epic, a level that did not previously exist. Humbled. I never thought I'd be considered for such a thing in my career.
The response has been amazing! Thank you! There is one type of response that I’ve heard that I’d like to correct though. I did not prove anything was the right or wrong approach. These were merely my conclusions along the way as to the path I most believed would achieve my goals.
At the
#HPG2022
in-person event,
@BrianKaris
presented his Journey to Nanite. He talked about the process of inventing the groundbreaking geometry engine in UE5, highlighting the perseverance, confidence, and fear that comes with research.
Watch it here:
This is amazing! I wish I could have worked on this demo. Just so cool. Hats off to everyone who did.
1000s of dynamic shadow casting lights in real-time on a PS5!
We are Mega excited about this in particular 🔦
MegaLights, a new Experimental feature in 5.5, enables artists to use orders of magnitude more lights. They are movable and dynamic, with realistic area shadows, and can light volumetric fog. See this capture of our live, real-time
When I say over a decade I'm not exaggerating. Some blog posts of mine from the beginning of 2009. From then till now I've been thinking about this. So excited this day has finally come
I'll be doing a live stream soon (I think next week) to cover some more
#Nanite
info and answer questions.
For a full technical deep dive into how Nanite works I'll be presenting at
#SIGGRAPH2021
in August.
Sigh. Apparently I need to weigh in on this TAA topic. MSAA is an antiquated technique designed to generate more samples on polygon edges as that was the only area that was undersampled at the time. 1/12
We're kicking off Unreal Fest Online 2020 with a look at next-gen game development and the creation of our Unreal Engine 5 demo.
#UnrealFest
#UE5
Tune in now:
Plenoxels: Radiance Fields without Neural Networks
abs:
project page:
propose a view-dependent sparse voxel model, Plenoxel, that can optimize to the same fidelity as NeRFs without any neural networks
IMO the most important thing to work on for gamedev technology and services is how to reduce the cost of making games without needing to sacrifice production value. I can’t speak for Epic’s entire strategy but that is my personal drive.
The Matrix Awakens - a defining moment in the latest gaming generation.
@Dachsjaeger
,
@dark1x
and I discuss this astonishing demo in the wake of discussions with Epic Games:
But over the last year I've been joined by an amazing team of engineers who have all been super influential to Nanite and this demo
@Stubbesaurus
,
@gwihlidal
, Andrew Lauritzen,
@ols_olsson
,
@ZabirH
,
@Snosixtytwo
. Been an amazing experience working with them. So proud of this team
SW rasterization is an important component of Nanite but one of many that go into pulling this all off. We'll be talking more about how it works in time, and of course the source code will be available for you to check out yourself in early 2021.
#UE5
@HPG_Conf
The slides are here which include the complete script of what I said. Also included are a few bonus slides and footnotes that needed to be cut for time.
@iBrews
@mmalex
#Nanite
will use pixel-sized triangles when there's pixel-sized detail. It decides how many tris to use based on the amount of pixel error. Default is 1, so it allows error of <1 pixel, which is effectively imperceptible. Big tris can be <1 pixel different from the original if it's flat.
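That kind of error-driven selection can be sketched as follows (hypothetical names and a generic perspective-projection formula, not Nanite's actual code): project each LOD's object-space geometric error to screen pixels and take the coarsest LOD whose projected error stays under the threshold.

```python
import math

def projected_error_pixels(geometric_error, distance, fov_y_rad, screen_height):
    # Convert an object-space error (world units) into screen pixels using
    # the standard perspective scale at a given view distance.
    pixels_per_unit = screen_height / (2.0 * distance * math.tan(fov_y_rad / 2.0))
    return geometric_error * pixels_per_unit

def pick_lod(lod_errors, distance, fov_y_rad, screen_height, threshold_px=1.0):
    # lod_errors: geometric error per LOD, coarsest first (largest error first).
    # Return the coarsest (cheapest) LOD whose projected error is sub-threshold.
    for lod, err in enumerate(lod_errors):
        if projected_error_pixels(err, distance, fov_y_rad, screen_height) < threshold_px:
            return lod
    return len(lod_errors) - 1  # fall back to the most detailed LOD
```

With a 1-pixel threshold a far-away object resolves to a coarse LOD and a close-up one to the finest, which matches the "error of <1 pixel" behavior described above.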
This is the night before SIGGRAPH starts. If you arrive in town before 7pm you might as well come because registration is cheap and it should be fun. I will be speaking about the circuitous path of research that led to Nanite, what didn't work, and what I learned along the way.
🚨
#HPG2022
is excited to announce our in-person reception co-located with
#SIGGRAPH2022
in Vancouver. We will have a keynote by none other than
@BrianKaris
of Epic Games, who will talk about his journey to Nanite. More details and registration info here:
Check out this new case study to see how
@EpicGames
employs
@AMD
Ryzen Threadripper processors to enhance productivity, enable greater creative freedom, and dramatically improve build times in Unreal Engine.
Read the full case study here:
Yes, Nanite draws the gbuffer in 4.5ms on average! Many assumed this amount of detail could only be had at 30fps. Not true! This is well within typical 60hz budgets. That doesn’t even count optimizations I’ve made since.
Virtual Shadow Map documentation:
Detailed geometry needs detailed shadows. Virtual shadow maps are an awesome new feature that was developed in conjunction with Nanite.
Less than a day now! Join me tomorrow in "Advances in Real-Time Rendering in Games, Part 1" (hopefully this link works) and then in the following Q&A section.
Also in the
@digitalfoundry
article above, we render virtualized shadow maps with Nanite to get those super detailed shadows. Without them it is sometimes hard to tell the difference between real high-poly geo and normal maps. Detailed shadows are important!
Awesome article on how to add custom render passes to
@UnrealEngine
. Looks like there are other great
#ue4
#ue5
articles from the past on the site too.
Kim Libreri and I are building a team here in the Bay Area focused on pushing the state of the art in real-time simulation and computer graphics. We are looking for others to join us and help build the future of Unreal. If interested contact me or
First day starting at the new Epic San Francisco office! I moved out here months ago and have been working out of my home waiting for the office to open and today is finally the day.
Really enjoyed this panel! Surprised that no one made the tiling texture argument in favour of UVs and 2d textures. I'll make a bold dataless claim here. The vast majority of pixels rendered in modern games are sampling an artist created texture with a coord outside of [0,1].
(5/5) At 12:00 pm (PST), we will have a panel discussion on high-performance geometry, featuring industry leaders Henry Moreton (NVIDIA), Jonathan Dupuy (Unity), Ryan Schmidt (Epic), David Farrell (Adobe), and Alex Evans (NVIDIA)
So many of the industry ills come from games having ever increasing costs to make. Massive studio consolidation, many beloved devs closing, sequels and licensed IP dominating everything, lack of innovation in design due to risk aversion…
Check out the slides even if you saw my talk today. They include a long list of references to prior work, complete speaker notes of every word I said, bonus notes on many slides, and bonus slides that had to be cut for time.
Was up till 4am last night finally finishing
#TLOU2
. Absolute masterpiece! I can’t stop thinking about it all day. Congrats to the whole team
@cgyrling
@mrobin604
.
Where the hell did the idea come from that our
#ue4
ray tracing demo requires $150k worth of hardware to run? Seeing this all over the web. It runs great on 4 Titan Vs, which are $3k apiece. The fancy DGX box Nvidia built that we ran on costs $50k but is probably overkill.
@_mamoniem
While it’s funny to think of it as coworkers fighting with each other, it’s actually following more of a respectful pattern of academic publishing: citing prior work and comparing against state of the art and ground truth.
@mirror2mask
@siggraph
I had many people ask me if I saw this talk. Yes, I did. What did I think? Well it's a Nanite clone with a few minor differences. If this was peer reviewed I expect it would be rejected due to lack of novelty but I have that gripe about many talks in this course over the years.
@BartWronsk
I personally like Ed Catmull’s version. Ideas are cheap. People matter. “If you give a good idea to a mediocre team, they will screw it up. If you give a mediocre idea to a brilliant team, they will either fix it or throw it away and come up with something better.”
Reminder that I’ll be doing the keynote at the HPG dinner on Sunday. Another monster of a slide deck clocking in at 115 slides so far. No repeated material.
While I’d love for my work to be built upon, I’d honestly be happier yet if more unexpected approaches are developed. I’d love to see more point based, voxel, implicit, regular and irregular mesh research done. The world is boring if we all do the same thing.
DLSS and new innovation in temporal sample reuse is the future and will continue to be our reality for a very long time, until we have enough power to generate numerous path traced samples per pixel in a single frame. 9/12
PS. Even if you accumulated shading samples in another space, micropoly geometry now means nearly every pixel is an "edge" pixel. Traditional hardware MSAA will still not be the solution.
Off to SIGGRAPH! If anyone wants to chat about graphics or especially if you are interested in working at Epic, please hit me up. This year I’m mostly going to talk to people.
Just watched
#KlausNetflix
for the first time. Incredible film with amazing animation. Should have won all the animation awards last year. Delightful story with a good message, beautiful artistry, and genre pushing technology. Kept me guessing how it was done the whole way.
So we are left with temporal accumulation of samples.
Will this go away anytime soon? I argue no. It is going in the opposite direction.
The cost to generate a pixel is increasing faster than hardware power and slower than screen resolutions. 6/12
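Mechanically, temporal accumulation of samples amounts to an exponential moving average per pixel. A minimal sketch, assuming a single scalar pixel value and omitting everything a real TAA resolve needs (reprojection, history clamping, disocclusion handling):

```python
def temporal_accumulate(history, sample, alpha=0.1):
    # Exponential moving average: each frame blends a fraction `alpha` of the
    # new (noisy) sample into the accumulated history, so the history
    # effectively averages on the order of 1/alpha recent samples.
    return history + alpha * (sample - history)

# A constant signal converges toward its true value over frames:
value = 0.0
for _ in range(100):
    value = temporal_accumulate(value, 1.0, alpha=0.1)
# value is now very close to 1.0
```

The smaller the alpha, the more samples are accumulated and the smoother the result, at the cost of slower response to change, which is exactly the stability-vs-ghosting trade-off TAA tuning is about.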
Definitely check out the last third of the video where
@Feaneroh
covers the art production of the demo. The last bit where she flies 100mph through a city? Yeah, that was all just as detailed as everything else; you just couldn't tell.
@tom_forsyth
@iquilezles
@pervognsen
@CasualEffects
Graph/visual programming languages are IMO marketing gimmicks. The only benefit they provide is appearing to be not scary. Human language has countless times converged on text, linear lists of symbols, as the most efficient form of written communication, doubly so with a keyboard.
Now if folks want to debate what space is best to accumulate samples temporally, screen space vs others, that's more interesting and understandably contentious. 12/12
Its relevant lifetime lasted from colored triangles through textured triangles. It was artificially extended through per-pixel shading and early-days PBR with NDF prefiltering. It ran out with artist-created material shaders. 2/12
… general lack of fidelity and immersion in indie, giant teams that lose the sense of shared vision, long term crunch, shift away from stand alone products and single player to multiplayer and GaaS where the people are the content because actual content is too expensive.
Good resource but also a nice reminder to give others some slack when speaking in person. There’s likely at least one term on this list you’ve been mispronouncing.
The question "Is TAA optional?" is really the question "Is image quality optional?". I've argued for years it shouldn't be optional in Fortnite but I don't ultimately make that decision. TAA and actually AA altogether can be disabled along with shadows. 10/12
@knarkowicz
MegaLights is not quite to the point where light budgets don't matter anymore, but it's a massive step in that direction. Enough that it feels like "virtualizing" lights is within sight.
but it just didn't seem within sight for this gen. Besides, geometry will suck up years of my time. Leave that problem for UE6. I was right about my time but not about when it would be viable. What
@knarkowicz
and team achieved here is remarkable.
Also memory isn’t insane. This is super WIP and was immature for the demo’s release timeframe, but it’s a top focus for us right now. It’s already not as bad as you think and it will get significantly better over the next year.
@mmalex
@D3rzo
@UnrealEngine
Wow, thanks. Hugely inspired by your work on Dreams! I think you’ll notice a few parallels if you read between the lines in the upcoming tech talk.
@GPUOpen
Very happy to see this published! DGF is very similar to Nanite's encoding minus attributes. Switching Nanite to this format would be some work but is definitely possible. Doing so wouldn't be much of a win for rasterization but the really exciting part here is ray tracing it.
My point being that UV control of the instancing of 2d textures is extremely powerful and I'd go further and say it is the most well used "procedural" art tool ever invented for 3d graphics. Too often is the problem of texturing conflated with painting a model.
In the modern age of Monte Carlo integration and ray tracing (that we have at least partly been in for a generation and firmly are in now) we need a vast number of samples per pixel. 1 will not do.
This means supersampling is required. 4/12
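The "vast number of samples" point follows from Monte Carlo error shrinking only as 1/sqrt(N). A toy illustration with a hypothetical integrand (uniform noise with known mean), nothing renderer-specific:

```python
import random

def mc_estimate(n, rng):
    # Average of n uniform samples on [0, 1); the true mean is 0.5. The
    # estimator's error falls off as ~1/sqrt(n), which is why a single
    # sample per pixel is extremely noisy.
    return sum(rng.random() for _ in range(n)) / n

rng = random.Random(0)
trials = 200
err_small = sum(abs(mc_estimate(4, rng) - 0.5) for _ in range(trials)) / trials
err_large = sum(abs(mc_estimate(256, rng) - 0.5) for _ in range(trials)) / trials
# 64x the samples buys only about sqrt(64) = 8x less error
```

That square-root law is the reason sample reuse (temporal or otherwise) is so valuable: generating all the needed samples fresh each frame is far out of budget.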
My conclusions are opinion, not fact. Perhaps my arguments were convincing. If not, prove me wrong! Or perhaps your goals differ from mine which changes the conclusion.
When disk size isn't a concern, say film or enterprise use cases, you can do this and more. Honestly I expect data delivery to be one of the biggest constraints in game graphics for next gen. Virtualization tech like Nanite, VT and fast SSDs make the run-time side a nonissue.
For real-time it isn't practical to generate those all in a single frame. In some ray tracing contexts sample sharing can be done spatially. That isn't possible for a large class of problems, as evidenced by the fate of MLAA and its descendants. 5/12
@ID_AA_Carmack
UE4 does this at pixel granularity. A full-screen dithered stencil is laid down matching the fade amount. LODs stencil test against 0 or 1. This way we skip any shader permutations with discard. All objects fade in/out on global cycles, which also means no per-object fade timer state.
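The dither pattern behind that kind of stencil can be sketched with a classic ordered-dither (Bayer) matrix. This is a hedged illustration with made-up names, not the UE4 source: each pixel compares a screen-aligned threshold against the fade amount and shows exactly one of the two LODs, so no per-pixel blending is needed.

```python
BAYER_4x4 = [  # classic 4x4 ordered-dither pattern; thresholds map to [0, 1)
    [ 0,  8,  2, 10],
    [12,  4, 14,  6],
    [ 3, 11,  1,  9],
    [15,  7, 13,  5],
]

def stencil_value(x, y, fade):
    # Returns 1 where the incoming LOD should be drawn and 0 where the
    # outgoing LOD remains, based on the screen-space dither threshold.
    threshold = (BAYER_4x4[y % 4][x % 4] + 0.5) / 16.0
    return 1 if fade > threshold else 0
```

At fade = 0.5 exactly half the pixels in each 4x4 tile select the new LOD, so the transition reads as a smooth cross-fade once resolved by antialiasing.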
What can we do about it? Obviously make the tools easier and faster. But perhaps bigger than that, stop remaking things from scratch! Reuse or share! That implies a shared style to art assets which means photoreal (just like film).
@castano
Thanks! Not vector displacement or tessellation. That was my plan for this problem for many years but it's not general enough. Stay tuned for more info.
It also implies the fidelity of those assets is “future proof”, or at least can span generations before they need to be replaced or updated. That means autoscaling and tech like Nanite. It means PBR and converging on a shared reality of how light behaves.
@phyronnaz
This is mostly just the voxel prototype I made years ago while researching approaches for virtualized geometry. I talked about it in my HPG talk, with screenshots. It's been sitting on a shelf ever since and I got sick of fixing it locally.
@TimSweeneyEpic
@JCGT_announce
is a great example of an open journal. Everyone involved is a volunteer and their operating costs are minuscule. I would love to see paywalled journals die immediately, but the argument that they need a ton of money to operate is false.
No one would say to a painter, go back and delete those early paint strokes you completely covered later on. That's unnecessary and the request sounds ridiculous. Why shouldn't it be the same for set dressing?
Temporal upsampling has gained great popularity as a means to generate less than 1 sample per pixel as the cost of those samples increases. More sophisticated denoisers (distant descendants of TAA) accumulate even more samples than old TAA ever did. 8/12
One does not require the other but the true power is when they are used together. The detailed shadows show Nanite meshes in their full glory and the way Nanite renders makes VSMs fast.
Thank you everyone that joined!
Quick note: last year's demo content I showed in the editor was not PIE, so it does not have the same lighting, which is activated by Blueprints to hand-tune the sun light for different areas. So the lighting surely didn't look as nice as the original.
@renderwonk
@olson_dan
Absolutely. I "lost" a ton of time on a voxel, point based, and displacement prototypes during R&D. But the solution I landed on then was highly informed by that work. Wouldn't have gotten here without many failures first.
Back during early UE5 ideation I sketched out the big rendering problems I wanted to work on. They were all about removing budgets and "virtualizing" an aspect of art. Textures were done, geometry I thought was next and achievable this gen. After that was lights...