In response to the many requests I have received for introductory performance-programming materials, I will be posting a serialized course on Substack starting February 1st:
Me: "Delete this file."
Windows: "Someone is using it."
Me: "Who?"
Windows: "I can't say."
Me: "I checked using a utility. It says your file explorer is the one using it!"
Windows: "Well, I had to show a preview."
Me: "Why?"
Windows: "Because you selected the file to delete it!"
Today I ran Turbo C++ 3.0 from 1992 in a DOSBox. It runs instantly, compiles instantly, steps instantly, syntax highlights instantly - even though DOSBox is emulating actual 386 speed.
It's crazy how much modern software underdelivers :(
@TEDchris
As with your previous post, it is difficult to comprehend why you believe these comments will improve the situation.
First, I agree that it is unfortunate you dragged Adam Grant into this. I use the active voice here, because unlike the passive "got dragged into this", the
@TEDchris
@coldxman
The lack of self-awareness in this response is disturbing.
By uniquely subjecting Coleman to the requirement of a debate, you implicitly did directly to him what your employees claimed he did only indirectly to them: you are sending a strong signal that his identity is not as
Apparently, Framework had to pick a specific screen resolution for their display to work well with popular Linux desktop environments. It should be considered a serious software engineering failure to ship a resolution-dependent desktop environment in the year 2024.
If a "cyber security" company ships a worldwide update that unrecoverably bluescreens all their customers' machines, that company should have zero customers the following day.
A typical Rails app will serve up to 1.5 requests per second per vCPU. With tuning plus an optimized app, that can be up to about 5 requests/sec/vCPU.
So these days, vertical scaling can buy you 75-250 requests/sec for $350 a month… not bad. ~5-10x better than Heroku.
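The capacity math above can be sketched in a few lines (the 50-vCPU box size is an assumption chosen so the per-vCPU rates from the post reproduce the 75-250 range; the actual machine behind the $350/month figure isn't specified):

```python
# Rough vertical-scaling capacity math for a Rails box.
# Assumption: a 50-vCPU machine at ~$350/month (hypothetical sizing).
vcpus = 50
baseline_rps_per_vcpu = 1.5   # typical, untuned app
tuned_rps_per_vcpu = 5.0      # tuned runtime + optimized app

baseline_total = vcpus * baseline_rps_per_vcpu  # 75 req/sec
tuned_total = vcpus * tuned_rps_per_vcpu        # 250 req/sec
print(baseline_total, tuned_total)
```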
When I tell the Windows Terminal team something is simple, I am "misguided", being "somewhat combative" and am "impugning the reader". But a year later when they call the exact same thing "trivial", that is just, you know, them writing a blog post:
So Apple fanboys see:
1) OS not bloated with slow, half-baked AI garbage
2) You can play Fortnite
3) You can run Chrome instead of Safari (or as I like to call it, "Internet Explorer 12")
and somehow they get from that to "see, the EU version is worse"?
A) Yes
B) Others and I have been sounding this alarm for over a decade
C) Many factors are at play, including an improperly trained workforce, monopoly business effects, and widespread adoption and standardization of low-quality practices, languages, platforms and protocols
I like how credit cards are ostensibly 16 numbers, but we now use both expiration dates and CVV codes for authentication, so they're actually 23-digit numbers you have to enter in three different fields for no reason whatsoever.
When I posted , I was obviously expecting the standard reddit/HackerNews nonsense threads. What I was not expecting is the overwhelming number of people who have replied saying not only that they agree, but they find "clean" code less maintainable as well!
[1/2] Here is a demo of a simple, completely unoptimized terminal renderer I wrote over a few days. Supports scrollback, line wrapping, Unicode combining, RTL-over-LTR, multicolor fonts, changing fonts on the fly, etc.
It runs at several thousand FPS.
Things that still happen frequently:
1) People incredulously asking me why I don't use programming languages with garbage collection,
2) Teams who use garbage-collected languages presenting slides on the absurd amount of work they had to do to fix their stuttery framerates.
@embersunn
This is just another way that companies indirectly demonstrate that they are unable to assess programming ability. If they were actually good at talent scouting, they wouldn't need to categorically deny opportunity to a whole class of people.
Both are bad. The correct approach is to keep in mind what will need to happen for high performance, and ensure throughout the process that your design will always permit full optimization.
So, don't think performance "first" or "last". Think "performance-aware throughout".
Two ways to write software:
1. Performance first. Consider performance from the start.
2. Performance last. Make it work, make it right, then make it fast (if necessary).
Poll: What's your typical approach?
You can do a complete from-scratch rebuild (!!) of the RAD Debugger, load an executable, and start stepping through it in less time than it takes to merely launch a vanilla install of Visual Studio.
The software performance excuse parade never ceases to amaze me. I guess it's just a given that if you think modern software isn't extraordinarily slow, you also have no concept of what is and isn't time-consuming for a CPU to do.
I'm tired of seeing lazy "software performance doesn't matter" excuses on forums and social media, so I devoted an entire video to the mountains of evidence debunking that ridiculous idea:
What if we just started calling dropped frames and laggy input "stadia" as a noun? Example: "I'm seeing a lot of stadia in the latest build" or "if you're experiencing a lot of stadia, we recommend upgrading your graphics driver to the latest version", etc.
Looking forward to the HackerNews/Reddit/etc. comments helpfully "explaining" how this real-time 3D dataset visualizer is just not comparable to the work a chat client has to do, and that's why it can load and run right away while Microsoft Teams takes nine seconds...
@ID_AA_Carmack
This happened to me "IRL". Some programmers at a lunch wondered, "If everyone only had one child, how long would it take for humanity to be down to just one person?" I instinctively said "about 33 generations, depending on how you count it". They had no idea how I did that :(
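The trick behind the instant answer: if every couple has exactly one child, each generation halves the population, so the generation count is just log base 2 of the starting population. A quick check (assuming ~8 billion people as the starting point):

```python
import math

# One child per couple halves the population each generation,
# so reaching 1 person takes log2(population) generations.
population = 8_000_000_000
generations = math.log2(population)
print(round(generations, 1))  # roughly 33
```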
Many people believe GPU performance has steadily increased over the past 20 years. However, if we graph the number in the name of the highest-end nVidia card from each year, we clearly see that GPU performance peaked in 2008! Today's top cards contain a mere 30% name-power:
Since the OP said it was time to have a "conversation" about reaction videos in the programming space, here you go:
I have said it before (because
@FreyaHolmer
said it first), and I'll say it again: if you are creating serious in-depth programming content, you are not making
@ThePrimeagen
I don't see any time period associated with the original tweet that you QT'ed with "100%". That tweet was saying that you can't succeed on YT/Twitch if you're not relatable (with you as an example).
First, that's false. There are lots of folks out there
Proposal for a new metric: The Moon Unit, or "moo", which is equal to 2.71321035034 seconds, the maximum ping time to the moon. It's a measure for web technologies, like Google Drive, which regularly takes around 4.5moo to list the first 50 files in a directory.
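The constant checks out as the round-trip light time at the Moon's maximum distance (a sketch; the ~406,700 km apogee figure is an approximation from common references, and exact values vary by source):

```python
# Round-trip light time to the Moon at apogee, i.e. one "moo".
c_km_s = 299_792.458    # speed of light in km/s
apogee_km = 406_700     # approximate maximum Earth-Moon distance
moo = 2 * apogee_km / c_km_s
print(round(moo, 3))    # ~2.713 seconds
```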
Programming practices that increase total code volume lead to more bugs and less performance. It's not a tradeoff, it's a lose-lose. When you use significantly more code than is necessary to implement a feature, you provide an order of magnitude more code path combinations for
VS2022 is so buggy with basic functionality it's unbelievable:
1) can't create solution folders
2) fails to refresh syntax highlighting when removing comments (leaves code green)
3) text search is unreliable
4) keeps re-enabling the code analysis
WTF Microsoft?!??!?
Running a browser to connect to the cloud to run a browser to connect to the cloud to retrieve the contents of a single 2D page to recompress and send back to the original browser is now "the future of computing".
Usually when people talk about grand things like changing "the future of computing," they're full of it. But not this time. Suhail has been working on this for 2 years. There's a good chance it's the new default infrastructure.
Interviewer:
Invert this binary tree
Junior dev:
Here's the algorithm I memorized
Experienced dev:
Let me find a library for that
Senior dev:
Does inverting the tree align with our core business objectives?
I would like to add three points of important context to these "but Unreal's royalty can be more expensive in many cases" rationalizations. They are missing the gravamen in several ways:
1) Unreal is a much more powerful engine, with firmly established high-end credibility. It
@unity
Unity, we're with you, really; we love the service. Bad execution on the last post, but what would really calm nerves is releasing numbers. Do the work for us: 2 hours and everyone would agree.
@unity
does this look right? You're clearly cheaper in almost every case. 5% from Unreal is
I have some bad news for everyone. There will be... MORE VIDEOS.
That one that caused a commotion was just stuff that got cut from the prologue of my course. If we're going to have a full discussion about clean code and perf, well, buckle up because I can do that all month.
@ForestKatsch
This is one of those "but what if we already invaded the country" kind of questions. The answer is, don't have an un-undoable "delete this repository" button in the first place. It makes no sense. It's a code repository. Just mark it as deleted and wait six months just in case.
I have been talking about SIMD programming for years now, and somehow I never thought of this. I am so disappointed in myself. This is good enough that I might start turning comments on for my YouTube videos. (from )
How the collision system of The Witness was engineered to completely prevent entire classes of bugs normally found in game engines of its type:
(Restored version of my 2018 talk, since the convention lost the audio from the original)
Dear IHVs: we don't want "AMD Adrenalin", or the "Razer Experience", or the "GeForce Experience". The only "experience" we want is to install the f'ing drivers. Coercing users into installing buggy UIs on top of the drivers makes the experience worse, not better. Please stop.
*Some* kind of mystery AMD power management is kicking in at 20% battery, that makes my laptop unusable for 3D by slowing the GPU down to an insane degree; thus I cannot use the bottom 20% of my battery for work, which is MUCH WORSE than it just running out due to lack of “power
Or, you know, it could be that they would like to get paid for making JavaScript better, instead of being expected to do that for free while everyone else profits off their improvements?
I want to like Bun but I keep reminding myself that it's a VC-backed company. They have a goal.
If that goal was to make JavaScript better, it wouldn't be Bun. It'd be a bunch of pull requests to Node.js to improve performance and stability. That's not the case.
A lot of people complain about CSS because they wrap a div in a div and it doesn't align the way they expect.
What they don't understand is that if you wrap that div-in-a-div inside a div _that is itself inside a div_, then you can put that whole thing inside another div.
@Josh9817
Because if you actually know what's going on in a computer, a web stack is horrifying to even look at. Even just HTTPS by itself is so bad, it would make you not want to work in that industry.
At some point someone should confess to new people who are struggling to learn programming that 99% of everything they are trying to learn is not actually endemic to programming, has nothing to do with how a computer works, and will be completely different in 5 years.
After many weeks in development, my series on Zen, CUDA, and Tensor cores is ready to roll out. Up first is a look at the silicon - what does the physical layout of a Zen 4 or Ada Lovelace chip actually look like?
It increasingly seems like the new normal for software is shipping new versions that are mostly feature regressions from prior versions, then pretending it is somehow exciting to reimplement those features and slowly scrape back to parity :(
@pikuma
Takeaway: always concatenate the middle finger printout to the end of the existing autoexec.bat. That way, when the user executes the script multiple times, they will simply get additional middle fingers, which can be considered a feature instead of a bug.
It's very difficult to play most "AAA" games now. At least for me, the design quality has degraded to the point where the high production values can't compensate, and I just end up wishing I hadn't wasted my time.
You've got a problem, so you write some Modern C++. Now you've got std::vector<std::shared_ptr<Problem>> p = {original_problem, std::make_shared<Problem>()};
Couldn't have said it better myself.
As I've often forecast: I don't think Linux for desktop has been getting much better, but Windows for desktop has been getting much worse. I can see a Year of the Linux Desktop™ in the future that is 100% due to Microsoft.
I refuse to believe anyone who prefers Windows 11 to Windows 10 actually does meaningful stuff on their PC. In the last 15 minutes I’ve experienced:
- Explorer.exe freezing when right clicking large files
- Windows Update slowing machine to a crawl
- impossible to unpin OneDrive
For studios looking to leave Unity, I put together a list of alternative engines. If you have experience with any of them, please leave a (respectful!) evaluation in the comment section to help other devs know what's out there:
This is the most awesome demo I've ever seen. You may have heard that the Commodore 64 had a second CPU in its disk drive. But you've probably never seen anyone prove it quite like this:
@yaxamie
@TEDchris
I have a tremendous amount of respect for
@coldxman
. I've mostly stopped posting political content myself on X, but seeing Coleman treated this way crossed a big red line for me. I am very upset.
And in round three, WE HAVE A WINNER!!!
Congratulations, Fedora 40! Debian 12 Stable failed. Ubuntu 24.04 LTS failed. But Fedora 40 installed!
It is now successfully running on the computer, complete with dual monitor support and excruciatingly high resolution.
Whenever anyone says, "Use an existing engine, don't reinvent the wheel," you know they don't know what they're talking about because wheels go on the bottom of the car. The engine doesn't even go there, it goes in the front under the hood.
Does x86 need to die?
@ThePrimeagen
and I did a livestream about this last week. I've linked to the VOD and provided a bunch of additional material for those interested:
Even if true, to what extent would you tolerate the "Yes, clean code is much slower, but it's about programmer productivity" for other products? Would you want a car that only went 5mph because the designers could do less work to make that car?
[1/3] Regarding NFTs, it is disheartening to see Silicon Valley being proud of creating artificial scarcity. Physical scarcity is something humanity fought to overcome for almost our entire history. Creating scarcity is never something anyone should ever be proud of.
My hottest tech take: generated source code should be committed.
It has happened to me so many times: the code generation logic was changed, a bug was introduced, and there was no git diff to even compare with the old versions.
This is a great diagram from Anandtech (). It uses color to show the relative cost of communicating between any two cores of a 64-core Threadripper. The physical layout of chips is becoming increasingly important to performance-oriented programming!
People often ask if they should use tau instead of pi, but the answer is that usually you don't need either. Most uses of those constants are because of functions that take radian parameters, which is almost always a bad idea:
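One alternative in the spirit of that post (a sketch of the general idea, not code from the linked piece): represent angles in normalized "turns", where 1.0 is a full revolution, and convert to radians only at the trig-call boundary, so pi/tau never leaks into application code:

```python
import math

# Sketch: angles stored as "turns" (1.0 == full revolution),
# converted to radians only where the math library requires it.
def sin_turns(turns: float) -> float:
    return math.sin(turns * 2.0 * math.pi)

print(sin_turns(0.25))  # quarter turn, i.e. sin(pi/2) == 1.0
```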
I've periodically warned about Unity for game dev for about two years now. IMO these kinds of things were inevitable based on their filings and investor calls, and I assume it will continue to get worse from here as they push to meet their valuation:
Considered holistically, requiring job applicants to have college degrees is implicitly giving them a take-home test that takes four years and costs them tens of thousands of dollars to complete. The room for pipeline improvement here is undeniable, and massive.
@WillEhrendreich
Much like forcing people to change their password every 3 months, it's one of those things people who don't actually know anything about security think is "more secure", when actually it is at best the same but often less secure.
I should also add, "premature optimization is the root of all evil" was not about architecture. It was about microoptimization. Almost nobody today even does microoptimization, so the quote basically never applies. Charles Cook explains better than I can:
The xz utils fiasco reminds me of the age-old adage, "There are two kinds of software: closed source, which may have a backdoor in it, and open source, which definitely has a back door in it."
Introducing the Meow hash - a non-cryptographic hash capable of 16 bytes _per cycle_ throughput on modern CPUs while still cleanly passing all of smhasher:
(This is an alpha version. Comments welcome on the GitHub.)
Here is the rant I promised on arrogant, dismissive replies about software quality on Twitter, complete with several exhibits for your delight and amusement:
I tried asking ChatGPT a question about programming. Instead of answering, it told me not to do what I wanted to do. Then, when I insisted, it got the answer wrong.
Therefore, I can fully certify that ChatGPT has passed the "Hacker News Turing Test" with flying colors!