We Are Doomed: A pessimistic point of view of "modern software engineering" (carette.xyz)
55 points by LucidLynx on Jan 14, 2024 | hide | past | favorite | 72 comments


Whatever you think of Vulkan, it is NOT typical of modern software engineering. It's a massive outlier. It is very demanding of its users, it is very low level, it exposes the complexity of the hardware, and it is large but not bloated, since it reflects the hardware's complexity.

If you complain that modern software is slow (it is), then it's because people are NOT using low-level APIs like Vulkan that force you to think about how to most effectively utilize the hardware. It's because people don't want to know how anything works, and instead load up on a mountain of dependencies and run everything in a garbage-collected virtual machine in a browser.


> because people don't want to know how anything works

For some definition of "people"... isn't the detail at the low level sort of overwhelming on modern cards?


Yeah, it's hard. But the original author complains about having to understand hard things, and then at the same time wonders why modern software sucks. You have your answer right there: if you don't understand a system, you can't utilize it well.

Lots of people argue that slow, shitty software is good enough, because it's not worth spending the time, energy, and money to do things right. I don't agree with this, but it's a consistent argument: money and time are more important than software that runs well. At least then you have made a choice about your priorities.


Hi, author here. To be honest, I made a mistake in the article. I'm not complaining that I have to "understand hard things"; what I failed to make clear is that Vulkan is something like 5x more complex than OpenGL for… 5% better performance? Like DirectX 12, where most DirectX 11 games don't gain any performance worth the complexity of moving to the newer version of the API.

I would love to learn a new low-level API for better performance, but I really don't think the time I spent learning Vulkan was worth it, given that I ended up with the same performance I could have had using OpenGL.

I will update the article this week with a clearer explanation. Thank you for your comment.


> Vulkan is something like 5x more complex than OpenGL for… 5% better performance?

The complexity of Vulkan can (and in naive cases, always will) slow things down in comparison to OpenGL. What you get with Vulkan isn't +X% more performance, but consistent performance.

Both OpenGL and DirectX already did all of the things that you need to do yourself with Vulkan/DX12. The difference is that drivers were black boxes back then, and everything worked on heuristics. A relatively minor change could evict you from the fast path into oblivion. You had to blindly figure your way out, or, if you were "big enough", you could contact the driver team, at which point you would enter the world of GPU politics. The GPU mafia was/is a real thing.

Vulkan cuts straight through that. Yes, synchronization is hard, but it is way harder to figure out when the driver arbitrarily inserts a gigabarrier and when it doesn't. Even with Vulkan/DX12 you still encounter these issues, but at least with these latest APIs you can reason about things, and be generally correct.
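
For what it's worth, here is a minimal sketch of the kind of barrier Vulkan makes you spell out yourself, and which an OpenGL driver would have inserted for you based on heuristics. This is my own illustration, not code from the article, and it assumes "cmd" and "image" are a valid command buffer and color-attachment image:

    #include <stddef.h>
    #include <vulkan/vulkan.h>

    /* Transition an image that was just rendered to, so that a later fragment
     * shader can sample it. In OpenGL the driver infers this behind your back;
     * in Vulkan you state the hazard explicitly. */
    void transition_for_sampling(VkCommandBuffer cmd, VkImage image)
    {
        VkImageMemoryBarrier barrier = {
            .sType = VK_STRUCTURE_TYPE_IMAGE_MEMORY_BARRIER,
            .srcAccessMask = VK_ACCESS_COLOR_ATTACHMENT_WRITE_BIT,
            .dstAccessMask = VK_ACCESS_SHADER_READ_BIT,
            .oldLayout = VK_IMAGE_LAYOUT_COLOR_ATTACHMENT_OPTIMAL,
            .newLayout = VK_IMAGE_LAYOUT_SHADER_READ_ONLY_OPTIMAL,
            .srcQueueFamilyIndex = VK_QUEUE_FAMILY_IGNORED,
            .dstQueueFamilyIndex = VK_QUEUE_FAMILY_IGNORED,
            .image = image,
            .subresourceRange = {
                .aspectMask = VK_IMAGE_ASPECT_COLOR_BIT,
                .levelCount = 1,
                .layerCount = 1,
            },
        };

        vkCmdPipelineBarrier(cmd,
            VK_PIPELINE_STAGE_COLOR_ATTACHMENT_OUTPUT_BIT,  /* wait for the attachment writes... */
            VK_PIPELINE_STAGE_FRAGMENT_SHADER_BIT,          /* ...before fragment shader reads */
            0, 0, NULL, 0, NULL, 1, &barrier);
    }

Verbose, yes, but it says exactly what it means; the "fast path" heuristics described above were this same decision being made for you, invisibly.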

It was never about more performance. It was always about consistent performance.


Hear me out: we have _two_ APIs.


Yes, I used to work on the OpenGL Architecture Review Board. OpenGL had fundamental limitations that only a clean-sheet design could overcome. Vulkan is, in my opinion, 90% the right design. The fact that Apple continues to be Apple is just dumb, and it hurts both developers and Apple.


People love bashing Apple for "being Apple" without really thinking.

Metal was released almost two whole years before Vulkan (Mantle is irrelevant). Apple needed a good API for developers because Khronos couldn't get their shit together. We had many versions of OpenGL and OpenGL ES, and we had wrappers around OpenGL ES 2 to work on the desktop version.

Apple made an API that developers liked, and what do you propose? Apple start being Google and ditch their thing for another thing?

I don't see people complaining here about Direct3D being Windows-only.

If you want cross-platform, then pick ANGLE and call it a day.


> Apple made an API that developers liked, [...]

Citation needed. I definitely didn't like Metal.

> [...] and what do you propose? Apple start being Google and ditch their thing for another thing?

I would have proposed that Apple work together with Khronos on Vulkan. They could also have cooperated with AMD and based Metal on Mantle (released a year earlier); as soon as it was apparent that Mantle would be the basis for Vulkan (which was long before its release, IIRC), they could have slowly merged it all back together.

The least they can do is also support Vulkan on macOS.

> I don't see people complaining here about Direct3D being Windows-only.

Ehm, the article we are commenting on does complain about it being Windows-only.


What's Apple's relevance here? Vulkan can be a good API even without Apple's involvement.


There is no Vulkan on macOS; MoltenVK is used to translate Vulkan calls to Metal.


IMHO, the most concerning sign about software engineering practice isn't exactly "bloat", but the inability of the industry to do anything securely.

Consider every software security update to be a bridge falling down, due to incompetence.

In this case, the fault isn't so much individual incompetence, as collective incompetence of the field. The ecosystem is toxic, as are conventional practices, as are market incentives. Individuals might be incompetent on top of that, but the situation is nigh impossible for competent ones as well.

And there are no professional engineer licenses to pull, and few individuals to send to jail.


Computer security was broken as far back as the Vietnam conflict: there were systems that had to accommodate multiple classification levels at the same time, and they couldn't do it with the OSs of the day. Multilevel security was developed after that, which in turn leads to capability-based security.

Essentially, your current OS requires you to have absolute faith in any software that runs on your behalf. If it goes rogue, or gets confused, almost anything you could do to sabotage yourself... it can do in milliseconds.

On the other hand, if the OS didn't have that requirement, and instead let you choose what files to open, and enforced your decisions, you wouldn't have to trust your software at all.

From the point of view of a GUI user, they wouldn't see any difference in behavior. File Open still works like always, but the logic behind the scenes is slightly different.

Command-line usage is a tougher nut to crack. There needs to be a standard way of defining which files you're passing on the command line; a rough sketch of one possible approach is below.
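
A rough sketch of one possible approach (hypothetical, and not how any mainstream shell works today): the launcher, whether a GUI file dialog or a capability-aware shell, opens the file the user picked and hands the program only a file descriptor, so the program never needs the ambient authority to open arbitrary paths.

    #include <stdio.h>
    #include <stdlib.h>
    #include <unistd.h>

    /* Hypothetical capability-style viewer: it receives an already-open file
     * descriptor from its launcher instead of a path, so it can only read
     * what the user explicitly granted. */
    int main(int argc, char **argv)
    {
        if (argc < 2) {
            fprintf(stderr, "usage: %s <fd>\n", argv[0]);
            return 1;
        }
        int fd = atoi(argv[1]);   /* descriptor inherited from the launcher */

        char buf[4096];
        ssize_t n;
        while ((n = read(fd, buf, sizeof buf)) > 0)
            fwrite(buf, 1, (size_t)n, stdout);

        close(fd);
        return 0;
    }

A capability-aware shell could then run something like "viewer 3 3<report.txt" (the redirection is the grant) instead of handing the program a path and trusting it with everything else the user can touch.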

We could stop blaming everything but the OS... but we likely won't. 8(


This is pure hyperbole. You may as well compare my broken fridge to a bridge falling down. Engineers designed that too but no one is going to jail over it.


Why is it "pure hyperbole"? Are you saying that all software is like your fridge, and that there is never a need for software that is like a bridge?

The concern is that the field of software can't seem to build a bridge when it needs to.


>The concern is that the field of software can't seem to build a bridge when it needs to.

Based on what? Software operates life critical systems 24/7 and nobody blogs about it because it’s not filled with hype technology.


I've become a strong believer that software engineers should form a guild/professional society so that we can hold ourselves accountable.


A software guild wouldn't make software better, only much, much more expensive. Existing terrible practices would become enshrined in what is effectively law, and simply wrapped in additional bureaucracy that serves only to muddle and diffuse responsibility, rather than improve anything for the general public.


“Join our guild so we can punish you”


The point of professional licensure is gaining the leverage to hold yourself hostage.

Boss tells unlicensed employee to do something evil: employee weighs "do the right thing, and need to find a new job" vs "do the evil thing".

Boss tells a licensed employee to do something evil: employee weighs "do the right thing, and (maybe) need to find a new job" vs "do the evil thing, and maybe lose the ability to work in my field".

That's leverage to bring to your boss. "I cannot approve this, because it's evil, and approving it would get any professional stripped of their license." That puts your boss in the hot seat, because now the scales are being tipped for him: even if he fired you, he has to worry that he wouldn't be able to find anybody who would rubber-stamp the evil thing.


No, the real point of guilds or licensure is to constrain the supply of those that work in the profession, generally to inflate salaries.

Licenses sound good in theory, but often that just means you get more "inside the box" thinking (the kind of thinking that helps on the license/guild exams), and innovation will suffer. Would we be better off if the programming industry had simply used, say, C for everything since the '70s? Some things might be better, but we'd have far less software, and far less varied software, I'd wager.

EDIT: I assumed guilds/licenses here are mandated (which is what a guild that issues licenses often wants to do). If licenses can be ignored, it's less of an issue.


"Join our guild because we know you're top-tier and we can collectively keep the hacks from calling themselves your equal"


"Join our guild, make 10x, but there is a responsibility"


So, introduce risk to becoming a software developer? What's the upside for the developer?


It wouldn't impact developers who aren't working on important systems (safety, security, or privacy contexts). HIPAA-compliant software? Maybe they (the org and, possibly, a "professional software engineer" in charge of signing off on things) should be held accountable when there's a data breach. Video game maker? Yeah, you wouldn't need a PSE. If done like PE licensing, it also wouldn't impact the day-to-day developer, only the ones who choose the responsibility (and usually get better compensation for it).


There is risk in being a doctor, lawyer, psychologist...

The thing is, a lot of people trust what we do, and our negative impacts scale up much worse. A friend shared https://www.youtube.com/watch?v=gkJ4qv5RLRc with me today, and it enraged me. Too many incompetent technologies are deployed.


> There is risk in being a doctor, lawyer, psychologist...

True, but in those professions, the risk to the individual is typically offset by somehow enforcing labor-scarcity to drive up income for those in good-standing.


Pointless jumping through hoops.

At that point I would definitely get a JD instead.


A spine to push back on bad practices


Bingo. People do not understand that lawyers/doctors can tell their boss to go pound sand by saying ‘that would lose me my license and so no one will ever agree to do it’


No, guilds make it worse, not better.

As the article itself says, the doomsaying should be taken with a grain of salt. There are problems, and the best thing to do is improve your code and encourage your team to focus on performance, security, etc. Lead by example. Adding bureaucracy never fixed anything, and it wastes so much time, money, and opportunity in the process.


Yours is the logic that undermined support for unions for decades.

It's gonna take more than a Jesus-type to advocate for corporate responsibility.

While you're fucking around with security fixes, you're not generating revenue. Security incidents are an assumed risk.


I think the incentives to business have to be there.


It should be more about pride of engineers shipping products that people can trust.


> but the inability of the industry to do anything securely

And reliably. I’ve been banging this drum for a while, the castle is built on quicksand.

As general-purpose software makes its way into cars, and ever more critical systems start running JavaScript on non-realtime Linux with buggy drivers, we will eventually have some kind of software-caused catastrophe, and then regulation will come down on us like a metric ton of bricks.


> In this case, the fault isn't so much individual incompetence, as collective incompetence of the field. The ecosystem is toxic, as are conventional practices, as are market incentives. Individuals might be incompetent on top of that, but the situation is nigh impossible for competent ones as well.

Eh, I disagree. Software engineering allows for failure where the cost of failure is low. So your mail app stops working and you have to restart it, who cares, so long as you get some neat new features faster that on the whole make your life better?

In parts of the industry where failure is costly, like medical or aviation, we operate differently.

Speed and correctness are antithetical generally, and different parts of the industry accept different trade-offs based on their risk tolerance.

The goal isn't to build something that operates perfectly at all costs, the goal is to develop systems that operate as well as they need to given external constraints.

[edit] The truth is we wouldn't have 1/10th of the cool shit we have today if we demanded absolute perfection from things that just didn't need it because we're a "profession."


I respectfully disagree. I’ve passed the same thoughts back and forth in my mind before, but it’s mostly nostalgia.

Yes, some tools were much snappier on much crappier hardware, but they also lacked features, including safety and collaborative ones.

We have so many more people using computer devices and the internet now. We have to account for them to some extent and a lot of libraries do that for us, but it does make them heavier.

I’d say yes software has gotten fat and slow in a lot places, but also immensely more capable and more reusable.


> We have so many more people using computer devices and the internet now. We have to account for them to some extent and a lot of libraries do that for us, but it does make them heavier.

A significant percentage (majority?) of those devices and internet connections are low powered and slow compared to what most readers here are using

Those heavy libraries can often exclude those users simply because they can't adequately run the fancy features those libraries provide

There's a balance here that I think we, as an industry, are not managing well


Neither Apple nor Microsoft wants a usable multiplatform graphics API. For this reason, neither of them delivers such a thing.

If you want a multiplatform graphics API, you should use a library which implements such an API on top of these native OS-specific APIs.

I have had good experience with this one: http://diligentgraphics.com/diligent-engine/ I've used it a couple of times on Windows with the D3D12 backend, and on Linux with the GLES 3.1 backend.
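
For anyone wondering what "a library on top of the native APIs" means structurally, here's a hypothetical stub (my own sketch, not Diligent Engine's actual API): the application codes against one small interface, and an init function fills it in with whichever native backend the platform actually has.

    #include <stdio.h>

    /* Hypothetical sketch of a portable rendering interface that dispatches
     * to a platform-specific backend (D3D12 on Windows, Metal on macOS,
     * Vulkan or GLES elsewhere). */
    typedef struct Renderer {
        const char *backend_name;          /* which native API sits underneath */
        void (*begin_frame)(void *ctx);
        void (*draw)(void *ctx, const void *mesh);
        void (*end_frame)(void *ctx);
        void *ctx;                         /* backend-specific state */
    } Renderer;

    /* In a real library these would wrap ID3D12CommandQueue::ExecuteCommandLists,
     * [MTLCommandBuffer commit], vkQueueSubmit, and so on. Stubs keep this
     * sketch self-contained. */
    static void stub_begin(void *ctx) { (void)ctx; }
    static void stub_draw(void *ctx, const void *mesh) { (void)ctx; (void)mesh; }
    static void stub_end(void *ctx) { (void)ctx; }

    static Renderer create_renderer(void)
    {
    #if defined(_WIN32)
        Renderer r = { "Direct3D 12", stub_begin, stub_draw, stub_end, NULL };
    #elif defined(__APPLE__)
        Renderer r = { "Metal", stub_begin, stub_draw, stub_end, NULL };
    #else
        Renderer r = { "Vulkan", stub_begin, stub_draw, stub_end, NULL };
    #endif
        return r;
    }

    int main(void)
    {
        Renderer r = create_renderer();
        printf("rendering through the %s backend\n", r.backend_name);
        r.begin_frame(r.ctx);
        r.draw(r.ctx, NULL);
        r.end_frame(r.ctx);
        return 0;
    }

The real libraries are far bigger, but the shape is the same: one portable front end, one backend per native API, chosen once at startup.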


> If you want a multiplatform graphics API, you should use a library which implements such an API on top of these native OS-specific APIs.

Typical software approach: instead of fixing the fundamental problem, just put a plaster on top. That's how we deal with everything. That's why our industry can't be trusted.


How would you fix the fundamental problem?

It's technically possible to force these two companies to implement a shared API, but it would be hard to do, and a "Vulkan API Support Act of 2024" sounds a bit weird, IMO.


We do it in other industries all the time; petroleum fuel content, for instance, is regulated that way for compatibility.

We have an 'electrical plugs act', a 'fire hydrant act', a 'railway standards act', an 'air traffic control act', a 'water mains act', etc.


I think the #1 reason for these regulations wasn’t compatibility, it was safety. People die in fires, railway accidents, aviation disasters, and from waterborne infections.


Well, the safety argument is now relevant, as software is now controlling all those things.


Something similar happened back in the late '80s: we had painful EGA, then VGA/MCGA arrived with its glorious mode 13h, and everything about graphics became easy and cool! Then Super VGA entered...


So the problems in the article are: Apple doesn't support Vulkan, Apple chips were incompatible with Docker, Apple deprecated OpenGL support, and Apple doesn't make Metal multiplatform. I think I see a pattern here, hehe


> Apple chips were incompatible with Docker,

Uhm what?


In my arms, brother

   I am very pissed when I see software “engineers” more concerned about a vim configuration than thinking why their shell profile takes three to four full seconds to load.


The answer for Apple is shockingly simple... stop buying their hardware. It sucks for games, so just stop.


I've been happily playing Baldur's Gate 3 on my MacBook at 1920x1200 resolution with most settings maxed out. It hasn't sucked. In fact, it's been pretty smooth.


I would argue the iPhone and iPad together are probably one of the biggest gaming markets on earth. There are after all 300M consoles and 900M iPhones out there.


I'm not saying it's good for games, but the iPhone's GPU blows a big ugly Switch or Steam Deck out of the water

I really wish the gaming space on phones weren't such a radioactive wasteland


That's true, but then you can't complain about Apple throwing their weight around either.


I did. Got a Steam Deck and couldn’t be happier.


The root of many successful revolutions is observing that what exists is simply unacceptable and working to build something transformatively new. Many such attempts fail, but not all of them do. Perhaps the author is identifying such a need, and might consider architecting a superior cross platform solution from first principles.


I highly doubt that building ten C++ files used to be as fast as it is now. I clearly remember C# apps taking much longer to build, and I remember the deployment of a trivial app to Azure taking ~15 minutes. Meanwhile, we still don't have 5K displays at 120 Hz; the hardware isn't even here.


There are 4K and 5K OLED PC displays with 240 Hz capability being made this year, with some models reaching 480 Hz at 1080p!

E.g.: https://tftcentral.co.uk/news/asus-announced-rog-swift-pg32u...

Azure deployment pipelines are still slow as molasses. That hasn't changed.


4K is trivial; 5K is impossible. Your link is about 4K.


Most companies aren't like Rockstar Games or older Valve. More MBAs pushing for product and market-share grabs than building from passion, IMO.


Well, remember, as Donald Knuth said, “optimization is the root of all evil” (that’s what your boss heard).


"Graphics programmer thinks that their field represents all of software, news at 11. In other news, systems programmers think that anyone who doesn't want to deal with memory management is a weenie."


Missed an option for cross platform graphics, which is IMO the best one: WebGPU. Supported on all platforms, much simpler than Vulkan, and with solid C++ (Dawn) and Rust (wgpu) libraries.


The author says they were choosing three years ago, when WebGPU did not exist.


Plus, it is still a working draft today.


I've been saying this for some time. This is all in line with the bullshit-jobs phenomenon, which is likely the result of the reserve banks' 'full employment' agenda.

People used to think I was a conspiracy theorist for suggesting this. Yet it's clear as crystal that the incentives in the monetary system itself are set up this way. Can you think of a more powerful incentive than money to drive behavior?


My opinion is that high-resolution displays are the root of all evil.


Meh. Yet another unfocused rant about "complexity". PC gaming is essentially dying and so there's no money in it and no effort being put into it, and that goes triple for PC gaming on Apple systems. Stuff gets faster when you pay skilled people to make it faster, and gets slower when you don't care how fast it is. That's all it's ever been.


PC gaming is essentially dying?? Is that why the gaming market is more than the market for music and movies combined? Maybe Steam would like to comment.


> Is that why the gaming market is more than the market for music and movies combined?

It's a factor yes. Monetization is much better on consoles and phones.

> Maybe Steam would like to comment.

Notice how Valve has essentially gotten out of the actual PC-game-making industry, in favour of a) the selling-shovels-to-hopefuls business of publishing, and b) trying to become a game console maker.


This is not a direct decision of Valve's. There are lots of people working on games, but the company is privately held and democratically vote-driven, and they don't fire anyone. I'm pretty sure it's a relatively small staff compared to other modern gaming studios these days.

The PC gaming industry is the largest it's ever been and still growing. Look at the subreddit growth for r/pcmasterrace and the popularity and growth of Discord as a platform. Pro PC esports is also growing, has more viewers every week than the Super Bowl, and dominates consoles.


> This is not a direct decision of Valve's. There are lots of people working on games, but the company is privately held and democratically vote-driven, and they don't fire anyone.

What do you mean? It may not be an explicit decision by Valve's CEO, but it's evidently a decision that Valve as an organisation has made. (Are they even making games any more?)

> The PC gaming industry is the largest it's ever been and still growing. Look at the subreddit growth for r/pcmasterrace and the popularity and growth of Discord as a platform. Pro PC esports is also growing, has more viewers every week than the Super Bowl, and dominates consoles.

I'd like to believe that but we're seeing more and more of the AAA games going console-first or console-exclusive, so even if sales numbers are high that must not be translating into profitability. Indie or indie-style games, like Minecraft or Rimworld, are succeeding on PC, but those are games that don't need maximum graphics performance, so they're not going to be putting money into improving Vulkan or whatever. (Even something like Fortnite is "AAA" in the sense of being a big-studio game with big-money marketing, but it's not pushing cutting-edge graphics performance). Hell, most of these games are using third-party engines, so the only companies that might want to put money into graphics works would be the engine developers - but look at how terrible Unity's financial situation is, even as they're probably powering most PC game sales at this point.

Consoles generally have voice chat built-in, so you can't really talk about discord numbers without having something to compare with. Esports may be growing in some places but it's mostly in console-friendly genres like FPS; even if the pros are playing on PC that doesn't mean the people watching them are. Compare with RTS which used to be the biggest esports genre but is now declining, because it's PC-only (the only sort-of-RTS games that are growing are MOBAs - but those are precisely the ones that work best on consoles).


TBH, I think Valve is doing pretty well with both. Steam has long since driven a renaissance in indie game marketing and access, and the Steam Deck is incredibly well designed for on-the-go usability.



