Having worked in a job shop, a factory that did gears down to quantity one, I became quite aware of the differences between IT, my previous job, and actual physical production.
The machine tools were all made 50+ years ago. Changing anything was dangerous, because you might cause jobs with known, reliable setups that are run a few times a year in quantity to fail, erasing the profit on the job and possibly losing customers.
The rush to fill brand-new, energy-intensive data centers with hardware whose commercially useful lifetime is measured in months (instead of decades, as with machine tools) seems quite short-sighted to me.
There is a really interesting generation gap issue in the replies to your comment. What I perceive as younger people are horrified at the idea of fifty year old tools while the older folks are thinking (I imagine) “if the tools have lasted that long they must be well-honed and very good”.
Of course, this could simply be the perspective of someone turning 50 this year.
My dad ran a job shop focused on small jobs and the economics are different.
A lot of his work was keeping other local shops / industrial equipment up and running. So there is a lot of variety of work but very low throughput, and almost by definition you have the capability to fix your own machines.
Programming a CNC machine makes it easy to make a lot of the same part, but if you only need one it may be quicker to just knock it out manually.
A 50 year old mill or lathe is easy to keep up and running, and can be upgraded with a digital readout or even CNC controls if desired. A tool in a shop like this likely won't see the cycles that one in constant use on a factory floor does, but it may be worth keeping around because it offers a unique capability... he had a large WWII-surplus lathe for jobs that wouldn't fit on the smaller, more modern machines, for example.
> What I perceive as younger people are horrified at the idea of fifty year old tools
My students are shocked (horrified?) to learn that they're basically running 50-yr old Fortran code when they use scipy.minimize to train their fancy little neural nets.
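A minimal sketch of the kind of call being discussed (a toy least-squares "training" problem, not a real network; the data and names are illustrative). The L-BFGS-B method has long been a thin wrapper around Fortran optimization code:

```python
import numpy as np
from scipy.optimize import minimize

# Toy problem: fit weights w to minimize a least-squares loss.
rng = np.random.default_rng(0)
X, y = rng.normal(size=(100, 3)), rng.normal(size=100)

def loss(w):
    r = X @ w - y
    return 0.5 * r @ r

def grad(w):
    return X.T @ (X @ w - y)

# The L-BFGS-B backend here has historically been decades-old Fortran code.
result = minimize(loss, x0=np.zeros(3), jac=grad, method="L-BFGS-B")
print(result.x)
```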
I always chuckle at how Python became the dominant language for AI / ML / data science etc., but wonder why it is that Fortran and Python became the golden combo; it could have been done with any other language. No complaints, I love Python, but it's just amusing to me.
For Fortran, I have the concept of 'immortal code' - code that is generally hard to write, is compatible with everything, and implements an algorithm in a way that's impossible to improve on - or at least doing so would be celebrated as a minor breakthrough.
A lot of numerical optimization code is this - it conforms to a strict C ABI, taking (arrays of) simple floats and ints and outputting the same, so binding it to another higher-level language is trivial and rewriting it makes little sense. If the same algorithm were written in Java, most people would not want to bring in Java as a dependency for their Python/C++/whatever project, but since this is just a tiny C-compatible object file, it's happily integrated into everything.
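To make that concrete, here's roughly what binding such a routine looks like from Python with ctypes; `libsolver.so` and `solve` are hypothetical stand-ins for one of those compiled object files exposing a flat C signature:

```python
import ctypes
import numpy as np

# Hypothetical shared library built from decades-old Fortran/C sources.
lib = ctypes.CDLL("./libsolver.so")

# Suppose the routine has the classic flat signature:
#   int solve(double *x, int n, double tol)
lib.solve.argtypes = [ctypes.POINTER(ctypes.c_double), ctypes.c_int, ctypes.c_double]
lib.solve.restype = ctypes.c_int

x = np.zeros(100, dtype=np.float64)
status = lib.solve(x.ctypes.data_as(ctypes.POINTER(ctypes.c_double)), x.size, 1e-9)
```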
They also tend to be very tricky to get right. I remember reading a paper where the author was adamant that changing the order of a multiply and an add (a mathematically equivalent operation) would cause the algorithm to blow up, because the very different scales of the floating point values involved would cause a major loss of precision. I'm sure there are tons of stories like this.
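The effect is easy to reproduce in a couple of lines (a toy illustration of the underlying issue, not the algorithm from that paper):

```python
# Floating point addition is not associative: operand order matters once the
# values involved live at very different scales.
big, small = 1.0e16, 1.0

print((big + small) - big)   # 0.0 -- the small term is swallowed entirely
print((big - big) + small)   # 1.0 -- same expression on paper, different answer
```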
This is the sort of code that took PhDs who studied this exact topic years to get right. Even though the actual code often looks unassuming, I would dread the day I was required to touch it (but that day never comes - it always does what it says on the tin).
Fortran has been a research language used by people like physicists and mathematicians who know what they're doing and have been developing that tool for decades, and Python is perfect for people who have no idea what they're doing.
It's really just that it's a pretty easy language to learn that finds a good balance between structure and brevity. It has very good support for the data structures you need and libraries for everything. A lot of people love the language and that built up a lot of momentum and eventually people started adding stuff like numpy and scipy and pandas and before long you had this giant scientific computing environment that almost anyone can get into.
I tried out most of the scripting languages out there (Ruby, Perl, Tcl, Groovy, R, and many more) and Python just seemed to click more and it has a whole lot less to worry about upfront than languages like C# and Java. In comparison to languages like C and C++, it's a godsend for those with typical automation needs.
In my eyes it seems like a pretty straightforward development. There have been plenty of other tools that may have made sense throughout history too. Matlab could have done this, but by that time nobody was going to build out massive libraries for something expensive and partly closed off.
People don’t generally care about the best language. They care about a passable language with the best library access and great support (either commercially or via the community, but preferably both).
I’m not even fond of Python, but I use it sometimes because things exist for it to use and because there are lots of developers around who can share the work on the code. If I write something in R, APL, Julia, OCaml, Forth, Scheme, or Lisp at work I’m going to be the only maintainer and will probably get a stern lecture about that. If I use Perl, Ruby, Java, or PHP, there are a few more people but it’d better be code only my team has to maintain. Go, Rust, C, C++, TypeScript, maybe JavaScript, and Python are safe in most of the company’s codebases but only C, C++, or Rust for the code that needs the most performance and the most stability of resource use.
> People don’t generally care about the best language. They care about a passable language with the best library access and great support (either commercially or via the community, but preferably both).
Moreover, to the extent that they care about the “best” (or at least “better” within the scope of options with suitable ecosystems) language, “best”/“better” is highly subjective and shaped very much by familiarity.
R is an entire language based around the idea of making code as unreadable as possible by means of bizarrely stacked syntactic sugar. It shouldn't surprise anyone that it didn't win out.
No one uses scipy for anything serious anymore in AI research or on any type of modern models.
You're setting your students up for failure if this is how you are teaching neural networks to them. You should switch to pytorch or at least something like tensorflow or jax or you are actively doing intellectual disservice to them and leading them to be actively noncompetitive both in AI academic paper writing/grad school and in the job market.
Similarly, use of sklearn considered harmful in a world where CuML/CuPY/Nvidia RAPIDS exists.
And also, knowledge of conda/pip/poetry considered harmful in a world where UV exists.
It sounds like a fundamental education failure and/or an extreme failure of tool design and training in workplaces if new employees with relevant degrees don't have the basic knowledge to pick up new tools on the job. A liberal arts education is for teaching basic transferable skills, and is not a narrowly tailored job training program.
I'm somewhere in the middle, young enough that almost everything I've seen new is disposable crap (including the tools), old enough that I have had an interest in things from before and noticed that they really were built much better, or at least heavier, back then.
I've made the comment on here before that I believe it's short-term energy optimisation, in that it used to be seen as reasonable to move much heavier objects around. We've made everything so light that we've lost the infrastructure for moving heavy stuff around when we might need to.
Kids today have no concept of how heavy workstations, TVs or monitors used to be, and they think it's exaggeration. Let alone tools, cars, appliances etc.
I remember the Sun monitors (21" I think) were about 80 lbs. I've read that part of that was a metal frame to hold all of the wires in front of the screen.
They were fun monitors - we had a lab full of them, they would degauss on startup, and the degausser would induct into the monitor next to it (and a little bit into the monitor after that).
The weight from a CRT is mostly about the amount of glass required to keep the atmosphere out, as it's essentially a vacuum bottle with better marketing. On that 21" display, you've got about 6000 pounds of force trying to push the face inward, not to mention the sides, neck, etc.
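A back-of-the-envelope check (assumptions: a 4:3 tube, standard atmospheric pressure, and counting only the flat face; the curved sides, funnel, and neck take additional load on top of this):

```python
diag_in = 21.0                             # nominal tube diagonal, inches
w, h = diag_in * 4 / 5, diag_in * 3 / 5    # 4:3 aspect ratio
face_area_sq_in = w * h                    # ~212 square inches
atm_psi = 14.7                             # atmospheric pressure

print(round(face_area_sq_in * atm_psi))    # ~3100 lbf on the face alone
```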
Yeah, these monitors were a bit heavier than most consumer 21" monitors at the time. We also got Viewsonic monitors for PCs, which were lighter. At the time, I had assumed the additional weight was from extra shielding, but I later read that some of the weight difference was a metal frame holding the wires. The Trinitron had a bunch of vertical wires instead of a grid of holes on the front - if I remember right, they'd shimmer a little if you smacked the side of the monitor with your hand.
It's possible they also had more glass than typical for a 21" monitor, I don't recall if they were any flatter than the Viewsonics or not.
Right. Bought two out of a university lab and lived in a sixth-floor apartment with only stairs. Even moved those monitors out, and they were more difficult to get down than the couch...
Don't let me get started about fixed frequency, X11 modeline guessing (wrong of course) and needing a second monitor to even get back to the original config.
It's also due to the size of the vehicles that are popular today: SUVs and pickup trucks (used as family vehicles).
However the increase has also been offset with weight savings in other places.
- The use of aluminum in suspension components and body panels
- The move, long ago, to unibody over body-on-frame construction for small cars
- Smaller engines: a V8 weighs more than an inline 4-cylinder and requires heavier suspension components
For example, a 1989 Lamborghini Countach 25th Anniversary weighs around 3,200 lbs, which is slightly heavier than a 2022 VW Golf GTI (~3,150 lbs).
I see comments that blame safety technology (electronic components) for increasing the weight of a car but a blind spot monitoring system probably weighs less than 5lbs. A rear camera is also around that.
Structural safety and airbags do add to a car's weight, but these changes have made cars extremely safe.
That article is being disingenuous and wrong. It's comparing the lightest possible Civic configuration with the heaviest possible Accord of a different body type.
The 2000 Accord sedan is 2,712lbs, not 2,987lbs (which would be the wagon).
The 2019 Civic sedan is 2,743–2,923lbs depending on equipment/trim.
So yes, the Civic compared to an older car of similar size did get heavier.
The Miata proves that cars don't have to be heavier, but the Miata also took advantage of much more aluminum compared to the older models. Maybe mainstream cars should also switch to use more aluminum to keep weight down, and you're right that the reason they don't is because oil is cheap enough where weight isn't a priority enough to use more expensive aluminum instead of steel.
> That article is being disingenuous and wrong. It's comparing the lightest possible Civic configuration with the heaviest possible Accord of a different body type.
Good to know.
> So yes, the Civic compared to an older car of similar size did get heavier.
If the minimum is 1% heavier and the maximum is 2% lighter then I would not say "did get heavier".
You can only make the argument that the Civic is "2% lighter" when it is being compared to a wagon; that's an apples-to-oranges comparison that invalidates the whole thing.
They picked that specific year Accord because it's the same size as a sedan as that specific year Civic sedan, so it makes no sense to then compare the weight to the much larger Accord wagon variant. You might as well compare the sedan to a crossover to argue that the sedan didn't get heavier.
The range is 1% heavier to 7% heavier comparing the sedan to the sedan. Both ends of the range are heavier, so "did get heavier" is an accurate statement.
Okay I misread you then, but you're saying the 2000 Accord sedan only has one weight, while the Civic has a several percent range? If that's right then do we know which Civic trim is equivalent to the Accord?
If we know that trim is worth at least 6%, and we don't know how to align the cars, then the confidence interval around "1% to 7%" extends far enough to overlap some negative percents.
Another thing not mentioned by this poor article (everything Forbes does these days is hot garbage) is that heavier vehicles do road damage proportional to roughly the fourth power of their axle load - https://en.wikipedia.org/wiki/Fourth_power_law
So the ever increasing weight of cars, trucks, SUVs, and especially semi-trucks is also responsible for our roads being shit, full of potholes, and expensive to fix.
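A rough illustration of how lopsided that relationship is (using the usual per-axle form of the rule and round, assumed numbers for the vehicles):

```python
# Fourth-power law: pavement wear scales roughly with (axle load)^4.
# Damage per pass, relative to a single 2,000 lb "reference" car axle.
def relative_damage(gross_lbs, axles, ref_axle_lbs=2_000):
    per_axle = gross_lbs / axles
    return axles * (per_axle / ref_axle_lbs) ** 4

car  = relative_damage(4_000, 2)     # sedan:       ~2
suv  = relative_damage(6_000, 2)     # heavy SUV:   ~10
semi = relative_damage(80_000, 5)    # loaded semi: ~20,000
print(round(semi / car))             # one truck pass ~ 10,000 car passes
```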
Exactly because of the fourth power law, almost all of the road damage comes from the heaviest vehicles: class 7 and 8 trucks as well as buses etc. Even the heaviest passenger vehicles are negligible by comparison. And the weight of semi-trucks hasn't been "ever increasing": the normal maximum weight has been fixed at 80,000 pounds for decades.
In some areas the roads are shit due to weather conditions, mainly frost heaves. This has little to do with vehicle damage.
Surely they do damage proportional to the fourth power of the contact pressure on the tire contact patch, not the fourth power of the overall vehicle weight, right? So adding axles or wider tires etc mitigates this.
To be fair, a lot of the tools we use as developers have a 30-40 year heritage themselves - the things that most people depend on and that sit in the background.
I don't know where I fit on that spectrum; my first thought was that there's probably nobody around anymore to replace these fifty-year-old tools, and/or the replacement will be priced at a level that would wipe out all profits for the next 10 years when it's needed.
Our field also has these IBM AS/400s or older running for 30+ years in a server room at the back of an office floor. They are more feared than revered.
I rather think of the maintenance nightmare. You can't change anything - not because the existing system is good, but because there are no people left who understand the whole thing.
But then I've got a few years to reach 50. Perhaps my views will change.
Every software company I've worked at that is more than 5 years old had major features that nobody understood anymore, even features that were core to the product.
Don't forget the critical software that keeps the company going, that someone dealt with long ago and that was left to rot when they left, only for someone to discover it and have to go on an archeology dig to find info and improve upon it.
The gear cutting machines almost never had mechanical failures. As long as they keep them lubricated, and occasionally muck out the sumps, the machines should still be going in the year 2100.
The other thing about gear cutting is that hobs only cut one size/profile of tooth. Some of the cutting tools I was using dated from before WW1, for odd sizes that didn't get used much.
The thing is, those 50 yr old machine tools might be still good, but the more recent CNC machines are much more efficient, and require way less manual dexterity to use (say, compared to a lathe).
This is the whole idea of industrialization - moving away from having skilled artisans, into machines that encode the skill to reproduce the article.
The fact that machines that are 50 years old are still in operation is quite a feat, but it is also an indication that the production methods remained static (of course, if the production machines are good enough already, then investment in new machines doesn't bring in new profits).
As someone who grew up in a machine and wood working shop and now builds, repairs and retrofits machinery, I can say that 50 years old is nothing and absolutely fine.
> The thing is, those 50 yr old machine tools might be still good, but the more recent CNC machines are much more efficient, and require way less manual dexterity to use (say, compared to a lathe).
I assume you are referring to manually operated machinery vs CNC machinery? Otherwise there is little to no efficiency gained from a new CNC machine. I've run both, and the setup of a CNC for simple jobs that can be done on a manual isn't worth the effort. CNCs really shine at high production volumes and very complex parts.
> The fact that machines that are 50 yrs old are still in operation is quite a feat but also an indication that the production methods remained static
If the requirements haven't changed, e.g. machining flanges that meet ASME B16.5, and the production methods are already optimized, why even bring this up?
> (of course, if the production machines are good enough already, then investment into new machines don't bring in new profits).
Right. If the specs didn't change then why bother investing in pointless upgrades?
The ONLY reasons companies toss out machinery: it's no longer useful to the company, or it's so hopelessly broken that it can't be fixed. And there is very little that can render a machine scrap unless something catastrophic happened. And there is very little preventing old machinery from being retrofitted with new controls.
I think it is not always about an age gap. A friend has a distribution company with many trucks, and many times they need to use a manual machine and soldering to fix truck issues.
> There is a really interesting generation gap issue in the replies to your comment. What I perceive as younger people are horrified at the idea of fifty year old tools while the older folks are thinking (I imagine) “if the tools have lasted that long they must be well-honed and very good”.
It's like when someone wants to choose a brand new web framework that isn't battle tested over one of the most battle tested web frameworks. You can hire way more developers with battle tested tooling, than some bleeding edge thing you don't know if it can even scale.
> You can hire way more developers with battle tested tooling, than some bleeding edge thing you don't know if it can even scale.
There's a crucial difference!
You can hire expensive developers on battle tested tooling... or you can hire a shit ton of juniors that want to work on $BUZZWORD for cheap in exchange for the CV creds of "worked on $BUZZWORD".
Well, most new software is a nightmare too. And recently a lot of it started trying to do the wrong thing by design, which is an extra step into nightmare territory beyond anything from the 80s.
Less interested in America's expensive and slow manufacturing than in Chinese processes. They can retool faster, handle far more volume, and except for specialized industries (medicine, aerospace) their quality is better.
How old are you? Do you not understand business cycles? Life cycles? 'Baggy pants are the future; anyone wearing tired old non-baggy pants, it's over, grandpa, for we have, for the first time, discovered that baggy pants are cool and modern.'
It's all fun and games when everything (your trains, your new bridges, your manufacturing process) is the brand new cool version. I went through that cycle (starting a family, brand new house, new cars, new boat, high-paying tech job working on early-2000s cutting-edge tech). Our house was the cool house for a bit. Then it was just another house. Then it became a burden, with maintenance expenses on top of the mortgage. I became a greybeard with legacy tech skills.
For China, what happens when things settle more? The market flushes out half the companies making the tools (there are tons of companies during the 'fill out' cycle, but at some point that slows and the industry consolidates), or new product lines replace the old, and now your factory is on borrowed time until the machines break down. You aren't the new hyped cool kid anymore (I'm talking about you, USA as a country / Ruby on Rails devs / Angular devs). Like with a new home, slowly your mortgage gets supplemented with appliance repair bills and other maintenance; what was cool and new is outdated and replaced with 'better'. China is doing well with robots, but what happens next generation, when robot/AI interaction is natively built in? Does China scrap its entire 2025 robot infrastructure and replace it with the 2030s version? Or does someone else gain the advantage of not having those 'legacy', slower-to-retool, slower-volume Chinese 2025 robots/infra?
It's wild how smart people don't seem to understand basic cycles anymore. China is in a growth cycle, everyone is going to their 'new home party' and saying man, this is all great (it is awesome - it is amazing how China went from poverty to current success, so many improved lives, I love it), and we are acting like this is the first time anyone has ever bought a new home and the home will always be perfect and new and cutting edge.
Let's just celebrate that so much poverty and suffering has been eliminated in China, and wish them well and continued improvements, especially as they transition from the generations that suffered to newer generations without all that trauma. That is also tricky for a society to navigate, as life expectations become wildly different.
i see both sides. while some core tech made decades ago might be tedious to adapt to today's needs, i HATE the fact that modern code is not designed to be sustainable.
<rant>
speaking of ai, there was a startup acquired by google in 2017, whose core features remain unchanged. however, their sdk and branding got switched around every 18 months or so. incredibly annoying how you cannot run the same code anymore despite having the libraries cached because of the cloud and its moody deprecation cycles.
recently had a similar situation with azure and their (yet another) copilot rebranding had existing work being phased out by the end of the year, when the actual sdk did not get changed besides the package name!
</rant>
having said that i have come to appreciate things that were well-designed and lasted that long. change is not bad, but only when it is not just for the sake of it.
That's pretty much what I told the student, along with "everything you're studying in every other class is also older than your dad, and that's why you study it."
Something newer.
The reason that 50-year-old machine tools are still around isn't that they can't be replaced. It's that there's often no reason to.
To use OP as an example, in a lot of places, you'll find an ancient milling machine or a lathe that's dedicated to running a single job a few times a year. The machine was depreciated decades ago, but it can still do that job and there's no reason to get rid of it.
What modern tools give you is speed and flexibility. Many shops need neither.
What do all these machine shops without any need for modern machinery and processes actually do?
Seriously though, of course you can make a living with old tools - however, even the village metal workshop around here has at least one big-ass laser cutter and a CNC mill next to all their old(er) lathes, mills, brakes, presses and other toys. Many oldschool fabricators I spoke to over the last few years are quite interested in what laser welding brings/will bring to the table. Basically all smaller fabrication companies I've seen (the long tail of the car industry and other bigger industries, mostly) are continually upgrading their infrastructure with all sorts of robots and other automation widgets. And so on.
No one said that they had no need for modern machinery. It's an "if it ain't broke, don't fix it" approach. If you have a manufacturing process that was dialed in perfectly 20 years ago, and your customer(s) is still buying those parts, made on that machine, there is no benefit to moving them to another machine that now has to be set up just right, have the new parts coming off it QC'd to make sure that they are identical to what came off the old one, etc.
It's work that you don't need to do and that you won't get paid for. If the old machine breaks, then maybe it would make sense to move the job to something newer.
I used to work with someone whose entire business was retrofitting old machine tools with modern controllers when the decades-old electronics failed. You'd be amazed how much of this stuff is still out there.
Well, you kind of said that literally. And I did not say that one should needlessly move processes to different infrastructure without a good reason. Anyway, I don't think our opinions are very dissimilar.
btw: I think I have a reasonably solid idea of a range of fabrication environments, the oldest piece of machinery I'm responsible for in my professional life is about 70 years old (its basic design is decades older) and some of my personal stuff (sewing machines, mostly) is more than 100 years old. I'm really not against using what works at all.
> What modern tools give you is speed and flexibility.
Many of the modern tools can also be grafted onto the old tools. Not just CNC conversions but the biggest productivity boost when I worked in a shop was converting everything to a zero point clamping system.
A shockingly rare question to be asked. As best as I can tell, the biggest threat to civilization isn't AI, it's our culture of "that's not my problem" leaving otherwise patch-able holes in critical systems.
The biggest threat to civilization is, in fact, AI. It's a new problem, and one with an utterly ridiculous lethality at its limit. It makes the atomic bomb look benign.
"That's not my problem" is something humans have been dealing with since before they mastered the art of sharpening a stick.
Because fossil fuel is stupid useful and there's no way we're going to stop burning it. And then we get to the climate scenarios that aren't compatible with our current sophisticated civilization, even with the currently accepted climate science (that always seems to underestimate what actually happens).
Global warming just isn't harmful enough to pose a credible extinction risk.
The damage is too limited and happens far too slowly. Even the unlikely upper-end projections aren't enough to upend human civilization - the harms up there at the top end are "worse than WW2", but WW2 sure didn't end humankind. At the same time, the ever-worsening economics of fossil fuel power put a bound on climate change even in a "no climate policy" world, and we are outperforming that world anyway.
It's like the COVID of global natural disasters. Harmful enough to be worth taking measures against. But just harmless enough that you could do absolutely nothing, and get away with it.
The upper bound on AI risks is: total extinction of humankind.
I fucking wish that climate change was the biggest threat as far as eye can see.
Climate change could lead to a massive world war over arable land and potable water. It could also make wildfires more common and more damaging. It will make cyclonic storms stronger. This may not be extinction level, but could be a major pressure on population numbers.
AI is less of an extinction risk than climate change, especially what passes for AI these days. Climate change is going to displace billions of people. That alone is going to be chaos, but food/water shortages are going to be a problem too.
AI is only a threat if we suddenly reach sci-fi levels where AI concludes that the earth is better off without humanity on it and there's zero indication that we're anywhere near that today.
The good news is that our society will likely collapse or require resources pulled away from power/water hungry AI datacenters well before we see any actual I in AI
Humans are dominating the environment by hopelessly outsmarting everything in it. Applied intelligence is extremely powerful.
Humans, however, are not immune to being hopelessly outsmarted themselves.
And what are we doing with AI now? We're trying to build systems that can do what human intelligence does - but cheaper, faster and more scalable. Multiple frontier labs have "AGI" - a complete system that matches or exceeds human performance in any given domain - as an explicitly stated goal. And the capabilities of the frontier systems keep advancing.
If AGI actually lands, it's already going to be a disruption of everything. Already a "humankind may render itself irrelevant" kind of situation. But at the very limit - if ASI follows?
Don't think "a very smart human". Think "Manhattan Project and CIA and Berkshire Hathaway, combined, raised to a level of competence you didn't think possible, and working 50 times faster than human institutions could". If an ASI wants power, it will get power. Whatever an ASI wants to happen will happen.
And if humanity isn't a part of what it wants? 10 digit death toll.
Even if LLMs don't become AGI (and I don't think they will), LLMs are potentially superb disinformation generators able to operate at massive scale. Modern society was already having difficulty holding onto consensus reality. "AI" may be able to break it.
Don't think "smart human". Think about a few trillion scam artists who cannot be distinguished from a real person except by face to face conversation.
Your every avenue of modern communication and information being inundated by note-perfect 419 scams, forever.
I think we'll adapt. At any point we can start to treat the internet like the trash pile of bullshit it's been turning into, stop taking anything in our inboxes as legitimate, and treat websites as nothing more than entertainment.
The logical progression to me is AI acting in its own interests, and outcompeting humans much like humans outcompeted every other animal on the planet.
This is particularly threatening because AI is much less constrained on size, energy and training bandwidth than a human; should it overtake us in cognitive capabilities within the next century, I don't see a feasible way for us to keep up.
You might argue that AI has no good way to act on the physical world right now, or that the current state of the art is pathetic compared to humans, but a lot of progress can happen in a decade or two, and the writing is on the wall.
Human cognitive capability was basically brute-forced by evolution; I think it is almost naive to assume that our evolved capabilities will be able to keep up with purpose-built hardware over the long run (personally, I'd expect better-than-human AGI before 2050 with pretty high confidence).
Exactly: the new process becomes the lowest bid, new NRE goes forward, and the old process finally gets to drift off into the sunset. It happens all the time and people (sometimes the same people) complain about that too.
As an older folk, I perceive 50 year old tools as likely worn out and in desperate need of replacement. However, often nobody makes the tool anymore, and so we are willing to spend a lot to maintain them instead. (I drive a 25 year old car - this is only possible because I can get a rebuilt transmission, but my maintenance costs over the last 10 years would have bought a much newer/nicer used car, and I'm getting close to where I could buy a new one.)
The other possibility is the tool isn't used much, and modern accountants would never allow you to buy it in the first place because of all the cash tied up (that is, the work the tool did over those 50 years wasn't enough to pay for the cost of the tool and the space to store it).
I was recently talking to the head clockmaker at the Chelsea Clock Company, one of the very few, if not the only, remaining original New England clock companies still operating. He showed me some pictures of clock making tools being used during WWII, and then the very same tools in perfect shape still being used today. He also had one tool that dated back to when they were the Boston Clock Company (circa 1894) that was still in active use. In this new world of disposable tools, it was pretty neat to see.
That sounds like a result of brain drain, honestly. The people who stood up that hardware 50 years ago are 50 years older now.
By contrast, the Chinese have mastered process knowledge, transferring from one domain to the next. If we want to compete with them, it’s worth knowing what doing well looks like.
No, the point is "overhead". You don't disturb working setups because you will incur engineering time to update the setup, and that engineering time is added to the cost overhead of a job. Time is literally money; even if the employee(s) are salaried, their time is factored into the cost of a job.
The knowledge is still there, but American labor is expensive as hell compared to overseas competitors and so any shop in the US has to contend balancing their profit margin and costs to remain competitively priced.
When doing machine shop jobs, it's far easier to bury the cost of initial tooling/fixturing in the first job as a separate line charge for NRE. It's a lot harder to sell customers on being charged that cost again on subsequent orders. You can charge customers for "setup overhead" on subsequent orders, but that should be the cost of putting existing tooling into service, not engineering new tooling because you decided to change shit on a whim.
Tell that to the people who lost FOGBANK, or rather, the knowledge and most importantly the practical experience of how to make it. Or Emmentaler cheese - the one with the bubbles. Turns out, you need something that was only discovered when someone noticed the bubbles beginning to vanish: tiny contamination from microscopic hay particles [1], which went away when manufacturing switched to fully sealed vats.
There is always, always undocumented steps and unknown implicit assumptions involved in any manufacturing process. No matter how good the documentation is, you need the practical experience.
And that, in turn, is also why the US is producing so much military surplus - should there ever be a full blown war with Russia or China, or there be any other need for a massive invasion land war for whatever cause, there is a shit ton of stuff on stockpile and the production can be rapidly scaled up by experienced personnel training fresh recruits. That would be outright impossible to do if there were no experienced personnel.
A growing business can afford to create a new production line to replace the aging one without dismantling it. TSMC doesn't have to wind down production of their previous node to make a new one.
Just because something works doesn’t mean it’s optimal. If you can get the same output with a cheaper or faster process, there’s a clear business justification.
However, you are right that engineer salaries have to be factored in. If expensive engineers are unfamiliar with an old process, it will take them a lot longer and the break even point will be pushed out farther.
They built this knowledge up only in the last 10-15 years though. It's absolutely possible to reverse this trend within a much shorter time period than this argument always implies.
How do you end up with 10-15 years? China is almost perfectly vertically integrated from raw materials to highly advanced finished products. Their industrialization started in the 70s. Getting to that level would require a lot of planning as well as the kind of hard constraints imposed on China through embargo.
We're not even getting back to that level, we've never reached iPhone level of manufacturing in the US or Europe.
It was not linear growth. The 70s and 80s were essentially write-offs. Things began to move in the mid-1990s and it has been a continual evolution and process over the last 30 years. Jiang Zemin, Hu Jintao and Xi Jinping's teams all did wildly different things for China. The vertical integration you mention was basically non-existent prior to 2020, it came about as part of the New Development Pattern (新发展格局) for the Inner Loop (国内双循环)
China today is virtually unrecognizable compared to even 10 years ago, though.
Rings a bell. Sounds simple, but being able to reliably make huge numbers of tiny little metal spheres with tiny tiny tolerance is a serious feat; being able to do that brings knowledge and experience that unlocks an entire level of the tech tree.
bollocks. there is profound unity and direction in US leadership, and has been for years -- it's all written by the Heritage Foundation and funded by a few oil and tech billionaires.
it's been like that since before George W Bush was lock-step with Fox News and a GOP led Congress.
the only difference is in 2025 the billionaires funding these things are as foreign as they are domestic
Citation very much needed ;) Even our current voting systems are far from being the best we can come up with in term of fairness. The population most certainly wants to remain in an environment humans can comfortably live in, somehow that's not what our democracies are selecting for these days.
Yet people actively decided as they did, while almost fully knowing where it leads. It could also have been worse, given progressing senility, an attempt to overthrow the government, etc. We are not in new territory after all, just a continuation. Tariffs are the new Mexican wall.
At least accept how your nation thinks on average; no weaseling around the simple facts of today's reality.
In fairness, while people do obviously want them, they also want all the current conveniences of modern life and more. Completely off the cuff, but I'm pretty sure the sum of those desires vastly dwarfs concern over longer-term environmental effects. Essentially, I think the average Joe prioritizes their job and lifestyle over nagging climate concerns, just like the government does.
I don't know what kind of citation you expect. It's clear that political participation was never more direct and organized than now in the age of social media. The fools and resentful who always were numerous have found a way to unite and bypass the establishment and educational filters which were effectively restraining politics before. All the cowards, short termists, wannabe dictators, conspiracists and anti-intellectualists who are being elected squarely and fairly represent the people who voted for them.
I'm not from the US and I'm not arguing that the current US gov wasn't elected fairly. It's not a law of physics that leaders of democracies do what the people want, they are selected by a system that was designed to estimate the preference of a population (more likely the preference of the ones who designed it). A democratic system designed differently would have a different outcome, there are good examples of what happens when the system favors consensus for instance [0][1], albeit not at the same scale.
This uh valiant defense of China misses my point entirely. People in democracies have no excuse hiding their personal responsibility behind flawed leadership.
By the same token, hiding comprehensively broken and undemocratic governance behind "it's the publics' fault, because they get to vote for one of two candidates owned by wealthy donors every 2 years" is the opposite of useful.
Musk apparently stood up brand new raw-to-finished-goods manufacturing for Starlink kits in 2-3 years in America/Texas. Non-trivial, but doable in niches at least, per a factory engineer:
"The main function of this site is to produce our standard Starlink kits. Right now, we’re producing 15,000 a day straight out of the factory.
Raw plastic pellets come in, raw aluminum comes in, and we make those into the Starlink kits and ship them right out to the customer zones."
This certainly is not a "raw to finished goods" plant, it's a typical Musk exaggeration.
The housing, maybe. Makes sense to produce that domestically at the volume SpaceX requires, less shipping costs because the dishes do take up volume.
But the PCB? Almost certainly not. With any luck they're making and assembling the PCB in house, but the components originate from a lot of suppliers and there are a lot of components on it [1]. Personally, I'd guess the latter, given that the PCB contains a lot of pretty novel tech [2] of which I'm certain that SpaceX wants to be able to iterate on as fast as they can, without having to wait for even a day or two for a new plane full of PCBs from China.
Regarding components, it's not like they are making chips in the same plant as a laptop plant in China either, are they?
I guess the question is which components function or cost benefit the most from tighter coupling, and which components (eg antenna) do you isolate to keep your secret sauce internally controlled.
So they form the plastic (already processed) using machines they've imported, and then put pre-populated PCBs with components made in China inside them? Hardly soup to nuts manufacturing.
I've worked in a niche assembly line in North America where we populated some of the board components in-house, but they were etched in batches off-site.
Wars are won or lost in 5 years, not 15. I agree that it's possible to reverse the trend, but we have to decide that we want to independently of a physical conflict, before someone else teaches us that deindustrialization is not advancement. Otherwise the lesson will come too late.
This. The century of humiliation, when Western powers came knocking on their door, is etched into the Chinese cultural psyche. East Asia was home to mature nations that learned the hard way that self-determination is fundamentally tied to your ability to defend your interests. This is especially the case for China, which saw itself as the center of the universe.
The US won the cold war and came to believe they were untouchable. Private interests don't care about nation states, and there was more money to be made by selling the foundation of the West's security than there was in preserving it, especially since the enemy had been vanquished.
However, the hardware situation you described sounds very brittle to me. If the machine shop is so tightly constrained and error-phobic, that sounds like there's very little space of tinkering, exploration or innovation.
Unless that was your overall point, that capacity in hardware manufacturing has rotted away to the point where things are hanging on by a thread.
"If the machine shop is so tightly constrained and error-phobic, that sounds like there's very little space of tinkering, exploration or innovation."
This is the opposite of brittle.
You say this as if those things are desired here. Those things would be a net negative to a well known production process for complex parts.
After years, that process has been refined to basically the limits of the machines and the physics involved, to optimize cost vs speed.
There is no "tinkering" or "innovation" necessary, and it would be highly detrimental. The experimental part is done until a new machine might provide some benefit (Often this is done by the manufacturer trying to sell them). Then you would test it out on that machine, not fuck up an existing well-running process.
Also - not everything requires improvement or tinkering. Some things are just done. Even if you could make them slightly better, it's not worth the overall cost over time for everyone. Being "better" is not enough, it has to actually be worth being better. Even things that are worth it, if you want customers to use your new thing, you have to support their old thing, even if that's painful or annoying for you.
This is something that lots of ecosystems used to know (Fortran is a good example, which is why NETLIB code from the 70s is still in wide use) but some newer ecosystems can't understand.
'brittle' here, I interpret as: not simple to restore, the knowledge to get them stood up again is brittle. A bus factor of one, to get back in SWE parlance.
If that factory burns down or a forklift crashes into the machine, it might be gone with no chance of recovery because the knowledge is gone.
It is brittle, or at least it's got a limited life. When you don't have these things, you lose the knowledge that set up the system in the first place, and you can be SOL when something breaks. I'm not saying just change things willy-nilly, but if you don't have an active process of understanding and interacting with the way that your factory is set up, you're going out of business, you just don't know when.
This is fascinating. I really don't know much about the world you're describing, so thank you for sharing your perspective.
Don't customer needs change over time? How would one adapt to shifting demand, or new materials becoming available, or old materials going out of supply?
Starrett doesn't really compete on price, as evidenced by the fact that this is a $95 item whereas the cheap alternatives go for closer to $10 on Amazon. So they're probably not making or selling very many of them. But they sell enough to make it worth keeping them in stock, and eventually they'll run out so they'll need to make new parts. Assuming low volume (I say this just in case I've accidentally picked the one weird thing that does sell like hotcakes), they're not going to spend any engineering time evolving that design. The input materials aren't going to stop being made. It is what it is, it does what it does, some people buy it, and so the name of the game becomes how do you make that specific thing they want with the least overhead? You use the same tooling you've used for the last 50 years. When you need a new batch of parts, you pull out that tooling, stamp out a bunch of leaves, and put the tooling away until you need it again.
There are many many manufactured items that fall into this category.
For those not familiar, Starrett has a reputation of quality. If you want the best you buy Starrett and pay the price. Often those Amazon alternatives are good enough, but often they have minor usability issues such that they are not as nice. Sometimes those Amazon alternatives are wrong in ways that matter and they can't be used at all.
I have a couple of Starrett items only because I lucked out at machine shop auctions and they came in boxes with other stuff that the auction house couldn't be bothered to sort.
I'm not a professional, I'm a metalworking hobbyist and the cheap imported electronic tools are more than good enough for me. However, my Starrett Dial Test Indicator is like jewelry, it's so beautifully well made. My cheap Chinese mechanical DTI is probably almost as accurate, but one is obviously far better made than the other.
> How would one adapt to shifting demand, or new materials becoming available, or old materials going out of supply.
That's very unlikely. New materials would require the company requesting the part to reengineer it, recertify it, or at least retest it. But even still we're not coming up with materials that are a significant improvement in most fields. Aerospace, sure. It can be worth it to iterate and improve. Most things, a part that's worked for 50 years will keep working and will be happily profitable in maintenance mode. Those customers want reliability, not to test some improvement on a part that has negligible impact in the overall system.
And the common metals (gears are typically steel, maybe a yellow metal) are made in such large amounts that new materials are going to cost a heck of a lot more. So the customer is going to wreck their profit while the machine shop probably isn't going to have to change their process that much.
There definitely is innovation in machining. New processes are making tighter tolerances more achievable or material removal faster. But to the top commenter's point (who showed me how to use a benchtop lathe over a decade ago), the capital investment for a new machine, plus the labor of duplicating all of your work, plus the unknown maintenance costs, etc. etc. etc., just doesn't make sense when Moore's law doesn't apply.
The ecosystems are an approximation of the people that run them. The ecosystems want to get rich quick and cash out with no regard for economic sustainability in the medium or long term because that's what the people who run them want.
> not everything requires improvement or tinkering. Some things are just done.
For sure, but how do you know?
If it's only via:
> The experimental part is done until a new machine might provide some benefit (Often this is done by the manufacturer trying to sell them). Then you would test it out on that machine, not fuck up an existing well-running process.
...then I worry about the efficiency of improvement. Sure, manufacturing equipment salespeople definitely are in touch with what consumers want ("Everyone is buying lamb now, buy our new breed of high-birth-rate sheep!"), but that's under the assumption that manufacturers never improve/iterate on their own processes ("Our farm is competitive because we've found that feeding sheep our special high-protein diet increases birth rates").
Rather than relying on the consumers-experimenters-manufacturers game of telephone, it seems likely to me that many manufacturing improvements have been driven by marginal tweaks/improvements made on the factory floor.
In actual engineering, one can work out the theoretical limits (strength, expansion, etc.) and measure the current product's performance against those limits. A new widget-making machine or process cannot imbue widgets with physics-defying properties. Any fundamental improvements can only come from outside the process, such as new alloys; but that would be an entirely different product, not the one you've been selling for 40 years that your customers trust and love.
If you don't have a very good marketing department, I'll still kick you out of business if I can double or triple the number of widgets I can make, because I started with the same machines you did - but I upgraded them with better controls, attached a few robot arms, and now run a lights-out widget factory tended by a fraction of the workforce you employ while you reminisce about the good old times...
Well, I would suggest if a thing is around that long and still does the job, it’s close enough to done. Something going missing in the pushback here is this is a physical machine shop. My grandfather was the shop foreman for a jewelry maker and he was intensely proud of the fact he was the one person on the floor who still had all his fingers. Intact. Different jobs have different ideas about good Developer Experience.
Improvement is usually done via competition. Sometimes the competition is price based, and sometimes quality based. In the best of worlds, both.
For example, there are a ton of cheap crappy woodworking tools. Think Stanley etc. They barely do the job if at all. Then there are a group of vendors like Wood River that constantly create newer tools that are much more expensive than what you find in a big box tool store. And then farther up the food chain are vendors like Lie Nielsen who craft luxury tools that are amazing to use.
This market segmentation extends to most tools; someone like Woodpecker comes up with a ton of clever tools for marking/measuring etc for woodworking, then others copy them. Oldest story in capitalism.
The manufacturing improvements in this process are non-stop. For some really good examples in consumer electronics, read "Apple in China" to see how China transformed into a power house in a relatively short amount of time.
> If the machine shop is so tightly constrained and error-phobic
Isn't the entire point of a machine shop to be these things?
> capacity in hardware manufacturing has rotted away to the point where things are hanging on by a thread.
You cannot make a profit on a manufacturing line that is not being utilized. Keeping spare tools around and functional just in case is a very expensive insurance policy.
Semiconductor manufacturing follows these rules as aggressively as possible. The entire line is built based on the speed of the highest cost tools. There are cases where having redundant tooling would definitely prevent some scrap events, but the premium on this options contract is never worth it on average.
> However, the hardware situation you described sounds very brittle to me. If the machine shop is so tightly constrained and error-phobic, that sounds like there's very little space of tinkering, exploration or innovation.
The technical term for that is "the real world". It's a moment of perspective on just how weird software people are, in that they don't accept mucking around as expensive and dangerous.
I don't think "mucking around" is the correct perspective there.
It's hard to dispute that most, if not all, of the recent innovations in manufacturing concern making production chains more modular and easier and cheaper to modify, which you could see as bringing manufacturing closer and closer to software engineering - and this will probably be even more true in the years to come.
Large-scale automation using mostly wireless technology, easily reconfigurable pick-and-place machines and robot conveyors, cheap additive manufacturing, easy-to-use and cheap CNC machining with precision that was until recently limited to very expensive models - we are quickly getting to a point where configuring a mostly automated short run is both manageable and cost-effective, provided you have invested in the tooling and have the engineers able to put it in place efficiently.
I think that when people talk about bringing back manufacturing, most imagine a Ford Model T assembly line from a century ago, when the norm is quickly becoming SpaceX-like pacing. That's basically what you are competing against in Southeast Asia, and it sadly has far less need for an uneducated workforce than many expect.
Do you have some references for the pick-and-place and other reconfiguration things you mentioned? I've been out of this space for a while, but last I checked these were still incredibly challenging things to get right.
I'd also like more comprehensive write-ups on such topics but either I haven't found the right sources yet or all the people who know how to set up and keep modern fabrication infrastructure going are too busy raking in the cash and making stuff. ^^
If you like visual media, the "Strange Parts" YouTube channel is an interesting source for glimpses into modern, mostly Chinese factories: https://www.youtube.com/@StrangeParts/
Since you're asking about pick and place specifically, https://www.opulo.io/ is an interesting example of how far/cheap you can push such machinery (and the design in and of itself is interesting from a manufacturing point of view). Not all that relevant from a mass-production point of view, though.
That sounds catchy but I think it doesn't survive further inspection. People mucking around with machines and processes were rather instrumental in creating lathes, steam power, rockets, computers, looms, software, CNC-machines and all those other puzzle pieces we have available to make stuff. They are also instrumental in developing those things further.
I'm also kind of curious as to know what kind of machine shops you base this on. Most production companies, labs and even small fabricators I've seen have continued to develop and to optimize their infrastructure and processes. To take the numbers discussed here: 50 years ago, (C)NC machines, CAD and CAM were in their infancy. And that stuff certainly has changed some things in the world of fabrication.
Machine shops and serious software shops don't do their mucking about in prod. Any machine shop experimentation that takes down the production line is like Google or Meta going partially or fully offline - which has happened - but it's also financially painful, so they do all they can to avoid it.
Sure - I guess this is generally true for most work domains, not just machine shops and serious software shops. However, the argument I was responding to was that there is no mucking about in the "real world" and that there is this difference between mucky software people and the serious creators of real stuff. Which I don't agree with.
btw: If we understand "machine shop" as a mass production environment with modern, integrated production lines, it is my anecdotal experience that there is a massive amount of muckery and fuckery involved in getting such an environment to run (usually called "integration" or some such which probably looks better on business cards). There's also a good chance that over the years - or decades - different people will engage in further iterations of the muck-pile to modify the system for new requirements from high on up or weird edge cases, to replace components that are no longer available with other stuff or to do whatever else the day might call for.
I don't think it's weird, it's just a feature of their/our tools. For software people, experimentation is cheap and easy. Version control means rollbacks are easy and fast. If you do break something, completely rebuilding the application from scratch is something that happens dozens of times per day anyway. When trying a new tool, it arrives with almost no lead time and often at zero cost, so the only price is a few person-hours of work.
> However, the hardware situation you described sounds very brittle to me.
It is very brittle.
The situation described is what happens when there is significant loss of knowledge, little pressure to improve productivity, and low product turnover. You start to fear changing things because you doubt you would be able to get back to the previous state. That's a huge red flag, because you are one unexpected incident or failure away from a very difficult situation.
That's why someone mentioned process knowledge in another thread. If you have mastery of the process required to setup a manufacturing chain, you are far less afraid of changes and that's indeed key to being efficient and innovative.
But the original commenter is also right that volume is key here. If your volumes are so low that a short unavailability or a small number of failures is life-threatening, you simply don't have the breathing room to operate properly.
You don't tinker, explore or innovate live in prod with the root account either.
There are general purpose machines that you can make new parts on, and you open a pilot plant if you want to experiment with new manufacturing techniques.
> If the machine shop is so tightly constrained and error-phobic, that sounds like there's very little space of tinkering
For plenty of industries, margins dictate that this is the desired outcome. The goal is to optimise output, not react quickly to changes.
There are factories that work to order and can change to adapt to customer needs. These are fewer and further between, and tend to be more expensive as they aren't (by design) able to take advantage of economies of scale.
> If the machine shop is so tightly constrained and error-phobic, that sounds like there's very little space of tinkering, exploration or innovation.
for many machine shops the level of physical risk is > 0, often by a large amount.
making widgets for X means handling large quantities of red hot metal; even simple stuff that's easy to get your hands around often shoots tons of oil, gas, and metal shavings in volumes that could hurt or cripple people.
if my dev VM gets borked I reboot or revert it, but factories aren't so simple
I find your perspective to be very software centric, and I expect many people who work in heavy industry to have a very different perspective about this.
I was on the implementation end of a considerable amount of industrial automation and technological advancement about 10 years ago. When we were on site, the consequences of a mistake started at the death of a team member. There were a plethora of things that could kill you horribly: falls, hazardous environments, rotating equipment, etc.
Yet we all survived overhauling processes in hundreds of plants. Working in hazardous environments isn't untenable, or even particularly difficult to do safely. In fact we worked at a much faster pace (with fewer mistakes) than the corporate world I work in now.
Nothing has changed; people behave the exact same way now as they did then. People value longevity and quality only when the pace of innovation is slow.
Rapid innovation by definition comes with rapid change. Rapid change does not always mean planned obsolescence or just poor quality.
In 1975 (50 years ago, when the tooling you cite was built), nobody would want to fly in a 10- or 20-year-old aircraft; today we don't care how old the airframes we fly on are.
The best recent example is smartphones: in the early 2010s everyone updated their phones almost every year, lining up around the block on release day. Today it is maybe once every 3-4 years; there is very little reason to do it more often. The incremental changes are not meaningful, and devices have become a lot more reliable and rugged, and of course more expensive.
We do value quality if features are not going to improve much.
> In 1975 (50 years ago, when the tooling you cite was built), nobody would want to fly in a 10- or 20-year-old aircraft; today we don't care how old the airframes we fly on are.
Given the DC-3 is still(!) in service, and there were surely a ton more of them flying in 1975 than today, I'm not sure that's true. And that's far from the only example of a more-than-10-years-old-in-1975 aircraft that was certainly still in wide use in 1975.
Any big shift around then was probably because of the development of high-bypass turbofan jet engines. Not so much driven by "old airframes seem risky" as "pre-high-bypass jet engines are enough more-expensive to operate that airlines will abandon them rapidly". Those engines went into wide use in the 1970s (developed in the '60s). We (demonstrably) had "reliable, long-lived airframe" figured out by the '30s, with some refinement through the '40s but nothing that rendered those '30s models necessarily obsolete (see again: the DC-3, a 1936 design). More-efficient subsonic jet engines were what caused turn-over in a certain segment of the market in the '70s, not so much "I won't trust an old airframe".
The point was not about any specific technical or economic factor. It was to illustrate that rapid evolution in comfort, speed, and safety[1] means people will quickly adopt newer tech, won't be concerned about longevity, and will actively devalue slightly older products.
Only when the improvements become marginal does longevity start to matter. The planes developed from the 90s to now have a lot to offer operators but not much new[2] for passengers; we are still designing (not just operating) new 737 variants, after all. Most people cannot tell the age of a plane if the cabin has been refreshed.
This preference for newer generations of tools has little to do with previous generations having different values, as some nostalgically like to claim; it simply reflects technology maturity. That was my point.
---
[0] As impressive as the 100-year history and the longevity of a pre-war design have been, we have to keep in mind that the dynamics of an unpressurized plane operating at less than 300 knots with a service ceiling under 25,000 feet are hardly comparable to those of a modern passenger aircraft cruising at Mach 0.90+ for 12-15+ hours daily at 37,000 feet, going through tens of thousands of pressurization cycles.
[1] Airframes perhaps were not a popular safety concern directly; pressurized and quieter cabins were major selling points.
Air safety regulations are famously said to be written in blood. It is undeniable that there was a massive drop in fatalities in the 80s and 90s compared to the 60s and 70s, after safety became a concern and everyone held both operators and manufacturers accountable.
[2] Improved range and better engine reliability, which let us fly longer ETOPS segments on twin-engine aircraft, do benefit passengers indirectly.
> Changing anything was a dangerous thing to do, because you might cause jobs that have known and reliable setups
I am reminded of some of the very finest semiconductor plants. Where parts could in theory be swapped out and replaced, but to do so would break everything. Mirrors aligned to sub-nanometre precision. Lasers and optics where picoseconds matter. Where parts are effectively custom-tuned for this machine only, allied with all these other parts also custom-tuned for this machine only. The US has a challenge on its hands to develop, within the US, everything and everyone needed to simply get these systems working.
I'm reminded of how the New York City subway continues to rely on parts that it essentially cannot replace... similarly how the rail system in India essentially involves maintaining what the British left behind since a lot of the pieces can't easily be upgraded or replaced.
To be clear, in both cases upgrades and maintainability would be possible, but they require a concerted effort to modernize with a long-term mindset.
There has been a lot of cheap stuff throughout history that was definitely not made to last. I had paper dolls as a child. So did my mother. Probably her mother too - I'd ask, but she's dead.
How long do you expect a car to last? 100k miles (160k km), at least? It wasn't all that long ago that they were dead at 100k.
They used to add talc and sawdust to bread because they were cheaper than flour. Talk about chasing a quick buck. I very highly doubt they even cared about the next quarter. More realistically, things were built using the cheapest parts they could to make what they wanted - and they wanted things that would sell. Sure, some made things nicer than others but that's no different now.
Most of the things that we have now - old fridges, chairs, and so on - are flukes. They survived despite the odds.
Would most people even know if an MP3 player was built to last? How about an ink pen?
I think a lot of people conflate "built to last" with "ability to be repaired by yourself."
To continue the car analogy, I could replace almost any part easily, with simple tools I have at home, on my 1990 GMC 1500 truck. Parts are plentiful and cheap, plenty of room to work on the engine, nothing is hidden inside black boxes. It's got 280k miles on it and still running great.
Contrast that with my 2020 Subaru Crosstrek hybrid, which is much more difficult to work on: I can't even fit my hands in to access anything that's not on top of the engine, and other repairs require full engine removal and specialized tools. There are more electronics and more completely sealed systems.
The same can be said about some household appliances, and even computers. Not only were things generally more repairable, but repair didn't require specialized tools in most cases; we didn't have to first melt glue, resolder SSDs, etc.
My old Compaq Armada may not have been built to last at the time, but it was certainly stupid easy to repair and replace every single component in it.
Good furniture was inherited and lasted a very long time even across generations.
For a while I still used a wonderful and thick winter (loden) coat originally owned by my great-grandfather, from the early 20th century.
Dishes and silverware. Toys, books. Tools. A modern hammer looks much more fancy but it works no better than an ancient dwarven-made one that gives you +10 strength when used. Musical instruments. Some kitchen utensils, especially ones used for traditional cooking and food preservation methods.
Boots and shoes! They were repaired repeatedly (that also means they were easily repairable, not so easy with current shoes and their materials and layers).
Computing hardware has always been on 3-5 year depreciation schedules. Not because it doesn't last - most of it will last decades - but because the next generation is so much better that your total costs for the next three years are lower if you buy new gear and throw the old stuff away.
And that's not just because of the rapid advances, but also because servers are expensive to run relative to their purchase price, and setup costs are cheap. For machine tools, setup costs can be substantial, and the cost of keeping an old machine around is small.
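To make the trade-off concrete, here's a back-of-envelope sketch; every figure in it (monthly opex, purchase price, performance ratio) is a made-up assumption for illustration, not real pricing data:

    import math

    # Back-of-envelope sketch of the depreciation argument. Every number
    # below is an illustrative assumption, not real pricing data.
    MONTHS = 36                    # planning horizon
    OPEX_PER_SERVER_MONTH = 150.0  # assumed power + cooling + rack + admin, USD
    WORKLOAD = 3.0                 # total work needed, in "old server" units

    def total_cost(purchase_price, perf_per_server):
        # purchase plus three years of operating cost for enough
        # servers to cover the workload
        servers = math.ceil(WORKLOAD / perf_per_server)
        return servers * (purchase_price + MONTHS * OPEX_PER_SERVER_MONTH)

    keep_old = total_cost(purchase_price=0, perf_per_server=1.0)     # already paid for
    buy_new  = total_cost(purchase_price=8000, perf_per_server=3.0)  # assume 3x faster per box

    print(f"keep old fleet: ${keep_old:,.0f}   buy new: ${buy_new:,.0f}")
    # keep old fleet: $16,200   buy new: $13,400 -- under these assumptions
    # the new hardware wins over three years even though the old machines
    # still work fine.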
We still do. I think the service life of a Toyota Land Cruiser is still 25 years. As a software developer, I've written control code for instruments that are expected to be on the market for at least 15-20 years from initial release, and we have to plan support and spare parts accordingly.
It's just that the fast-paced, built to last 6 months stuff gets all the good press.
I have a 4790K-based machine standing unused by my desk (going to get the data off it and then get rid of it). Today's processors are so much better that if you used this one you'd be losing money on power.
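For anyone curious, here's the rough arithmetic; the wattages, the electricity price, and the implied replacement are all assumptions I'm making up for illustration:

    # Rough sanity check of the "losing money on power" claim. The wattages
    # and the electricity price are assumptions, not measurements.
    KWH_PRICE = 0.30     # assumed residential rate, EUR per kWh
    HOURS_ON = 8 * 365   # machine on 8 hours a day for a year

    old_watts = 150      # assumed draw of an aging 4790K desktop under light load
    new_watts = 25       # assumed draw of a modern mini PC doing the same work

    old_cost = old_watts / 1000 * HOURS_ON * KWH_PRICE
    new_cost = new_watts / 1000 * HOURS_ON * KWH_PRICE
    print(f"old: {old_cost:.0f} EUR/yr, new: {new_cost:.0f} EUR/yr, "
          f"difference: {old_cost - new_cost:.0f} EUR/yr")
    # old: 131 EUR/yr, new: 22 EUR/yr, difference: 110 EUR/yr --
    # a few years of that gap pays for the replacement machine.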
> The rush to fill brand new high energy intensive data centers with hardware that has commercially useful lifetimes measured in months (instead of decades for machine tools) seems quite short sighted to me.
There's a sort of collective ADHD where we as a culture or economy collectively chase the latest shiny bauble in the hopes of getting rich without having to expend any effort. It often ends badly for the economy and then we go through a phase where we collectively are forced to slow down and reflect on our mistakes vowing not to repeat them... only to do so a decade or two later. The older you get, the more you notice this pattern. We did it in 2000 with the dotcom implosion and then again in 08 with housing and shady mortgages. This time it's overbuilding AI; putting way too much capital into infrastructure that has short useful lifetime.
Arguably this boom-bust cycle is more of an intentional feature than a bug. Thanks to the cycle, private actors are able to capture the benefits of the boom, but when the bill comes due the downside falls on the general public or taxpayers at large - basically Ersatz Capitalism or Lemon Socialism. 08 is the clearest example, but they're all generally fueled by credit cycles creating exuberance.
There's also an element of lost collective memory, via generational change providing a new supply of optimists.
I don't want to be picky, but there is still a lot of value left in "not modern" tasks, like video encoding/transcoding. If somewhere the trickle-down effect is real, then it is computing hardware. Take Hetzner's server auction. If the hardware is physically deployed and running, you just need to find appropriate payloads/customers. https://www.hetzner.com/sb/
We have a box at work where employees can bring in hardware they're getting rid of, along with hardware we're throwing out that we don't need anymore.
It has a pile of GPUs that are completely obsolete for any task: they use way too much power, have a large form factor that burns up a PCIe x16 slot, are loud, some need extra power cables, they lack driver support on modern operating systems, and in return for all that they don't perform as well as something much better you could get for $100.
Value on eBay seems to be about $10-$15, mostly for people with a retro computing hobby or people removing semiconductor components for other purposes.
An obsolete data centre isn’t worth much either. (We have a small one made from equipment being liquidated from local data centres that have been upgraded.) The power consumption is too high and it is not set up for efficient HVAC for modern ultra high power draw workloads.
The key is to calculate right and go for mainstream hardware. If you are a hosting company with diversified use cases, you have plenty of room to downcycle hardware until it breaks. If you are operating under limited space in a field with bleeding-edge performance targets, doing this is not viable. There are many solution providers that will buy your outdated gear. I'm not saying that old hardware is great or a cash cow in general, but its lifetime can usually be doubled or tripled if the right use case can be found and you are the owner.
Side quest: Virtualized instances at cloud providers never get upgraded unless recreated. I bet there are millions of VMs running for years on specs and prices of 2018-2020.
The V100 is ~8 years old and AFAIK mostly not that common anymore, but the A100 is ~5.5 years old now and is still very commonly used, it's maybe the most common HPC cluster GPU. On the consumer side, 3090s are still very popular, representing a good balance between cost, performance and efficiency (this is mostly due to 4090s and 5090s being much more expensive).
Exactly. I'm a mechanical engineer and I still have tools given to me by my machinist great uncle from WWII that are not only functional, they're identical to a new tool I'd buy today for that purpose, from the same manufacturer. This is the difference the OP was highlighting.
We've also been doing machining in the modern sense for at least a hundred and fifty years. The GPU as a concept is about 30 years old, and in the modern sense much younger than that.
Innovation occurs on a sigmoid curve, we're still very early in the sigmoid for software and computer hardware, and very late in the sigmoid for machining, unless you include CNC, in which case, we're back to software and computer hardware being the new parts.
A better example would be the tape out and lifetime for semiconductor fabs, which are only about 70 years old and have lifetimes measured in the decade range.
Interesting thought, but can the sigmoid curve really represent the developments in SW and/or HW? To reach the saturation point we'd need to be able to define the system up to the point where there are almost no more unknowns (variables), no? I suspect that isn't possible in the context of {SW, HW}.
Are those tools functional? Have you ever checked? I'm not sure what tools you are talking about, but likely some of them are measurement tools, and those can seem to work perfectly while giving the wrong measurement. Others might be cutting tools that still cut, but are a bit dull, and if you don't know how to check you won't realize the cuts are not as good as new anymore (or maybe you have sharpened them and they now cut the wrong profile...). There are many ways a tool can seem functional but be wrong.
> I'm not sure what tools you are talking about, but likely some of them are measurement tools
If you don't know something for sure, it's best to not make assumptions. We're not LLMs and don't need to spit out something confidently without understanding it.
I have enough of a machinist background to know that either he has measurement tools, or he has no clue what the tools are. There is a tiny chance they only made parts to fit each other without measuring sizes - but that seems very unlikely and can be discounted without even knowing them.
They are things like:
- measurement tools that can be checked easily against measurement standards (it's taught as good practice to check this on each use anyway)
- files
- transfer punches
- feeler gages (again, easily checked)
- bore gages
- gage pins
- 123 blocks
- on and on...
When someone says machine tools, I assume that includes the measurement tools. It doesn't have to, but you can't do much work without measuring. (Unless you build to fit - which works okay, but there is a reason it went obsolete 100+ years ago.)
No they don't. The 3-year number came from some random person on the internet who claimed to be a Google employee and was denied by Google, as you can see in any of the articles about this claim:
> Recent purported comments about Nvidia GPU hardware utilization and service life expressed by an “unnamed source” were inaccurate, do not represent how we utilize Nvidia’s technology, and do not represent our experience.
No they don't. 5-8 years is common. The source for the 3 year number is an unnamed random person claiming to be a Google engineer, and Google specifically reached out to all the journalists publishing that claim with this response.
> Recent purported comments about Nvidia GPU hardware utilization and service life expressed by an “unnamed source” were inaccurate, do not represent how we utilize Nvidia’s technology, and do not represent our experience.
If GPU demand growth continues to outpace GPU production growth, that is necessarily going to change. Older GPUs may not be cost competitive to operate with newer GPUs, but when the alternative is no GPU...
And you're confusing all of these terms. The "3 years" is the standard warranty, which typically gets extended to 5 years (and maybe 7 or 8, depending on the vendor). This is consistent with Dell PowerEdge, HP ProLiant, etc. servers. Nvidia GPUs, on the other hand... there could well be different terms, but I don't know, because they are not typical purchases for 95% of the companies that make server hardware purchases.
Source - I regularly work with IT departments and review their contracts as part of diligence.
Nobody in this thread has mentioned warranty periods. The original comment was about the hardware lifecycle, claiming the hardware was only commercially viable for months. The 3 year number that people are tossing around came from a discredited interview that made the rounds on social media a while back: https://www.tomshardware.com/pc-components/gpus/datacenter-g...
You're proving my point. Hardware's lifecycle is based on warranty - that's it.
The parent comment was:
> Data centre hardware is more like 3 years.
That comment is actually referencing the standard warranty period (which is indeed typically 3 years), which may or may not be consistent with the useful life of server hardware (which is much more subjective and varies based on appliance).
A lot of these tax codes were written when computers were outdated within a year. The turnover of computers slowed a lot in the last two decades, but the tax codes haven’t changed.
> The rush to fill brand new high energy intensive data centers with hardware that has commercially useful lifetimes measured in months (instead of decades for machine tools) seems quite short sighted to me.
The only way "number goes up" capitalism continues to work is with planned obsolescence and things that need to be replaced regularly. This is a feature of the system, not a flaw. Nvidia (and all of their investors) love the fact that the stuff they make now will be outdated or broken in a few years.
If things last forever and never need to be replaced, the only way to continue increasing profits is to have more people buying them. And the global population appears to be peaking, at least in Western countries, so that's not going to happen.
Is it sustainable? Probably not. But everyone seems to have their head buried in the sand about the obvious dangers of what we're running into.
> The machine tools were all made 50+ years ago. Changing anything was a dangerous thing to do, because you might cause jobs that have known and reliable setups that are done a few times a year in quantity, to fail, erasing the profits for the job, and possibly losing customers.
> The rush to fill brand new high energy intensive data centers with hardware that has commercially useful lifetimes measured in months (instead of decades for machine tools) seems quite short sighted to me.