There are many use cases for which a decade-old computer is still perfectly serviceable, and even where it isn't, that computer can be repurposed for one of the use cases where it is.
Moreover, we're talking about televisions and old Macs. TVs with higher resolutions might come out, but lower resolution ones continue to be sold new (implying demand exists at some price), and then why should anybody want to replace a functioning old TV with a newer one of the same resolution?
Much older computers continue to be used because they run software that newer computers can't run without emulation (which often introduces bugs), or because they have older physical interfaces compatible with other, often extremely expensive, older hardware.
If people actually wanted to replace their hardware instead of fixing it, they wouldn't be complaining about the inability to fix it.
>There are many use cases for which a decade-old computer is still perfectly serviceable, and even where it isn't, that computer can be repurposed for one of the use cases where it is.
It depends. Older computers usually guzzle power, especially if you look at the absolutely awful Pentium 4 systems. You're probably better off getting a RasPi or something, depending on what exactly you're trying to do. Newer systems have gotten much better with energy efficiency, so they'll pay for themselves quickly through lower electricity bills.
>TVs with higher resolutions might come out, but lower resolution ones continue to be sold new (implying demand exists at some price)
We're already seeing a limit here. 8k TVs are here now, but not very popular. There's almost no media in that resolution, and most people can't tell the difference from 4k at normal viewing distances.
For a while, this wasn't the case: people were upgrading from 480 to 720 to 1080 and now to 4k.
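To put rough numbers on the 4k-vs-8k point, here's a quick sketch comparing pixels per degree of visual angle against the ~60 px/degree a 20/20 eye can resolve. The 65" screen size and 2.5 m viewing distance are my own assumptions, not figures from this thread:

```python
# Rough sketch of why 8k is hard to distinguish from 4k: pixels per degree
# of visual angle vs. the ~60 px/degree a 20/20 eye can resolve. Screen
# size and viewing distance below are assumed, not taken from the thread.
import math

def pixels_per_degree(diagonal_in, h_px, v_px, distance_m):
    # horizontal width of a 16:9 (or any) panel from its diagonal
    width_m = diagonal_in * 0.0254 * h_px / math.hypot(h_px, v_px)
    h_angle_deg = 2 * math.degrees(math.atan(width_m / 2 / distance_m))
    return h_px / h_angle_deg

for name, h, v in [("1080p", 1920, 1080), ("4k", 3840, 2160), ("8k", 7680, 4320)]:
    print(name, round(pixels_per_degree(65, h, v, 2.5)), "px/degree")
# ~60 for 1080p, ~120 for 4k, ~240 for 8k: 4k already saturates typical acuity here.
```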
>and then why should anybody want to replace a functioning old TV with a newer one of the same resolution?
They probably don't; if they're upgrading, they're getting a higher resolution (lots of 1080 screens still out there), or they're getting a bigger screen. It's possible they might want newer smart TV features too: older sets have probably had software support dropped and don't support the latest streaming services, though you can usually just get an add-on device that plugs into the HDMI port, so this is probably less of a factor.
> Older computers usually guzzle power, especially if you look at the absolutely awful Pentium 4 systems.
Even many Pentium 4-based systems would idle around 30 watts and peak at a little over 100, which is on par with a lot of modern desktops, and there were lower- and higher-power systems both then and now. The top end Pentium 4 had a TDP of 115W vs. 170W for the current top end Ryzen 9000 and even worse for current Intel. Midrange then and now was ~65W. Also, the Pentium 4 is twenty-two years old.
And the Pentium 4 in particular was an atypically inefficient CPU. The contemporaneous Pentium M was so much better that Intel soon after dumped the P4 in favor of a desktop CPU based on that (Core 2 Duo).
Moreover, you're not going to be worried about electric bills for older phones or tablets with <5W CPUs, so why do those go out of support so fast? There are plenty of people whose most demanding mobile workload is GPS navigation, which has existed since before the turn of the century and has been widely available for nearly two decades.
> For a while, this wasn't the case: people were upgrading from 480 to 720 to 1080 and now to 4k.
Some people. Plenty of others who don't even care about 4k, and then why would they want to needlessly replace their existing TV?
> They probably don't; if they're upgrading, they're getting a higher resolution (lots of 1080 screens still out there), or they're getting a bigger screen.
That's the point. 1080p TVs and even some 720p TVs are still sold new, so anyone buying one isn't upgrading and has no real reason to want to replace their existing device unless, say, it has a design flaw that causes it to catch fire, in which case the manufacturer should do a proper recall.
>The top end Pentium 4 had a TDP of 115W vs. 170W for the current top end Ryzen 9000 and even worse for current Intel.
You can't compare CPUs based on TDP; it's an almost entirely useless measurement. The only thing it's good for is making sure you have a sufficient heatsink and cooling system, because it only tells you roughly how much heat the chip puts out under sustained load, not what it draws in normal use. No one runs their CPUs flat-out all the time unless it's some kind of data center or something; we're talking about PCs here.
What's important is idle CPU power consumption, and that's significantly better these days.
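If you want to sanity-check idle draw yourself, here's a minimal sketch assuming a Linux machine that exposes RAPL counters under /sys/class/powercap. It covers the CPU package only, not what the whole box pulls from the wall; a plug-in power meter is better for that:

```python
# Minimal sketch: estimate idle CPU package power on Linux via the RAPL
# sysfs interface (assumes /sys/class/powercap/intel-rapl:0 exists; this
# ignores counter wraparound and measures the CPU package, not the PSU).
import time

ENERGY = "/sys/class/powercap/intel-rapl:0/energy_uj"  # cumulative microjoules

def read_uj():
    with open(ENERGY) as f:
        return int(f.read())

start = read_uj()
time.sleep(30)                              # leave the machine idle during the sample
watts = (read_uj() - start) / 30 / 1e6      # microjoules per second -> watts
print(f"average package power over 30 s: {watts:.1f} W")
```

On modern laptops that package number can dip well under a watt at idle, but on desktops the rest of the platform (board, drives, PSU losses) dominates, which is part of why wall measurements vary so much.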
>older phones or tablets with <5W CPUs, so why do those go out of support so fast?
That's an entirely different situation because of the closed and vendor-controlled nature of those systems. They're not PCs; they're basically appliances. It's a shitty situation, but there's not much people can do about it, though many have tried (CyanogenMod, GrapheneOS, etc.).
>Plenty of others who don't even care about 4k
Not everyone cares about 4k, it's true (personally I like it but it's not that much better than 1080p). But if you can't tell the difference between 1080p and an NTSC TV, you're blind.
>1080p TVs and even some 720p TVs are still sold new
Yes, as I said before, we're seeing diminishing returns. (Or should I say "diminishing discernible improvements"?)
Also, the 720p stuff is only in relatively small screens. You're not going to find a 75" TV with 720p or even 1080p; those are all 4k. The low-res stuff is relegated to very small budget models where it's really pointless to have such high resolution.
> What's important is idle CPU power consumption, and that's significantly better these days.
It isn't. You can find both ancient and modern PCs that idle anywhere in the range from 10 to 30 watts, and pathological cases for both where the idle is >100W. Some of the newer ones can even get pretty close to zero, but the difference between zero and 30 watts for something you're leaving on eight hours a day at $0.25/kWh is ~$22/year, which is less than the interest you'd get from sticking the $600 cost of a new PC in a 5% CD.
And many of the new ones are still 30 watts or more at idle.
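For reference, the arithmetic works out like this with the figures above (a 30 W idle delta, 8 h/day, $0.25/kWh, and 5% on a $600 PC):

```python
# Back-of-the-envelope check of the figures quoted above.
delta_w, hours_per_day, price_per_kwh = 30, 8, 0.25
kwh_per_year = delta_w / 1000 * hours_per_day * 365     # ~87.6 kWh
print(f"electricity: ${kwh_per_year * price_per_kwh:.2f}/year")  # ~$21.90
print(f"CD interest: ${600 * 0.05:.2f}/year")                    # $30.00
```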
> That's an entirely different situation because of the closed and vendor-controlled nature of those systems.
It's a worse situation, but if the complaint is that they abandon their customers long before the customer wants to stop using the device, they certainly fit that description.
> But if you can't tell the difference between 1080p and an NTSC TV, you're blind.
Being able to discern a difference and caring about it are two different things. If your use for a TV is to watch the news and play 90s video games then the resolution of the talking heads doesn't matter and the classic games aren't in 1080p anyway.
> The low-res stuff is relegated to very small budget models where it's really pointless to have such high resolution.
Which is the point. If you have an old 30" TV and no space for a 72" TV, you do not need a new 30" TV.
For most videos, the difference between 1080p and 4k ain't that large.
But for certain video games on a large screen, I can definitely tell the difference between 1080p and 4k. Especially strategy games that present a lot of information.
Btw, as far as I can tell modern screens use significantly less power, especially per unit of area, than the CRTs of old, so the power argument applies even if that CRT is still perfectly functional.
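A rough payback sketch, with the caveat that the wattages, price, and hours here are my own ballpark assumptions rather than anything from this thread:

```python
# Rough payback sketch for replacing a working CRT with an LCD/LED panel.
# All figures below are assumed ballpark numbers, not from the thread.
crt_w, lcd_w = 110, 30              # assumed draw for similar-sized screens
hours_per_day, price_per_kwh = 4, 0.25
replacement_cost = 150              # assumed price of a comparable new panel

savings = (crt_w - lcd_w) / 1000 * hours_per_day * 365 * price_per_kwh
print(f"savings: ${savings:.2f}/year, payback in {replacement_cost / savings:.1f} years")
```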
More recent processors can do more with the same power than older processors, but I think for the most part that doesn't matter. Most people don't keep their processor at 100% usage a lot anyway.
As I said in a sister comment here, you can't compare CPUs by TDP. No one runs their CPU flat-out all the time on a PC. Idle power is the important metric.