The capacitor that Apple soldered incorrectly at the factory (downtowndougbrown.com)
454 points by zdw 1 day ago | 193 comments





Well, today I learned to install one capacitor in reverse orientation on the PCB of a 34-year-old computer...

Definitely starting Wednesday off productively.


Well, until today I didn't even know a capacitor could have an orientation! So a more productive Wednesday than yours. In the entry-level electronics class I took decades ago, it was always treated as a component that works the same way no matter which direction the current flows.

Ceramic capacitors don't have polarity. Electrolytic ones do. Thing is, electrolytic capacitors have far higher capacitance for their size -- though also higher resistance.

It's something to check, but the polar ones should be clearly marked as such.


Electrolytic capacitors are kinda like lead-acid batteries in that they are polarized through manufacturing processes. A voltage is applied at the factory to anodize the anode with a thin oxide layer. For fun, I think it would be possible to buy a quality low-voltage cap and reverse its polarity in situ, which would remove the anodization from the new cathode and deposit a new layer on the new anode (the former cathode), hopefully without over-pressurizing it to bursting, albeit with a much shorter anticipated lifespan.

PSA: Electrolytic capacitors have a rough lifespan of 10 years. Any much older than that need to be checked out-of-circuit for ESR and then capacitance. Also, tantalums (historically) suck(ed). [0] Quality audio equipment from the 80's like a/d/s/ car amps used only ceramic caps and other over-engineered passives, and have the potential (pun intended) to basically last forever.

0. https://www.eevblog.com/forum/projects/whenwhy-(not)-to-use-...


Or much shorter, around two years, if it was part of the Capacitor Plague.

https://en.wikipedia.org/wiki/Capacitor_plague#Premature_fai...

   The normal lifespan of a non-solid electrolytic capacitor of consumer quality, typically rated at 2000 h/85 °C and operating at 40 °C, is roughly 6 years. It can be more than 10 years for a 1000 h/105 °C capacitor operating at 40 °C. Electrolytic capacitors that operate at a lower temperature can have a considerably longer lifespan. ... The life of an electrolytic capacitor with defective electrolyte can be as little as two years.
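
Those figures fall out of a common rule of thumb: electrolytic life roughly doubles for every 10 °C you stay below the rated temperature. A minimal sketch in Python, assuming that rule and continuous operation (an approximation, not a datasheet guarantee):

    # Rough check of the quoted numbers using the "life doubles per 10 degC below
    # rated temperature" rule of thumb (an Arrhenius-style approximation).
    def estimated_life_hours(rated_hours, rated_temp_c, operating_temp_c):
        return rated_hours * 2 ** ((rated_temp_c - operating_temp_c) / 10)

    for rated_hours, rated_temp_c in [(2000, 85), (1000, 105)]:
        hours = estimated_life_hours(rated_hours, rated_temp_c, operating_temp_c=40)
        print(f"{rated_hours} h @ {rated_temp_c} degC -> ~{hours / 8760:.1f} years continuous at 40 degC")

    # Prints roughly 5 and 10 years -- the same ballpark as the quoted figures.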

This is also why so many LED bulbs are shit, lots of heat in a small space full of electrolytic caps.

I recently read that if you are going to be using an LED bulb in an enclosed space, buy bulbs designed for the higher temperature; otherwise you WILL get premature failures from bulbs that would last for years in ordinary lamps.

https://duckduckgo.com/?t=lm&q=led+bulbs+enclosed+fixture+ra...


Alternatively, there are now much more efficient bulbs available. If they're passing A under the new EU Energy Label (from 2021) they'll barely be warm to the touch.

Intentional planned consumption/obsolescence by design. This class of problem is where under-regulation and lack of standards benefits only sellers and cheats buyers. PS: Also, Amazon should be required to test all of the electronic, safety, and food products on its site such that they can prove safety and standards conformance.

I am assuming you are an EE (like myself)... I have never designed a product with a built-in expiration, nor have I ever seen any app notes or write-ups on the engineering of it - something engineers love to do.

What I have seen done is cheaping out on parts in order to get the price as low as possible, because customers shop primarily on price.

Not to lash out, but it kind of hits a nerve for me, because people think we design products to purposely fail. Hell no, we try really hard to do the opposite, but everyone just loves to buy the cheapest shit.

The $25 LED bulb that will last for eternity will rot on the shelf next to the $3 bulb that will probably be dead in 6 months. And one more "they build these things to fail" complaint will be posted online.


To be fair this is hardly limited to EE and is the issue with the race to the bottom in all product categories. Make long-lasting, high-quality $100 pants? People prefer spending $10 on Shein.

Additionally, the issue is that as a consumer, it's not easy to differentiate between a quality markup and a greedy markup. I don't see the cap manufacturer on the box, so the $25 light bulb might last 10 years or it might last 6 months just like the $3 one. At least with the $3 one I can come back and buy another...


I agree with what you said - engineers do the best they can with the budget but the budget is small because people won’t pay for things that last - but it’s worth saying that any boards with electrolytic capacitors have an inherent built in expiration. Any product with rubber has an expiration. Any product with permanent batteries, glued or sealed assemblies, or no spare parts. Much of that is with the customer’s budget, sure. But these days, even among expensive things, nearly nothing is built to last.

Not LED light bulbs specifically, but...

"The Phoebus cartel engineered a shorter-lived lightbulb and gave birth to planned obsolescence"

https://spectrum.ieee.org/the-great-lightbulb-conspiracy


I seriously doubt it's ever a deliberate conspiracy in engineering, apart from shenanigans like what happened at VW; it's the net effect of product managers, accountants, and contract manufacturers who modify PCBs and BOMs after the design is passed off to them to save money on retail products. So it's likely unintentional but negligent, and it benefits the company. Except for some Samsung appliances made ~2010-2014, which seemed to fail just after their warranties expired. I suspect highly-optimized designs for "consumables" like incandescent lightbulbs and parts for cars use data to tweak design life, more often than not in their favor. And, with the pressures of multinational oligopolies and BlackRock/Vanguard/State Street, there is little incentive to invest $100M into a moderately-superior incandescent lightbulb using yesterday's technology that lasts 100kh and 5k cycles and sells for $1 more than the next one. Maybe if we (perhaps a science/engineering nonprofit thinktank that spanned the world and gave away designs and manufacturing expertise) had quasi-communism for R&D, we could have very nice things.

It's not my fault if other people are too dumb to comprehend TCO; I would buy the $25 bulb if it had a 30-year warranty.


> Except for some Samsung appliances made ~ 2010-2014 which seemed to fail just after their warranties expired.

And? That just sounds like they have good engineers. If you are designing a machine, you have a target lifetime. You'd obviously want the product to last through the warranty period, because warranty claims are a cost to the company.

Every choice of a component affects lifetime. Designers of mass-market products can't just use premium components everywhere -- the mass market will not pay steep premiums for otherwise equivalent products.

Value engineering and planned obsolescence are not the same thing, but they are often confused.

That being said, Samsung appliances suck and I hate them. Mine failed within warranty several times.

> And, with the pressures of multinational oligopolies and BlackRock/Vanguard/State Street.. there is little incentive to invest $100M into a moderately-superior incandescent lightbulb using yesterday's technology that lasts 100kh and 5k cycles and sells for $1 more than the next one.

It isn't that. It's pressure at the shelf that does it. Consumer behavior simply does not reward equivalent-feature products with premium components that claim (true or not) to have a longer lifespan. Unfortunately, they will buy based on their uninformed sense of quality first.

If you release a light bulb that is identical to the best selling one on the shelf, but claims 10x lifespan, your competitor will do something like gluing a weight in theirs, putting some marketing BS on the box, and will put you out of business. Consumers just don't pick products based on actual quality.


You're making a pretty awkward value judgement about what a "good" engineer is, but you're describing an unethical one with a bizword like "value engineering". I realize ethics are no longer understood by much of Western society because the culture teaches transactionality, worships trickle-down economics and greed, and hyperindividualism.

> It isn't that. It's pressure at the shelf that does it. Consumer behavior simply does not reward equivalent-feature products with premium components that claim (true or not) to have a longer lifespan. Unfortunately, they will buy based on their uninformed sense of quality first.

This is a failure of marketing and of the sales channel(s) and manufacturers to educate properly, not a failure of the customer.


A good engineer is one that has a job, doesn't put their employer out of business, and produces work that fulfills the requirements they're given.

Many people think there's some unethical conspiracy going on, and consumers actually want a product that lasts a long time, but companies are refusing to give it to them. But this is projection of individual preferences on to the market as a whole. Consumers want cheap shit that is in fashion, and their buying preferences prove this time and again. Maybe you want a 50 year old toaster in your kitchen, other people are buying products based on other factors.

If consumers really wanted to pay a premium for high duty-cycle equipment with premium lifespans, they can already do that by buying commercial grade equipment. But they don't.

If you are familiar with the history of home appliances, you'd probably come to appreciate the phrase 'value engineering'. Even poor people can afford basic electric appliances now because of the ingenious ways that engineers have designed surprisingly usable appliances out of very minimal and efficient designs.

If you look at ads for electric toasters 100 years ago, you'd see they cost over $300 in today's money adjusted for inflation. Thank god for value engineering.


A good engineer provides value to society. If they fulfill requirements that are bad for others then they are not good engineers.

It seems to me that there is also a social dynamic to things. If consumer-grade products become a race to the bottom, then it is going to become more difficult for regular people to purchase products which aren't low quality. There's also a degree to which society (e.g. in the form of government policy, cost of living adjustments, etc.) factors in differences in prices.


The fact that poor people can now afford to own some household appliances isn't a huge value to society?

It completely changed the way our societies operate. I think it is a good thing that people have the option to buy crappy washing machines, rather than being forced to use the washboard and bucket my grandmother used. Yeah, they sometimes do develop a bad belt, or the timer mechanism might fail. But it beats being unwillingly forced into homemaking as a career.

The world only has so much wealth to go around, and that isn't the moral quandary of the engineer picking an item on a BOM on Tuesday morning to fix. If anything, squeezing a few more pennies out of that BOM is going to lift some people at the fringes out of poverty. At the opposite end of the product value equation, every unused and functional component in every product that is no longer in service, is wealth that is wasted that could have been spent elsewhere.


Engineers are to consider public safety first. This is not negotiable for real hardware engineering. Poor people could always purchase used appliances.

I agree that products shouldn't be unsafe. And value engineering does not mean making products unsafe.

> Poor people could always purchase used appliances.

The reality in mid 20th century US demonstrates this isn't the case. Most went without the modern appliances that are commonplace today.


> Intentional planned consumption/obsolescence

No it isn't. It is simply optimization of price and the features/form-factor that many buyers have demanded.

If anything, the lifespan of a ~$1.50 household LED bulb is quite incredible. I'm not sure exactly how anyone would be able to increase the lifespan at that price point and keep the traditional Edison form factor.

> Amazon should be required to test all [..] products on its site such that they can prove safety and standards conformance.

No, the manufacturers should be required to... the same way it works for literally every other product with safety regulations.


> If anything, the lifespan of a ~$1.50 household LED bulb is quite incredible. I'm not sure exactly how anyone would be able to increase the lifespan at that price point and keep the traditional Edison form factor.

I don't think I've had any last more than 5 years.

If you bought a cutting edge LED bulb back in 2002 or so, those had a life expectancy of over 60 years, and the build quality was such that you could reasonably expect to get that.

There are plenty of teardowns on YT showing how poorly even major brand name LED bulbs are put together.


Yeah I would hope those bulbs were built pretty well, they were crazy expensive... expensive enough that they wouldn't be competitive in lifetime-per-dollar against today's crappiest bulbs even if they lasted a person's entire lifetime.

> I don't think I've had any last more than 5 years.

Do you shut them off every 3 hours? That's probably what the estimate on the box is based on. Run the same bulb half the day and you'll only get 2.5 years out of it.
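
The arithmetic behind that, as a quick sketch (the 11,000-hour rating is a hypothetical figure back-calculated from a "10 years at 3 hours/day" style box claim, not any specific product's spec):

    # Rated bulb life vs. actual daily usage.
    rated_hours = 11_000  # hypothetical rating implied by "~10 years at 3 h/day"

    for hours_per_day in (3, 12):
        years = rated_hours / (hours_per_day * 365)
        print(f"{hours_per_day} h/day -> about {years:.1f} years")

    # 3 h/day -> ~10 years; 12 h/day -> ~2.5 years, as above.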

> There are plenty of teardowns on YT showing how poorly even major brand name LED bulbs are put together.

I've seen them. And dissected my own. Still, at the price that modern LED bulbs are being made, I'm surprised they're built as well as they are. Brand-name Sylvania bulbs are $0.79/ea in a bulk pack on Amazon right now.


The problem is that the manufacturers lie and say that the LED bulbs will last for many years when they don't.

The claim they put on the box is typically true, but based on some damn modest usage. (e.g. 3 hours per day in ideal environmental conditions) And of course, a mean-time-to-failure figure to someone with one bulb built with minimal QA is just a dice-roll when faced with the bathtub curve of product failures.

That, and customers insisting on preexisting form factors. Fitting the electronics and LEDs into the space of a traditional lightbulb comes with compromises, such as not having proper heat dissipation on either.

Yeah, you would think they would be two separate devices by now...

Please think for a moment not only about whether it's feasible for AMZN to run a safety testing program for all possible consumer products of our modern technological civilization, but whether you really want them to be in charge of it. Maybe they should just require certifications of testing in the jurisdictions where those products are sold?

Isn't faking certs already a problem?

Probably. Is it a worse problem than Amazon inspecting themselves would be? Is it a worse problem than Amazon demonstrably already has with policing counterfeits? I'm just saying, you could hardly ask for a less-qualified authority for product testing. At least with independent certs it's vaguely possible to align the incentives correctly. With Amazon the incentives would be hosed from the start.

Here's a question for EE nerds that happen to be reading this (maybe in the future).

What if I have a stash of big electrolytics that have been out of service for 10+ years? I know that I need to reform them over a few days, but can they even run at spec after so long out of operation?

We're talking BIG stuff, 400v, 200+J each


> Any much older than that need to be checked out-of-circuit for ESR and then capacitance

And that's a very time consuming and somewhat risky operation on an old machine you want to keep running. Some old PCBs are quite fragile.

I wish there was a way to test capacitors without removing them.


You can test ESR in-circuit, with caveats. Here's a good thread from EEVblog [1].

[1] https://www.eevblog.com/forum/beginners/is-there-any-way-to-...


Bipolar electrolytic capacitors are a thing, I recently had to solder up a handful of them in some audio circuits.

Once you have experienced blowing up a reversed elcap you will never forget its orientation. I never understood though what makes it leak current and hence heat up.

Modern electrolytic caps don't burn like they used to.

The last few times I made a mistake, there wasn't even an explosion, let alone a short circuit. The thing slowly boiled and bubbled or unfolded.

Anyway, it blows up because the capacitor's insulation layer isn't some stable material, it's a tiny oxide layer built over the metal plate by anodization. If you put a high voltage on it with the wrong polarity, you reverse that anodization and short the liquid and the metal electrodes.


There's an aluminum oxide layer as a coating on both the anode and cathode inside the (electrolytic) capacitor. Under forward voltage it will gradually thicken but under reverse voltage it dissolves and causes a short. This increases the temperature which causes hydrogen ions to separate and bubble through the material, increasing pressure within the capacitor package until it bursts.

There are polarised and unpolarised capacitors. Stuff like basic decoupling capacitors tend to be unpolarised.

I actually have an LC III in storage, so I might actually be able to make use of this article.

I think this will allow me to classify today as productive.


Yeah, I have a Performa 450, which I believe is the exact same computer sold under a different name. So this is definitely important to know. I can go back to bed now, my job for today is done.

At least you made my Wednesday ;-)

[flagged]


It's the sort of mistake that one could make in a PCB design today.

Apparently people can't read.

I don't know which part of "to me" is not clear.

I don't design PCB boards (thus the "useless" word), and the comment was apparently a lighthearted joke in response to another lighthearted joke.


Commodore had 3 capacitors mounted backwards on the A3640, the CPU board of the Amiga 4000 with 68040 processors: https://youtu.be/zhUpcBpJUzg?si=j6UFmIJzoC-UDS6u&t=945

Also mentioned here: https://amiga.resource.cx/exp/a3640


ZX Spectrum +2 shipped with transistors backwards: https://www.bitwrangler.uk/2022/07/23/zx-spectrum-2-video-fi... This even caused visible artifacts on the display, which was apparently not enough for the problem to be noticed at the factory.

I think Clive Sinclair was notorious for wanting products to be brought to market quickly, with pretty aggressive feature sets. They very well may have noticed it at the factory, but didn't want to do a fix because it was technically functional.

Commodore just kept doing this. Just listing shoddy craftsmanship would take forever, and then we get to intentional bad decisions, like giving the A1200 a power supply that's both defective (capacitors ofc) and barely enough to support the basic configuration with no expansions, which is extra funny because PSUs used with weaker models (A500) had greater output...

This was the hardware patch I had to install to use a CyberstormPPC: https://powerup.amigaworld.de/index.php?lang=en&page=29

The number of used a500 power supplies I sold to customers when I upgraded their a1200 with a GVP 030 board + RAM...

Classic Commodore Quality :P

They also had backwards caps on the CD32 and A4000


There are so many cases of this sort of stuff it's unreal. But it gets even stupider.

I found one a few years back when I repaired a linear power supply. This required me to reverse engineer it first because there was no service manual. I buzzed the whole thing out and found out that one of the electrolytic capacitors had both legs connected to ground. They must have shipped thousands of power supplies with that error in it and no one even noticed.


That seems like one of the least harmful mistakes you could make. Capacitors are sprinkled all over boards in excess because too much capacitance is probably better than not enough.

I have a 3D printer where presumably a smoothing cap just fell off the X axis controller section of the mainboard. Didn't make a lick of difference in anything operationally. Still works great.

Checks out, most boards are made with very conservative amounts of decoupling capacitance because it’s way easier than dealing with random failures due to not enough capacitance

I've understood that capacitors can be used for timing, or smoothing a voltage after a power regulator (I think).

What does adding capacitance help with, and how?


Voltage spikes from line inductance, voltage drop-outs from line resistance. Basically you have little reservoirs of charge scattered all around the board (current flow isn't instantaneous in a real circuit).

It helps to always think of current draw as a complete loop: out the "top" of the capacitor, through your IC, and back into the ground side (this isn't necessarily what's happening physically). A shorter loop means less inductance; shorter traces, less resistance.
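
A minimal sketch of the "reservoir" sizing idea, with made-up illustrative numbers (not from any real design): if the IC suddenly draws an extra 1 A for 100 ns and you only allow 50 mV of droop on the local rail, charge balance (C >= I*dt/dV) tells you how much nearby capacitance you need.

    # Bulk decoupling from charge balance: C >= I * dt / dV (illustrative values only).
    transient_current_a = 1.0    # extra current the IC suddenly draws
    transient_time_s = 100e-9    # time before the upstream supply can respond
    allowed_droop_v = 0.05       # allowed sag on the local rail

    required_capacitance_f = transient_current_a * transient_time_s / allowed_droop_v
    print(f"need at least {required_capacitance_f * 1e6:.1f} uF nearby")  # ~2.0 uF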


Smoothing is part of the story: but the important question is what is causing the roughness? Switch mode power supplies have inherent output ripple that can be filtered, but that’s distinct from transient variations in the load. Decoupling capacitors are used to provide a low impedance path at high frequencies i.e. fighting inductance.

It could be there to control emissions. You’d need to analyze the circuit to determine its purpose.

Very possible! I actually have a 100 MHz scope and SDRs that tune from 9 kHz to 2 GHz; could be an interesting distraction on the weekend to see if that axis is any noisier than the others.

Way back when, a co-worker was powering up a fire alarm control panel. Poof, a capacitor popped and damaged his eye.

Name and shame!

Voltcraft. Can't remember the model number.

In the mid 80's I was the head of the CS student chapter. We ran the computer rooms for the science faculty. We had a room with about 20 Mac 128k. I do not know where Apple sourced their capacitors from, but these were not A-tier. A Mac going up in a puff of white smoke was a weekly occurrence. We had a few in reserve just to cycle them in while they were out to Apple for repair.

P.S. still my favorite Mac of all time was the IIcx. That one coupled with the 'full page display' was a dream.


On the other side, we had an intern at our (very small) company and he used his own Mac. One time he had to debug a mains-powered device. He decided that he would try connecting it to both mains AND the programming dongle without an isolating transformer. He fried the dongle (it literally exploded; the plastic lid banging on the desk in a suddenly silent office is the most memorable thing), the company-provided monitor, and the device, but somehow his private Mac mini survived all this while being in the middle.

That sounds fishy; even if the device being debugged directly interfaced with mains, the Mac doesn't. And even if it did, how high would the probability be that both machines were on different circuits with phases so far out of sync that it would matter?

Unless I misunderstood your story


That device was a cheap wifi power plug with a cheap non-isolated power supply; it was never intended to have user-accessible electrical parts sticking out, so no need for isolation. In such cases the device shares a common ground with the AC line. I don't know all the specifics, but NEVER connect any single terminal of a 220V plug to your computer ground (USB ground in this case). When it's properly grounded, most devices will survive this. But somehow the monitor connected to that Mac didn't survive it. And several milliseconds of full 220V before the circuit breaker reacted made the very thin traces in the debugger pretty much vaporise and explode.

If I remember correctly, a lot of cheap electronics have power supplies that AC-couple the low-voltage side to the mains side. There's no physical wire, just a capacitor. You can often feel the AC when touching the 'safe' side of the adaptor.

Forget “cheap”. As far as I can tell, many modern ungrounded power supplies, including Apple’s, have enough A/C coupling from the line to the output that you can feel a bit of tingling when you touch a metallic object connected to the output.

How is this even allowed? My TV had it. My MacBooks since time immemorial have had it. They all feel "spicy".

The Y capacitor is needed to allow the EMI to have a way to ground from the output rather than going out and getting radiated by the output lines.

I don't believe for a second that this is actually necessary in a way that results in that spicy feeling. I do believe that it's far cheaper to use a Y capacitor than to come up with a better filter network that works well, though.

Common mode noise filtering is either going to be purely inductive or need a Y-cap. No other way around it.

One can build lots of things out of inductors and capacitors. I bet it's possible and even fairly straightforward to build a little network to allow high frequencies to pass from the output to the two line inputs with low impedance but that has extremely high impedance at 50 and 60 Hz (and maybe even at the first few harmonics). It would add components, cost, and volume.

I bet this could be done at the output side, too. And a company like Apple that values the customer experience could try to build a filter on their laptop DC inputs to reduce touch currents experienced by the user when connected to a leaky power supply. Of course, the modern design where the charging port is part of a metallic case might make this rather challenging…

(Seriously, IMO all the recent MacBook Air case designs are obnoxious. They have the touch current issue and they’re nasty feeling and sharp-edged.)


The capacitor has to see the common mode voltage. Where do you put the other end?

Off the top of my head? Make a little gadget that’s an inductor and capacitor, in parallel, tuned to 60 Hz (i.e. a band-stop filter) and, in series with that, a Y capacitor. Wire up this gadget in place of the Y capacitor, so you end up with two of them (line to output negative and other line to output negative, perhaps). Or maybe you just have one, and you connect it between the normal pair of Y caps and the output. It will have very high impedance at 60Hz, enough impedance from DC to a few kHz to avoid conducting problematic amounts of current at DC or various harmonics, and low enough impedance at high frequencies to help with EMI. It might need a couple types of capacitor in parallel in the band-stop part to avoid having the high-frequency impedance of the presumably large-ish capacitor in parallel with the inductor being a problem, and it might be an interesting project to tune it well enough to really remove the annoying touch current, especially if you believe in 50Hz and 60Hz operation. Maybe a higher order design would work better, but the size would start to get silly.
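
A quick back-of-the-envelope on that parallel-LC-trap idea, with purely illustrative component values (not a vetted design): for resonance you need f = 1/(2*pi*sqrt(L*C)), and at 50/60 Hz the required inductor gets large fast, which is part of why the size would get silly.

    import math

    # Inductance needed for a parallel LC trap resonant at mains frequency.
    def inductance_for_resonance(freq_hz, cap_f):
        return 1.0 / ((2 * math.pi * freq_hz) ** 2 * cap_f)

    for cap_f in (2.2e-9, 100e-9, 1e-6):  # Y-cap scale, 100 nF, 1 uF (illustrative)
        l_h = inductance_for_resonance(60, cap_f)
        print(f"C = {cap_f * 1e9:.1f} nF -> L ~ {l_h:.0f} H")

    # Even with a 1 uF resonating capacitor you need ~7 H at 60 Hz.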


My Fold 5 has that feeling along the hinge when charging too, no matter the charger I use. I guess it's considered safe, but it's weird.

Totally believable if the debugging device was doing something with a serial port. I once hacked something together to interface a PC serial port to a Raspberry Pi. The PC serial is real-ish RS-232, with negative voltages. The Pi side was just 0/3.3V positive. I had a nice 18-volt power brick lying around, and just split its output down the middle--what was 0-volt ground was used as -9 volts, the middle voltage was now 0-volt ground, and the 18V line was now +9 V.

At first everything seemed OK, but when I plugged a monitor into the Pi I Was Made To Realize a) the nice 18-volt PS really was high quality, and although it was transformer-isolated, its output ground was tied to the wall socket earth, b) monitors also tie HDMI cable ground to earth, and so c) my lash-up now had dueling grounds that were 9V apart.


With things like the Mac 128k, reliability issues may partly be down to Steve Jobs's dislike of cooling fans.

To be honest, cooling fans never get the attention they deserve and end up whiny or buzzy.

That said, Apple did a really good job with the Mac Pro cooling fans, where the shroud spun with the blades.

I think it did better than the best PC cooling fans like Noctua.


I always built PCs with the largest diameter fans possible - not sure why so many things come with tiny fans. Loads more airflow with less noise and even if they do spin up fast the noise they make is much more pleasant.

I was just thinking of the Apple III the other day.

If I remember, Jobs had them not include a cooling fan. As it heated up and cooled down, the chips on the motherboard would work their way out of their sockets. So one of the official solutions to try if you were having issues was to drop it a couple of inches to try and get the chips to re-seat.

Crazy.


Sounds like the person who designed the board followed a very simple and wise rule: always connect the negative side to the ground. Can't go wrong with that...

until you have to deal with negative voltage (-5V). Another out of bounds bug.


From around 2011-2015, I sometimes talked to an ex-Navy electrical tech who said he was also an early Apple rework tech in the SF Bay Area. He had no shortage of work fixing manufacturing problems, adding rework improvements, and building custom test equipment until they laid him off, outsourced his job to some random country, and then he was homeless until around 2016.

Anyone else a veteran of the Great Capacitor Plague? Seen more than one fire in the server room due to bad capacitors. "Burning-in" your server became literal.

It's a good thing that these machines don't even need -5 volts. With just the positive voltages provided, RS-422 still works, including LocalTalk.

I think the -5 volts is only there in case an expansion card needs it.


Apple should be required to do a recall for these motherboards.

If they do a recall, it will say they should be discarded. Sony has a recall on all its Trinitron TVs made before the end of 1990 like this:

https://www.sony.jp/products/overseas/contents/support/infor...


This shouldn't be allowed at all: if the product was bad all along, they should be required to fix it, and shouldn't be able to say "well, it's old, so you should just trash it", which means they don't suffer any penalty whatsoever.

I don't think that's a reasonable expectation in general, and certainly not in this case. The affected TVs were all at least 20 years old - that's well beyond the expected useful lifespan of even a modern TV, let alone an older model like these. Nor is it clear what Sony could reasonably have done to repair them; even by 2010, a lot of the parts used in CRT TVs were out of production and unavailable.

Maybe you're too young to remember, but people used to keep TVs for much longer periods before HDTV and flat panels came out.

Also, these TVs are apparently fire hazards. It doesn't matter that they're 20 years old (at the point of the "recall" in 2010).

I doubt the parts necessary to fix them were out of production; you can get parts for truly ancient electronics still. Things like capacitors don't become obsolete. The recall doesn't specify exactly which component is problematic, but says it's age-related, which usually points to capacitors.


This. I’ve known a TV that was in more or less daily use for over 30 years. Not sure why we stopped expecting that from electronics.

>Not sure why we stopped expecting that from electronics.

For TVs specifically, the technology changed a lot. For a long time, everyone was stuck on the NTSC standard, which didn't change much. At first, everyone had B&W TVs, so once you had one, there was no reason to change. Then color TV came out, so suddenly people wanted those. After that, again no reason to change for a long time. Later, they got remote controls, so sometimes people would want one of those, or maybe a bigger screen, but generally a working color TV was good enough. Because TVs were glass CRTs, bigger screens cost a lot more than smaller ones, and there wasn't much change in cost here for a long time.

Then HDTV came out and now people wanted those, first in 720p, and later in 1080i/p. And flat screens came too, so people wanted those too. So in a relatively short amount of time, people went from old-style NTSC CRTs to seeing rapid improvements in resolution (480p->720p->1080->4k), screen size (going from ~20" to 3x", 4x", 5x", 6x", now up to 85"), and also display/color quality (LCD, plasma, QLED, OLED), so there were valid reasons to upgrade. The media quality (I hate the word "content") changed too, with programs being shot in HD, and lately 4k/HDR, so the difference was quite noticeable to viewers.

Before long, the improvements are going to slow or stop. They already have 8k screens, but no one buys them because there's no media for them and they can't really see the difference from 4k. Even 1080p media looks great on a 4k screen with upscaling, and not that much different from 4k. The human eye is only capable of so much, so we're seeing diminishing returns.
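
As a rough illustration of that limit (assuming the usual ~1 arcminute of visual acuity and a hypothetical 65-inch 16:9 panel), the distance at which individual pixels are even resolvable shrinks quickly as resolution goes up:

    import math

    # Distance below which single pixels become resolvable, assuming ~1 arcminute
    # of acuity and a 65" 16:9 screen (illustrative numbers, not a vision model).
    diagonal_m = 65 * 0.0254
    width_m = diagonal_m * 16 / math.hypot(16, 9)
    one_arcminute_rad = math.radians(1 / 60)

    for name, horizontal_pixels in [("1080p", 1920), ("4K", 3840), ("8K", 7680)]:
        pixel_pitch_m = width_m / horizontal_pixels
        distance_m = pixel_pitch_m / one_arcminute_rad
        print(f"{name}: pixels resolvable only closer than ~{distance_m:.1f} m")

    # Roughly 2.6 m / 1.3 m / 0.6 m -- for 8K to matter you have to sit very close.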

So I predict that this rapid upgrade cycle might be slowing, and probably stopping before long with the coming economic crash and Great Depression of 2025. The main driver of new TV sales will be people's old TVs dying from component failure.


Great points. The TV I have today is approaching my platonic ideal screen. It’s as big as it can get without having to continually look around to see the whole screen. Sit in the first row of a movie theater to understand how that can be a bad thing. The pixels are smaller than I can see, it has great dynamic range, and the colors can be as saturated as I’d ever want. There’s not much that can be improved on it as a traditional flatscreen video monitor.

> The human eye is only capable of so much, so we're seeing diminishing returns.

Or not seeing diminishing returns. Which is the point.


> At first, everyone had B&W TVs, so once you had one, there was no reason to change

Televisions improved over time:

- screens got flatter

- screens got larger

- image quality improved

- image contrast increased (people used to close their curtains to watch tv)

- televisions got preset channels


My experience of ancient CRT devices is that the display gets gradually dimmer. I once had a TV that was only really usable after dark -- but that's the only time I wanted to use it anyway -- and a huge Sun monitor that was only just about readable in total darkness, but we kept it because we also had a Sun server that we didn't know how to connect to any other monitor and we were worried that one day we wouldn't be able to SSH to it, but in fact the server never once failed.

> daily use for over 30 years

However that doesn't imply TVs were that reliable.

Before the 90s TV repairman was a regular job, and TVs often needed occasional expensive servicing. I remember a local TV repair place in the 90s which serviced "old" TVs.


> Not sure why we stopped expecting that from electronics.

Last years model only does 4k, my eyes need 8k


32K ought to be enough for anybody.

32K is going to look so lifeless and dull after you try 64k.

When will the pixels start to approach erythrocyte-level density like on the Vision Pro?

edit: Anywhere between 208K to 277K.


Because electronics got so much better so much faster, that the vast majority of customers did not want to use old hardware.

Especially if customers allowing shorter lifetimes allowed companies to lower the prices.


There are many use cases for which a decade-old computer is still perfectly serviceable and even where they aren't, those computers can be repurposed for the ones that are.

Moreover, we're talking about televisions and old Macs. TVs with higher resolutions might come out, but lower resolution ones continue to be sold new (implying demand exists at some price), and then why should anybody want to replace a functioning old TV with a newer one of the same resolution?

Much older computers continue to be used because they run software that newer computers can't without emulation (which often introduces bugs) or have older physical interfaces compatible with other and often extremely expensive older hardware.

If people actually wanted to replace their hardware instead of fixing it then they'd not be complaining about the inability to fix it.


>There are many use cases for which a decade-old computer is still perfectly serviceable and even where they aren't, those computers can be repurposed for the ones that are.

It depends. Older computers usually guzzle power, especially if you look at the absolutely awful Pentium4 systems. You're probably better off getting a RasPi or something, depending on what exactly you're trying to do. Newer systems have gotten much better with energy efficiency, so they'll pay for themselves quickly through lower electricity bills.
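
To put rough numbers on the electricity argument (idle power figures and the $0.15/kWh rate are assumptions for illustration, not measurements):

    # Yearly electricity cost for an always-on machine at a given idle power.
    def yearly_cost_usd(idle_watts, usd_per_kwh=0.15, hours_per_day=24):
        return idle_watts / 1000 * hours_per_day * 365 * usd_per_kwh

    old_desktop = yearly_cost_usd(100)  # e.g. an old desktop idling around 100 W
    small_board = yearly_cost_usd(5)    # e.g. a Raspberry Pi-class board
    print(f"old desktop ~${old_desktop:.0f}/yr, small board ~${small_board:.0f}/yr")
    # Roughly $131/yr vs. $7/yr at these assumptions -- but only if it really runs 24/7.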

>TVs with higher resolutions might come out, but lower resolution ones continue to be sold new (implying demand exists at some price)

We're already seeing a limit here. 8k TVs are here now, but not very popular. There's almost no media in that resolution, and people can't tell the difference from 4k.

For a while, this wasn't the case: people were upgrading from 480 to 720 to 1080 and now to 4k.

>and then why should anybody want to replace a functioning old TV with a newer one of the same resolution?

They probably don't; if they're upgrading, they're getting a higher resolution (lots of 1080 screens still out there), or they're getting a bigger screen. It's possible they might want newer smart TV features too: older sets probably have support dropped and don't support the latest streaming services, though usually you can just get an add-on device that plugs into the HDMI port so this is probably less of a factor.


> Older computers usually guzzle power, especially if you look at the absolutely awful Pentium4 systems.

https://en.wikipedia.org/wiki/List_of_Intel_Pentium_4_proces...

The Northwood chips were 50 to 70 W. HT chips and later Prescott chips were more 80 to 90 W. Even the highest chips I see on the page are only 115 W.

But modern chips can use way more power than Pentium 4 chips:

https://en.wikipedia.org/wiki/Raptor_Lake

The i5-14600K has a base TDP of 125 W and turbo TDP of 181 W, and the high-end i9-14900KS is 150 W base/253 W turbo. For example, when encoding video, the mid-range 14600K pulls 146 W: https://www.tomshardware.com/news/intel-core-i9-14900k-cpu-r...

More recent processors can do more with the same power than older processors, but I think for the most part that doesn't matter. Most people don't keep their processor at 100% usage a lot anyway.


As I said in a sister comment here, you can't compare CPUs by TDP. No one runs their CPU flat-out all the time on a PC. Idle power is the important metric.

> Older computers usually guzzle power, especially if you look at the absolutely awful Pentium4 systems.

Even many Pentium 4-based systems would idle around 30 watts and peak at a little over 100, which is on par with a lot of modern desktops, and there were lower and higher power systems both then and now. The top end Pentium 4 had a TDP of 115W vs. 170W for the current top end Ryzen 9000 and even worse for current Intel. Midrange then and now was ~65W. Also, the Pentium 4 is twenty two years old.

And the Pentium 4 in particular was an atypically inefficient CPU. The contemporaneous Pentium M was so much better that Intel soon after dumped the P4 in favor of a desktop CPU based on that (Core 2 Duo).

Moreover, you're not going to be worried about electric bills for older phones or tablets with <5W CPUs, so why do those go out of support so fast? Plenty of people whose most demanding mobile workload is GPS navigation, which has been available since before the turn of the century and widely available for nearly two decades.

> For a while, this wasn't the case: people were upgrading from 480 to 720 to 1080 and now to 4k.

Some people. Plenty of others who don't even care about 4k, and then why would they want to needlessly replace their existing TV?

> They probably don't; if they're upgrading, they're getting a higher resolution (lots of 1080 screens still out there), or they're getting a bigger screen.

That's the point. 1080p TVs and even some 720p TVs are still sold new, so anyone buying one isn't upgrading and has no real reason to want to replace their existing device unless it e.g. has a design flaw that causes it to catch fire. In which case they should do a proper recall.


>The top end Pentium 4 had a TDP of 115W vs. 170W for the current top end Ryzen 9000 and even worse for current Intel.

You can't compare CPUs based on TDP; it's an almost entirely useless measurement. The only thing it's good for is making sure you have a sufficient heatsink and cooling system, because it tells you only the peak power consumption of the chip. No one runs their CPUs flat-out all the time unless it's some kind of data center or something; we're talking about PCs here.

What's important is idle CPU power consumption, and that's significantly better these days.

>older phones or tablets with <5W CPUs, so why do those go out of support so fast?

That's an entirely different situation because of the closed and vendor-controlled nature of those systems. They're not PCs; they're basically appliances. It's a shitty situation, but there's not much people can do about it, though many have tried (CyanogenMod, GrapheneOS, etc.).

>Plenty of others who don't even care about 4k

Not everyone cares about 4k, it's true (personally I like it but it's not that much better than 1080p). But if you can't tell the difference between 1080p and an NTSC TV, you're blind.

>1080p TVs and even some 720p TVs are still sold new

Yes, as I said before, we're seeing diminishing returns. (Or should I say "diminishing discernable improvements"?)

Also, the 720p stuff is only in very small (relatively) screens. You're not going to find a 75" TV with 720p or even 1080p; those are all 4k. The low-res stuff is relegated to very small budget models where it's really pointless to have such high resolution.


For most videos, the difference between 1080p and 4k ain't that large.

But for certain video games on a large screen, I can definitely tell the difference between 1080p and 4k. Especially strategy games that present a lot of information.

Btw, as far as I can tell modern screens use significantly less power, especially per unit of area, than the CRTs of old; even if that CRT is still perfectly functional.


A decade old CPU would be a Haswell, not a Pentium 4.

Suppose they would recall all the old tv's with known faults, can those be fixed to become conform to (today's) quality and safety standards, while being full of old components with characteristics beyond original tolerances?

> that's well beyond the expected useful lifespan of even a modern TV, let alone an older model like these

A modern TV may have an expected lifespan of five years. TVs from several decades ago had lifespans of... several decades. Quality has plummeted in that market.


5 years? Is that really true? I’m currently using an LG from 2017 and cannot imagine needing to change it. I would be shocked if it stopped working.

I don't think it is true at all.

There's nothing inside today's monitors or TVs that can't run for at least 10 years. Our main TV, 42" 720p LCD, is from 2008, and I have monitors that are just as old.


Yep. My TV, a 42" Panasonic plasma, dates from 2009 and is still working perfectly. I haven't replaced it, because why would I?

I have an LG OLED from 2017. It started getting really bad screen burn/pixel degradation just after the 6-year mark (6-year warranty). I did a quick search on Youtube and, lo and behold, a whole bunch of other people with the same model started having the same screen burn-in issues at the same age!

It covers the middle third of the screen, top to bottom, and the entire bottom 1/4 of the screen with some odd spots as well, it's really distracting and essentially makes the TV useless (to me).


OLED screens are known for having burn-in problems like this. LCDs don't, though they probably have issues with backlights becoming dim with age.

I have an LG of about that vintage and it's starting to black out when doing 4K content. All components before it have been switched out and are up to date on firmware. Restarting works, sometimes all day, sometimes 1 minute.

My other TV about the same vintage is starting to have stuck pixels in the corner.

Modern failure modes aren’t nearly as graceful.


But when it does, it will probably be the capacitors in the power supply that have dried out.

Is that really the case? Because if so, it seems like simply replacing the capacitors would save a lot of waste and unnecessary purchases of new TVs...

This is a very common fault, yes. Power supply issues in general. It is also not uncommon for people to replace e.g. Wifi routers because the wall warts fail.

It comes down to few people knowing a lot about it - and I'm not blaming anyone for that; we all have our interests, and most people have more than enough to do already without worrying about what goes on inside their stuff.

Also, electronics are, to a lot of people in a lot of places, so cheap that they would rather just curse a little and buy a new thing, instead of bothering with taking the thing to a shop. And of course a few hours of skilled labour in a big city in the west might also be almost as expensive as making a whole new TV in a factory in Asia plus shipping, so it might not even make economic sense.


> And of course a few hours of skilled labour in a big city ...

In many/most places, these repair shops don't even exist any more, because the products have gotten too complicated/integrated/parts-unavailable, and the economics are nonsensical.


Electrolytic capacitors are not solid state and are likely the #1 failure mode for most electronics. There are options for better (e.g. Al-polymer) capacitors, but they are rather expensive - overall, good capacitors are 'expensive', e.g. more than a dollar apiece in some cases.

The 2nd most common failure mode has gotta be MLCC (multi-layer ceramic capacitor) cracks/shorts.


How can I even know which capacitor is faulty?

If your model was popular, there's likely a recap kit for its power supply. It usually makes sense to swap all the capacitors in the kit, unless the kit instructions say otherwise.

You can look for physical signs of degradation (bulgy, leaky, discolored), but to really test a capacitor for capacitance, you need to take it out of the circuit, at which point you may as well put a new, high-quality capacitor in.

The OEM capacitors likely have a just-right voltage rating; a new one with a higher voltage rating (and the same capacitance and a compatible type) may last longer in circuit as well.


> new one with a higher voltage rating (and same capacitance, compatible type) may last longer in cirucit as well.

That's not necessarily true, higher voltage rating equals higher ESR which means more heat.


That would require some experience, yet the most common visual clue would be 'bulging'. There are some ways to measure ESR w/o desoldering but they won't be reliable at all times.

Measuring voltages, peak to peak, is a bit more work.


A TV used to cost a few weeks pay and now you can get a TV for the equivalent of a few hours pay. There just isn't much of a market for a $3000+ TV.

Few usually means 3-5 or so, and a half-decent TV would be at least half a grand. That's a rather high hourly pay rate.

Explain to me why this TV for $100 [1] isn't perfectly suitable to replace a 2008 40" 1080p Samsung LCD with a fluorescent backlight that was a deal at $1000. Yeah, you could get something bigger and better. Yes, a price comparison during a sale week is a bit unfair.

[1] https://www.bestbuy.com/site/tcl-40-class-s3-s-class-1080p-f...


FYI: bestbuy is unavailable outside the US (the site I mean), or likely NA.

Only one metric of 'quality' has plummeted.

A rock lasts billions of years, but its quality as a TV is rather questionable.


"that's well beyond the expected useful lifespan of even a modern TV, let alone an older model like these"

People still run these Trinitron TVs to this day.


It is a legitimate business decision, to sell things that last less than 20 years. Fine, I think it is lame, but it is their choice.

But, we shouldn’t let companies get away with selling products that catch fire after working fine for 20 years.


> that's well beyond the expected useful lifespan of even a modern TV

What? That's nuts. Why bother buying a tv if you're immediately going to throw it in the trash


My radial arm saw ended up getting a product recall for simply being too difficult for the average consumer to use safely. The "recall" amounted to them sending you instructions to cut off a critical power cord and mail it in to them, and they send you a $50 check.

That is completely unreasonable. Companies can't be expected to take in and repair devices that old.

For 1993 hardware?

They don't do recalls even on modern hardware. But soldering hacks are no longer possible, all parts are serialized.

Louis Rossmann made many videos on this.


What are you talking about? Capacitor technology hasn't changed substantially in decades, and it's just as possible to change caps with a soldering iron now as it was 20 years ago. I have no idea what you mean by "serialized".

Not capacitors, but more advanced components, like the camera, have serial numbers embedded in them, and the serial number needs to match, otherwise the phone won't accept the component. Components off a stolen device are put on a list and won't work in another phone, so stolen phones aren't even worth anything for parts, driving down the market for stolen phones. It also makes the job of repair shops harder, which is collateral damage in Apple's eyes, but is very much material for anyone running a repair shop.

The only reason this is an issue for repair shops is they can't sell you recycled stolen parts at bottom of market prices for a sky high mark up. On top of that the "non genuine parts", some of which really are utterly dire, show up in the OS as being not genuine parts. Buying genuine parts, which are available from Apple, eats into the margins. There is very little honour in the repair market, despite the makeup applied to it by a couple of prominent youtubers and organisations.

The amount of horror stories I've seen over the years from independent repairers is just terrible. Just last year a friend had a screen hot snotted back on their Galaxy.


> they can't sell you recycled stolen parts at bottom of market prices for a sky high mark up

What represents a more efficient economy. The one where broken phones get reused for parts or the one where you have to throw them away?


The economy that isn't backed with criminal activity and loss for customers.

If you think Apple's part pairing policy has anything to do with consumer benefit, I have a bridge in Arizona to sell you.

> The only reason this is an issue for repair shops is they can't sell you recycled stolen parts at bottom of market prices for a sky high mark up.

This is just incredibly dishonest framing and completely ignoring what the right to repair and third party repair shop issue is all about.

> Buying genuine parts, which are available from Apple,

It is not a margin problem, it is an availability problem. Apple does not allow third party repair shops to stock common parts, such as batteries or displays for popular iPhones. This is only possible when providing the devices serial numbers. This effectively prevents third party repair shops from competing with Apple or Apple authorized service providers because they have artificially inflated lead times.

Becoming Apple authorized isn't an option for actual repair shops because that would effectively disallow them from doing actual repairs when possible, rather than playing Dr. Part Swap. Everything what Apple does in the repair space essentially boils down to them doing everything they can to avoid having competition in the repair space.

> eats into the margins

Replacing a 45ct voltage regulator on a mainboard is cheaper than replacing the entire mainboard with everything soldered on, but doesn't allow for very nice margins.

> There is very little honour in the repair market

There is very little honour in any market. Honour does not get rewarded nowadays, people are in <insert market> to make money, if you're lucky they still take a little pride in their work. If a repair shop offers good service or not should be up to the consumer to determine, not up to Apple (or any electriconics manufacturer that employs the same tactics).

> makeup applied to it by a couple of prominent youtubers and organisations.

That is called marketing, which Apple is also pretty good at. They're also lying when they say they are environmentally conscious while having their Genius Bar employees recommend an entirely new screen assembly on a MacBook just because a backlight cable came loose.

> The amount of horror stories I've seen over the years from independent repairers is just terrible.

The amount of horror stories I have experienced with Apple is no joke either. Apple is always taking the sledgehammer approach with their repairs. I've had the pleasure myself to deal with Apple repairs once for my old 2019 MBP. It wouldn't take a charge anymore, went to the Genius Bar and received a quote for a new mainboard costing well over 1000 EUR. Being familiar with some of the more technical videos of Rossmann etc, I found one electronics repair store that actually does board level stuff and got it fixed for a fraction of the price (iirc it was ~200 EUR).


Even if Apple has room for improvement here, I think it’s still worth it to try to curb the market for stolen parts, because that’s going to exist even if Apple sold spare parts in bulk at-cost simply because there exist unscrupulous repair shops that have no qualms with charging you OEM part prices while using gray market parts that cost a fraction as much on eBay, Aliexpress, etc.

For instance, maybe Apple could supply parts in bulk to repair shops but require registration of those parts prior to usage. The repaired iPhone would function regardless but loudly alert the user that unregistered parts were used to repair it. Gray market parts naturally aren’t going to be able to be registered (either due to serial not existing in their system or having been parted out from stolen devices), and thus the user is given some level of assurance that they’re not paid for questionable repair services.


I see. Yes, that is a big problem for component swapping. I was just thinking of electronics with old/faulty caps; those will still be repairable.

Doesn’t Apple offer a way to re-pair components if they are genuine and not stolen (unregistered from the previous AppleId)?

and Apple will very happily charge you for that privilege

TBH, for such a critical piece of our modern lives, I would be more than fine paying extra to be 100% sure I am getting original parts, put in professionally and in a secure manner re my personal data. I wish e.g. Samsung had such a service where I live.

We anyway talk about expensive premium phones to start with, so relatively expensive after-warranty service is not shocking.

This may actually eventually sway me into apple camp. This and what seems like much better theft discouragement.


I don't. Such mechanisms also disqualify 3rd party replacements. It is just a wasteful solution. Not that any smartphone would qualify as decent here.

But as a customer it will overall be more expensive for you.


There are things in life where the amount paid is far from the top priority, and a phone is one of them these days. With the sums we're talking about, I just don't care anymore, and the Samsung I have now is even more expensive and more wasteful.

Re wastefulness - a decent laptop causes 10x more pollution to manufacture than a phone. A desktop PC, 10x that. TVs. Cars. Clothing. Phones are very much down a very long line of higher-priority targets for an eco-friendly approach.


It is not about stolen phones, it is about monetization of customer services. If stealing phones was legal, job description for procurement/purchase departments would look differently as well.

What’s the liquid in the old capacitors? PCBs? (as in polychlorinated biphenyls… that abbreviation collision always annoyed me.)

I think I know exactly enough about electronics to ask more annoying questions than someone who doesn’t know anything at all.


PCBs were only used in oil capacitors and some transformers. Generally these were used in motor and grid power applications. The only consumer applications are some motor capacitors and fluorescent light ballasts.

Yeah I pretty much grew up (unknowingly) playing in an illegal unmarked chemical waste dump so it takes a lot to get my attention, but I opened up an old fluorescent desk lamp from the 60s I had that fried itself to see if it was fixable — and found a small piece of crumbly asbestos shielding about the size of a business card stuck to a big leaky ballast. Pretty solid toxic waste combo. City hazmat got a sweet vintage lamp that day, sadly.

"Wet" capacitors contain any number of liquid electrolytes. Could be something tame like ethylene glycol, boric acid, sulfuric acid, or nastier stuff like organic solvents (DMF or DMA which are poisonous, or GBL which is less lethal).

Nothing as bad as PCBs as far as I'm aware.


Cool, thanks. I think I should learn how these components actually work. Individually they seem pretty simple.

It is not just Apple that did this; for example, here is an equivalent from Atari: https://www.exxosforum.co.uk/forum/viewtopic.php?f=17&t=1698

I have a Quadra 700 of this vintage that hasn't been powered up in 25+ years. Kind of wanted to fire it up again to experience the glory of A/UX one more time, but sounds like I'd have to replace all the lytics :/

Do it sooner rather than later; the cap juice loves to eat PCB traces. Same with the clock batteries: get those things out of there.

I have an original Mac that no longer turns on. I bet there is a capacitor to replace. This is giving me the energy to go look for it!

There's a very good chance the battery has leaked and caused quite a mess. While capacitors are a problem, the battery is the biggest one.

The author seems to misunderstand PCB design flow. This is neither a "factory component placement issue" nor a silkscreen error. The error is in the schematic.

The layout CAD is often done by a different team that follows the schematic provided by design engineering. Automated workflows are common. The silk screen is predefined in a QA'd library. It is not their job to double check engineering's schematic.

The components are placed per the layout data.

Both those teams did their jobs correctly, to incorrect specifications. In fact, the factory performing assembly often is denied access to the schematic as it is sensitive IP.

If you're going to cast blame on a 30 year old computer, at least direct it at the correct group. It wasn't soldered incorrectly at the factory. They soldered it exactly how they were told to - backwards.


>The layout CAD is often done by a different team that follows the schematic provided by design engineering.

Just as a note, this is a fairly archaic way of working nowadays. At my place schematic design and layout go hand-in-hand, and we rejected a candidate because he didn't do the latter. The main reason is layout is no longer an afterthought, it's a key part of the electrical design of the system, and there's little room for a tedious back and forth between the circuit designer and the person doing the layout about what traces are and aren't important to optimize for various attributes.


Indeed, and this is true in other engineering activities such as mechanical design as well. Possibly with the exception of very large shops, there are no draftsmen any more, and the design engineer also creates the production drawings. And the software lends itself to this. Schematic / layout, and design / drawing, are joined together in the design software. It would be very hard to make a mistake like the one in TFA today.

Even the free software that I use -- KiCad -- would ding me.

We make bigger mistakes instead. ;-)


And yet it is not at all unusual for a production engineer to spot these faults and pass them back to the design engineers for rework.

Also true! Most common when you accidentally screw up a footprint and it doesn't fit the part on the BOM. A backwards part is the kind of thing they're not likely to pick up on (if it's marked on the silkscreen incorrectly, at least), but some do.

Brings back memories…

About 30 years ago I designed my first PCB with frequencies in the GHz range. It was full of challenging transmission line paths with frequencies in the hundreds of MHz and above.

I am still proud of the fact that all of the high speed signals worked as designed, with excellent signal and power integrity (the large FPGA was challenging). Emissions passed as well.

I did, however, screw up one thing: DC

I somehow managed to lay out the DC input connector backwards!

These boards were very expensive ($2K), so an immediate respin was not possible.

I had to design a set of contacts to be able to flip the connector upside-down and make the electrons go in the right way.

The joke from that point forward was that I was great at multi-GHz designs but should not be trusted with DC circuits.


I've found a ground lug in a Kilowatt Grounded Grid amplifier... that didn't ground the grid.

I found a bad solder joint that looked ok, but was intermittent, and had been that way, in a Television built in 1948 and used for decades.

Bad design and assembly goes back forever, as near as I can tell.


The first board I ever designed and had manufactured had a reversed tantalum capacitor on the power rails and exploded somewhat dramatically when powered up. Lesson learned!

Commodore struggled with the same mistake on the negative rail in the audio section, but also somehow on a high-end, expensive CPU board.

https://wiki.console5.com/wiki/Amiga_CD32 C408 C811 "original may be installed backwards! Verify orientation against cap map"

A4000 https://wordpress.hertell.nu/?p=1438 C443 C433 "notice that the 2 capacitors that originally on A4000 have the wrong polarity"

Much worse is the Commodore A3640 68040 CPU board, aimed at the top-of-the-line A3000 and A4000: http://amiga.serveftp.net/A3640_capacitor.html https://forum.amiga.org/index.php?topic=73570.0 (C105, C106, C107 silkscreen wrong; early revisions were built according to the bad silkscreen).


Typical Amiga fanboyism and Apple envy, if a Mac does something they have to prove the Amiga outdid it. “Only one model with a reverse polarity capacitor? With Commodore it was a systematic issue!”

> Typical Amiga fanboyism and Apple envy, if a Mac does something they have to prove the Amiga outdid it.

I think we're envious that Apple did a better job of engineering their systems


Apple should be mandated to issue a recall for these motherboards.

Does the -5V rail do anything other than power old RS-232 ports?

Macs have RS-422 ports, not RS-232. But, no.

what Apple era are those machines from? is this before or after Jobs shafted the engineering department on the sale and Woz had to give them bonuses to keep them at the factory?

Didn’t this also happen on some Asus motherboards a couple years ago?

That one was the Asus ROG Maximus Z690 Hero, about two years ago.

Sorry to hijack the thread; I couldn't directly reply to https://news.ycombinator.com/item?id=42092845.

The reason not to (just) use optical flow is that it isn't absolute. If you pattern your surface correctly, you can ensure that every few-by-few-pixel region on a QR-code-like bitmap surface is unique, and thus can be decoded into an absolute position. Basically a 2D absolute optical encoder fast enough to be part of a motor control loop.
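As a rough sketch of that idea (not the poster's actual system; the pattern size, window size, and the use of a random pattern are all assumptions for illustration), you can think of it as a lookup from a small unique window of the surface to an absolute coordinate:

    import numpy as np

    # Sketch of a 2D absolute optical encoder: a surface pattern in which
    # every K x K window is unique, so a camera that sees only K x K cells
    # can recover its absolute position. A random pattern is used here for
    # simplicity; a real design would construct the pattern (e.g. a de
    # Bruijn torus) to guarantee uniqueness.
    K = 6            # window size in cells
    W, H = 64, 64    # total pattern size in cells

    rng = np.random.default_rng(0)
    pattern = rng.integers(0, 2, size=(H, W), dtype=np.uint8)

    # Build the window -> position table, verifying every window is unique.
    lookup = {}
    for y in range(H - K + 1):
        for x in range(W - K + 1):
            key = pattern[y:y + K, x:x + K].tobytes()
            assert key not in lookup, "collision: use a constructed pattern"
            lookup[key] = (x, y)

    def decode(window):
        # Absolute (x, y) of the window's top-left corner, from its pixels alone.
        return lookup[window.tobytes()]

    # The sensor images a patch of the surface and immediately knows where it is:
    print(decode(pattern[30:30 + K, 17:17 + K]))   # -> (17, 30)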


I wonder if there were any bootleg boards that copied the silkscreen mistake, but didn't use those 16V capacitors, and ended up catching fire.

Why include that capacitor at all if it doesn't matter whether it works?

If you look at the traces you can see the capacitor is right next to the power connector, on the -5V rail (which is not used for much, only for the RS422 serial port). The capacitor will be there to smooth the power supply when the machine is just switched on, or there's a sudden load which causes the voltage to "dip" above -5V. Basically it's like a tiny rechargable battery which sits fully charged most of the time, but can supplement the power on demand.

So you can see why it probably didn't matter that this capacitor didn't work: It's only needed for rare occasions. RS-422 is a differential form of RS-232 (https://en.wikipedia.org/wiki/RS-422) so being differential it's fairly robust against changes in load if they affect both wires. And the worst that can happen is you lose a few characters from your external modem.
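As a rough back-of-the-envelope illustration of that "tiny rechargeable battery" role (the values below are assumptions, not the LC III's actual parts): for a capacitor carrying a constant current step on its own, the rail sags by dV = I*dt/C.

    # Back-of-the-envelope droop estimate for a rail smoothing cap.
    # Assumed values for illustration only, not the actual LC III parts.
    C = 47e-6     # farads: a typical small electrolytic
    I = 0.010     # amps: hypothetical transient load step on the -5V rail
    t = 1e-3      # seconds: how long the cap must carry the step alone

    delta_v = I * t / C   # dV = I*dt/C for a constant-current discharge
    print(f"Rail sag: {delta_v:.2f} V")   # ~0.21 V before the supply catches up

So a modest cap can soak up brief load steps that the supply's regulation loop is too slow to handle, which is exactly the "rare occasions" duty described above.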

In addition, electrolytics can probably work when reversed like this, at least a little bit. It's not exactly optimal and they might catch fire(!).


Also known as the Madman Muntz theory of Engineering :-)

https://en.wikipedia.org/wiki/Muntzing


I never knew there was a name for this :)

When I was a demo coder my artist friend would just haphazardly go through all my assembler code and snip random lines out until it stopped working to improve performance.


I have my childhood LC II in storage

I wonder if it has the same defect


If anything you should open it up to check for any leaking batteries/capacitors.

I spent my mid-childhood on an LC III. One summer my friend brought his Performa over and we tried to play 1v1 Warcraft 2 over the serial port. LocalTalk or something like that?

But it just never quite worked right. I remember how frustrated and confused my older brother was. The computers would sometimes see each other but would drop off so easily.

Was this that?!


not the Flux Capacitor?!?!

They were probably expecting these to fail a few months after the warranty expired.


