Ah, nice. Thank you for bringing your expertise to my nonsense.
Someone elsewhere in the thread suggested it might be a marketing thing on Nvidia’s part, and that makes a lot of sense.
I think it’s doing some non-trivial amount of rendering, since it’s often syncing graphics with music played live.
Wikipedia says it’s 16,000x16,000 (which is way less than I thought). Treating “4K” loosely as 4,000 pixels across, that’s 4x the resolution in each dimension, so 16x the pixels, and 16 GPUs would make sense. And there’s a screen inside and one outside, so double that. But I also can’t figure out why it needs five times that. Redundancy? Poor optimization? I dunno.
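The arithmetic above can be sketched in a few lines; the 4,000×4,000 “square 4K tile” is my simplification, and the 3840×2160 figure is the standard 4K UHD monitor for comparison:

```python
sphere = 16_000 * 16_000      # the panel: 256,000,000 pixels
square_4k = 4_000 * 4_000     # "4K" treated loosely as a 4,000 x 4,000 tile
uhd = 3840 * 2160             # an actual 16:9 4K UHD monitor: 8,294,400

print(sphere // square_4k)    # 16 -> one GPU per "4K tile" of the panel
print(round(sphere / uhd))    # ~31x a real 4K monitor, for comparison
```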
Yeah, sorry, that’s my fault. I don’t know why I made him a year younger in my head.
I have five guesses:
1. That would require more diagnostics than an LED on a monitor can provide at a reasonable cost.
2. If you’re leaving the monitor on in a situation where burn-in is likely, you’re probably not at the monitor when it matters.
3. Monitors are a mission-critical piece of hardware, so having them turn themselves off (or even just turn off certain pixels) at random is not a great idea.
4. It’s probably the OS’s job to decide when to turn off the monitor, since the OS has the context to know what’s important and what isn’t, and how long it’s been since you’ve interacted with the device.
5. It’s in the monitor manufacturer’s best interest for your monitor to get burn-in, so that you have to replace it more often.
The actual answer is probably a combination of multiple things, but that’s my guess.
Honestly, setting a screen timeout (or even a screen saver!) is the solution to this problem. So the problem was more or less solved in the early 80s.
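For what that looks like in practice, here’s one way to set it on an X11 Linux desktop (the timeout values are arbitrary examples; other systems have equivalent settings in their display preferences):

```shell
# Illustrative xset commands (assumes an X11 session with DPMS support):
xset s 300                # blank the screen after 5 minutes of idle time
xset dpms 600 900 1200    # standby / suspend / off after 10 / 15 / 20 min
```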
He’s less valuable to their bottom line now.
Interesting. I knew they were semiconductors, but I didn’t know they were also semimetals. Thanks for the details!
Weird. I’m seeing the C&H for today’s date in 1987.
I figured, but I didn’t play along very well.
Hey, he said please. That’s a big win. I mean, yeah, make him take the trike back out, but the unprompted “please” from a six-year-old? Maybe Calvin isn’t as much of a brat as I thought before I had kids.
Yes, that’s the name of this community.
Fair point, I don’t know you. The average phone user, then. Most people use their phone about 4½ hours a day.
LEDs and OLEDs work the same way; the only difference is their composition. Standard LEDs use inorganic semiconductors, OLEDs use organic compounds (which, yes, are more sensitive to breakdown over time, but come with the advantage of being smaller, lighter, more flexible, etc.).
And actually, it’s that size and flexibility that make an OLED panel possible. A so-called “LED display” is really just a color LCD with a white LED backlight; you need OLED to have the individual pixels generate their own light. Burn-in on a non-organic LED display would be a completely different thing (and is possible, but rare).
Burn-in isn’t light being emitted when off; it’s light being dimmer when on.
An LED works by passing current between two different semiconductors. When an electron jumps over the “gap” between those two semiconductors, it releases a photon of a particular color (determined by the size of the gap). But over time, as an LED is used, the gap can be damaged (by heat, by vibration, etc.); when this happens, fewer electrons can jump the gap, and thus fewer photons are produced. Or the properties of the gap change so that it emits photons of a slightly different wavelength. So if you leave a particular set of pixels lit for an extended time, those LEDs will degrade more than the rest of the screen, leaving that area discolored or dimmer. This is burn-in.
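The gap-size-to-color relationship can be sketched numerically. The gap energies below are rough textbook ballparks for common LED materials, not measured values:

```python
# Photon wavelength from a band-gap energy E: lambda = h*c / E.
# Using hc ~= 1239.84 eV*nm, wavelength in nm = 1239.84 / E in eV.
HC_EV_NM = 1239.84

def wavelength_nm(gap_ev: float) -> float:
    """Wavelength of the photon emitted when an electron crosses the gap."""
    return HC_EV_NM / gap_ev

# Approximate gap energies; bigger gap -> more energetic, bluer photon.
for name, gap in [("red (~1.9 eV)", 1.9),
                  ("green (~2.3 eV)", 2.3),
                  ("blue (~2.8 eV)", 2.8)]:
    print(f"{name}: {wavelength_nm(gap):.0f} nm")
```

A damaged gap shifts that energy slightly, which is why worn pixels can drift in color as well as brightness.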
Most of the time, that’s fine, because the LEDs on your screen experience wear in a more-or-less uniform pattern. Your phone is somewhat less susceptible to this, since (1) you tend to have your phone screen off most of the time, (2) there aren’t as many persistent HUD elements even when it’s on, as every app has its own configuration of controls and UI elements, and (3) you tend to replace a phone more often than a monitor. When you replace your phone, it’s probably more-or-less evenly dimmer overall than it was when you bought it, but since you don’t have anything to compare it to, you won’t know; with burn-in, though, that comparison is right next to the burned-in pixels.
By contrast, a computer monitor will typically be on for 8+ hours at a time, and persistent display elements are a part of every major operating system. If you’re not using the LEDs in a panel more-or-less evenly, you’ll end up with a persistent image.
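A toy model of that uneven wear; the exponential decay shape and the decay constant here are made-up illustration, not measured OLED behavior:

```python
import math

DECAY_PER_HOUR = 1e-5  # hypothetical fractional brightness loss per lit hour

def brightness_after(on_hours: float) -> float:
    """Relative brightness of a pixel after a given cumulative on-time."""
    return math.exp(-DECAY_PER_HOUR * on_hours)

background = brightness_after(2000)   # pixel showing varied content
taskbar = brightness_after(10000)     # pixel under a persistent HUD element

print(f"background: {background:.3f}, taskbar: {taskbar:.3f}")
```

The whole panel dims a little, but the taskbar pixel sits right next to fresher neighbors, so the difference reads as a ghost image.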
If you’re talking about the visual style, it borrows pretty heavily from SimCity 2000 and RollerCoaster Tycoon 2. You might like those as well.
I think Mythbusters is a little bit of a different case than something more narrative. There are always new myths to bust; every generation needs something that makes science cool. I guess now that’s a role mainly filled by various YouTubers.
“Replicate the circumstances, then replicate the results.”
Well, if anyone was going to be a K-A-M fan, it’d be Grandpa “my forehead transplant donor was Cerean” Simpson.