The Secret Life of XY Monitors (2001) (jmargolin.com)



> It's also a safety issue, since the Cathode voltage might be substantial.

When I was a small kid, my father had a TV set repair side hustle. I was 7-8yo, but I would sometimes be called in to help him a bit. Once he asked me to clean a substantial amount of dust off a bunch of CRT tubes. What he forgot was that he had just been testing one of them, so the charge hadn't been properly dissipated and hadn't had time to bleed off on its own.

I got hit with a spark so powerful that it must have temporarily disrupted my brain, because for a brief moment I was completely stupefied, and then I just resumed cleaning. Then I got hit a second time, then a third. Only after the third time did I realise that my cleaning the tube had something to do with it. The discharge was probably much less powerful by then and didn't stun me completely.

I can tell you the voltage and amount of charge on these things is no joke. Multiple tens of thousands of volts is enough to stun you temporarily. Probably not enough to cause lasting damage unless you are extremely unlucky about where the discharge went (stun guns usually have higher voltage and capacity), but enough for you to lose control of your body, fall and hurt yourself.
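For scale, a back-of-the-envelope sketch in Python. The capacitance and anode voltage below are assumed ballpark figures for a color CRT (the tube's inner and outer aquadag coatings form the capacitor), not measurements:

    # Rough charge and energy stored on a CRT's aquadag "capacitor".
    # C and V are assumed ballpark figures, not measured values.
    C = 2e-9   # farads: capacitance between inner and outer aquadag coatings
    V = 25e3   # volts: typical color CRT anode voltage
    Q = C * V
    E = 0.5 * C * V**2
    print(f"charge: {Q*1e6:.1f} uC, energy: {E:.2f} J")
    # charge: 50.0 uC, energy: 0.62 J: a painful, static-style discharge,
    # consistent with the stories in this thread.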


I got zapped once as a kid. My grandfather's TV, I was pulling the tubes to take them to the testing machine. Fairly small screen, but I knew it probably had some residual charge that I most definitely didn't want to come into contact with. The layout was such that I could pull the tubes while staying far away from it. It never occurred to me that safety would also require keeping him from looking on in curiosity and bumping my arm. (He was impatient to get his TV fixed. Whoever designed the set had had the courtesy to put the tubes in an area where even a slipping hand wouldn't approach the CRT, but that was no protection against my hand being moved in a direction I wasn't applying force.)

I don't think such things are actually capable of direct lasting damage but you do lose control--I slammed my hand into the cabinet pretty hard, completely involuntary. My understanding is that tasers are AC, thus involuntary movements will be back and forth rather than just in one direction.


> My understanding is that tasers are AC, thus involuntary movements will be back and forth rather than just in one direction.

Tasers' AC output is at low frequency, but still significantly faster than human muscles can move. If you watch any video of a tasing, you'll see it causes the victim's muscles to seize up, not oscillate.


I was thrown across a room twatting around in the back of a TV. Many years later I still wonder if there were any sdiefeftcs.


"I don't hire programmers who spell correctly, I hire programmers who misspell consistently" or how does it go


>Probably not enough to cause lasting damage unless you are extremely unlucky about where the discharge went

You don't have to be that unlucky. Discharge across your heart can stop it, no problem. It is not at all difficult for that energy to route through your chest. You were lucky.


Well... one difference between a stun gun and a CRT is that in a stun gun you usually have both terminals relatively close to each other, so there's little chance of the main pulse going deep across the chest. Working with a CRT, on the other hand, you are likely using both hands, and that makes the current go through your upper chest by default. So yeah, there is something to it.

One thing I learned as amateur EE is to always use just one hand to probe the device under test if there is any high voltage or AC involved. You put the other hand behind you so that there is no chance you create a short right through your chest.


> One thing I learned as amateur EE is to always use just one hand to probe the device under test if there is any high voltage or AC involved.

I remember in some EE course, the prof randomly stopped in the middle of a lecture and said roughly

"Ok now that you all know this theory, I need to remind you that you should not go do your own home electrical work, because you will get ahead of yourself and die from some kind of electric shock"

The implication being:

1. You're trained, but in EE, not electrical work

2. You'll think you know what to do, and even when you don't, that confidence will cloud your judgement

3. As an EE you will make enough to pay a real electrician to do your electrical work

Needless to say, in the real high-power EE courses (transmission lines, grid, etc.), this wasn't even mentioned, because it was implicitly understood that all of that stuff was in the realm of "don't physically touch this in the real world, ever, because it will kill you and it will hurt the whole time you're dying".


I did not mean doing home electrical work. That I do leave to professionals.

But there are times I have a device on my bench that is consuming 230V AC.

I have a 3kVA isolation (separation) transformer set up, which makes it safer for me to work on the device (there is no potential between the DUT and ground).

Technically, even without the transformer, the residual current protection should kick in and prevent you from getting killed between the device and ground. I have tested mine a couple of times to check whether it works. But I somehow still feel better with an isolation transformer.

But there is still the problem that I could touch different parts of the device that are at different potentials and get zapped that way. There is not much protection against that. The solution: if you must, never touch more than one place at a time.


You're describing the well-known Alanis Morissette technique for electrical work:

"I got one hand in my pocket, and the other one holding a DMM"


> One thing I learned as amateur EE is to always use just one hand to probe the device under test

I was taught to use the back of my hand to avoid gripping the damn thing and never letting go.


Same. And I do it like a superstition now.


And have someone nearby with a 2x4 to beat you off whatever you might be gripping/fell onto.


Just to add a bit more color here: most flexor muscles in your body are a lot stronger than the corresponding extensors, so when an external command comes in telling both to contract, for example from electricity in the thing you're working on, gripping wins. You can't let go. Great for grabbing a branch as you fall out of a tree. Not so great for repairing a power line.

Late edit: I've heard you can use this factoid to your advantage to avoid being eaten by an alligator or crocodile. Wrap your arms around its snout and hold on for dear life. Their jaws are strong enough to snap you in two, but not enough to force themselves open when you're holding them shut. I don't know what Step Two is, unfortunately.


Related fun fact - On many industrial robots that I've worked with, the teach pendant (the handheld controller you drive the robot around with) requires you to hold a 3-position spring-loaded switch in its middle position for the robot to operate, which requires a rather precise amount of force. Squeeze it either too loosely or too tightly and the robot disables.

The idea being that not only will dropping the pendant disable the robot, it will also disable if you accidentally touch energized equipment and your hand clenches, or (more likely) you panic and squeeze the controller too tightly.

https://us.idec.com/idec-us/en/USD/Safety-Components/Enablin...
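A toy sketch of that three-position logic, illustrative only and not any vendor's actual pendant firmware: only the middle detent enables motion, so both letting go and clenching cut power.

    # Toy model of a 3-position enabling (dead-man) switch.
    # Position names are illustrative, not from any real pendant API.
    RELEASED, ENABLED, PANIC_GRIP = 0, 1, 2

    def motion_allowed(switch_position: int) -> bool:
        # Only the middle detent enables the robot. Letting go (released)
        # and clenching in panic (fully pressed) both cut motion power.
        return switch_position == ENABLED

    assert not motion_allowed(RELEASED)    # dropped the pendant
    assert motion_allowed(ENABLED)         # deliberate middle grip
    assert not motion_allowed(PANIC_GRIP)  # involuntary clench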


> I don't know what Step Two is, unfortunately.

step two is waiting for the aforementioned 2x4 wielder "to beat ... whatever you might be gripping/fell onto"


"Mother" on speed dial maybe?


The bit about holding an alligator's mouth closed is correct and I remember Step Two from an alligator wrestling demonstration: flip it over. They lose consciousness if you can hold them in an inverted position.


So do chickens, but they're a lot easier to wrestle.


> Discharge across your heart can stop it, no problem. It is not at all difficult for that energy to route through your chest. You were lucky.

As my dad used to say "It's volts that jolts, but mills that kills!"

(you can survive a jolt of very large voltage, but only a few milliamps across the heart can kill you)
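The saying is just Ohm's law in disguise. A quick sketch with commonly quoted ballpark resistances (not precise physiology):

    # "Volts jolts, mills kills": I = V / R.
    # Resistance values are commonly quoted ballparks, not physiology data.
    V = 230.0  # mains voltage, hand to hand
    for label, R in [("dry skin", 100_000.0), ("wet skin", 1_000.0)]:
        print(f"{label}: {V / R * 1000:.1f} mA")
    # dry skin: 2.3 mA (a jolt); wet skin: 230.0 mA, far above the few
    # tens of mA where ventricular fibrillation becomes likely.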


Not just heart-stopping-dangers! I had a dishwasher whose wiring was damaged by a rat. I unplugged the dishwasher, and started working on the wires - maybe 1mm thickness. Next thing I know, I've punched myself in the face and knocked myself to the floor from something discharging, and need an ice pack on my lip.


I had a dishwasher whose wiring wasn't installed properly, and it eventually caught fire. Seeing smoke and flames come out of a running dishwasher is something I won't soon forget!


That said, direct current is "safer" than alternating current; AC makes your muscles (and heart) spasm/contract 50 or 60 times a second, like fibrillation, while DC locks everything up, like a defibrillator.

I recommend neither though.


High-frequency alternating current is the safest, in the sense that the skin effect comes into play and the current doesn't reach your muscles (or at least your heart). However, the danger of burns persists.
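For reference, the conductor skin-depth formula this alludes to; human tissue is not a homogeneous conductor, so treat the numbers as illustration only:

    # Skin depth in a conductor: delta = sqrt(rho / (pi * f * mu0)).
    # Copper shown for illustration; tissue behaves very differently.
    import math

    MU0 = 4 * math.pi * 1e-7   # H/m, permeability of free space
    RHO_COPPER = 1.68e-8       # ohm-metres, resistivity of copper

    for f in (50, 1e4, 1e6, 1e8):
        delta = math.sqrt(RHO_COPPER / (math.pi * f * MU0))
        print(f"{f:>11,.0f} Hz: {delta * 1000:.4f} mm")
    # ~9.2 mm at 50 Hz down to ~0.0065 mm at 100 MHz, which is the
    # intuition behind "HF current stays near the surface".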


The nasty thing about HF burns is that you often do not notice that something is wrong until you either smell the burnt skin or the burn gets deep enough to become acutely painful. It is perfectly possible to burn a 1mm diameter hole completely through your thumb and nail by inadvertently placing your thumb over the output connector of a CCFL inverter without noticing that it is happening.


Stopping the heart is no problem --- that's what defibrillators do, after all! The heart stops, and then starts again with all its muscles in sync.

The problem is when the shock is not enough to stop the heart but causes it to lose sync and go into fibrillation.


But that takes current, not merely voltage. It's very hard to do with the discharge of static energy.


I also touched a CRT anode when fiddling with a TV set when I was 12. I was aware of the danger and scared of that thing (it was huge!) but still forgot to discharge it. Despite the ~26kV nominal voltage there was only enough charge to make me jump to the ceiling, or at least I thought so. It's a lesson you never forget.


> Very likely, the real story is simply that the potting material used in the transformers was not up to the task. Regardless of the cause, by the time the problem appeared the Operators had already made a good return on their investment. That was good news for the Operators but bad news for those of us trying to keep their 17 year-old Star Wars games alive.

I was a big fan of the "Star Wars" consoles, and they all had a peculiar smell. Now I know what that was: cooking flybacks.


A decade ago, I taught physics to some high school students; they were amazed by the old CRT Tektronix oscilloscope. "So many knobs and controls!" "Does it only show green?" "Can it show pictures from the web?" Two of them figured it out and we used it to measure the speed of sound.

Recently, I dusted it off and let a couple of neighbor teenagers fool with it (displaying speech, music, sine waves, and Lissajous patterns). Again, they were fascinated.

Good stuff there!


>Can it show pictures from the web?

Even better, you can play Quake on it! https://www.lofibucket.com/articles/oscilloscope_quake.html


I raised an eyebrow at "pictures from the web" being asked a decade ago, but then realized a decade ago was '13/'14 and not '03/'04...


It's a pity such monitors aren't made any more.

In the late 80s, I owned a small (12-13" or so) amber monochrome CRT monitor. Lovely phosphor color, non-flickering, superb for text-based work, single composite video input.

I let it go when it didn't seem to matter much and CRTs were still everywhere.

But today I'd love to have such a screen. Commercially this would be a small niche market, but probably a stable one. And regardless of modern LCD/LED screens, such antique monochrome CRTs have properties that no modern flatscreen can match. As an in-between, an LCD/LED screen designed for ~0 lag & X/Y vector input (monochrome or color) would be nice too. Doesn't exist, afaik?


When LCD monitors/TVs came out they were a huge step backwards in terms of quality. They were relentlessly pushed on consumers because they were cheaper to make, cheaper to ship, and took up less space in stockrooms and on store shelves. I doubt the CRT will make a comeback. Even today's TVs are inferior to CRTs in some ways and so we're still compromising, but at least current LCD screens are so much better now than they were when they were replacing CRTs.


I don't remember LCDs needing any "push" in the customer segment, and they certainly co-existed for a while, until people stopped buying CRTs.

Initially they were wildly expensive, so all we could do was sit and watch; but the moment they got affordable, people just started buying them. I remember swapping my CRT for a (smaller) LCD... the extra space on my desk was so nice! And in a few years, I bought an LCD that was much larger than any CRT I could ever afford and fit on my desk...

And yes, viewing angle restriction was annoying. Also magazines would keep writing about latency, but I didn't care - I was not a gamer, I was a programmer, big size + desk space was much more important.


The sales pitch was that "HD" was all anyone was supposed to care about. What we got in exchange for HD and some extra desk space was terrible color accuracy, worse contrast and grey blacks, low refresh rates, high latency/input lag, motion blur, dead/stuck pixels, poor viewing angles, fixed resolutions, backlight bleed, and DRM. We also lost the degauss button. Kids today will never know how fun that was.

Some of that has gotten better over time, but it'd be nice if in 2024 we had screens at least as capable as what we had several decades ago.


Perhaps I just had poor ones, but my memory of CRTs is inferior blacks in practice, due to the much higher reflectiveness of the glass surface. So while technically it was a perfect black in terms of direct emissions, that didn't help much when I could see the room behind me reflected in that "black" patch. In a dark arcade it's fine, of course, but most of my monitor use is in a well-lit room. And I still dislike glossy modern monitors.


It's true that there were CRTs that performed worse than others, and that glare was an issue. CRTs could also develop some weird image issues when they started failing, so for a lot of people used to old CRTs in need of repair, getting a new LCD probably did seem like an upgrade. Very few people who'd been using a well-functioning Sony Trinitron would have felt that way, though.


As someone who had a 22" Sony Trinitron monitor on my (sagging) desk in the late 90s: no, I was glad to switch to LCD.


There's a Sony Wega that's been sitting outside my back door for 10 years because it could only handle a toddler pouring apple juice inside it twice. Should probably make a trip to hazmat drop off one of these years.

Anyway, no. The reason said toddler was able to damage it was because I replaced that sucker with an LCD TV the instant they didn't cost an arm and a leg and moved the CRT into the basement for the kids to watch.


I'd post it on your local craigslist, etc, as a free pickup. No more Trinitron tubes are being made. It's possible it has value to somebody. No point in it going to landfill if somebody can make something good out of it (and you potentially don't have to move it).


> been sitting outside my back door for 10 years

I'm thinking you missed that part :-)

He and I talked about it today and it's probably been out there for closer to 12 years.


There's still a decent chance the tube itself is salvageable.


Sadly with CRT monitors (at least today) there's a tradeoff where at low brightness (black level) settings, dark areas of the image are underexposed, and at high brightness even pitch-black emits a bit of light. The ideal solution would be a nonlinear gamma-like curve where dark areas of an image have a higher slope than the rest (but without decreasing the slope of whites compared to mid-tones).
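One way to build the kind of curve described, as a sketch; the break points and shadow gain are arbitrary illustrations, not values from any real monitor:

    # Piecewise tone curve: steep shadows, compressed mids, unity highlights.
    # Break points (0.1, 0.7) and shadow gain 2.0 are arbitrary examples.
    def tone_map(x: float) -> float:
        if x < 0.1:
            return 2.0 * x                        # shadows: slope 2.0
        if x < 0.7:
            return 0.2 + (x - 0.1) * (0.5 / 0.6)  # mids: slope ~0.83
        return x                                  # highlights: slope 1.0

    for v in (0.0, 0.05, 0.1, 0.4, 0.7, 1.0):
        print(f"{v:.2f} -> {tone_map(v):.3f}")
    # Dark detail is lifted, the mids absorb the compression, and the
    # slope of whites (1.0) never drops below the slope of the mid-tones.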


CRT black levels compared to LCD are night and day. Even a game like Genshin Impact (shit game I know, but it has good graphics) looks unimaginably better on CRT than any LCD, due to the high contrast that comes from the low black level. CRTs generally have less reflection than a glossy LCD due to some different kind of coating or glass treatment (hard to find info on it). Last summer I used a CRT that is 99% dead (99% brightness / contrast) with the windows open every day, letting sun in at a ~90 degree angle, and there was never an issue, nor with cheaper ones.


LCDs do give consistent color convergence over the whole display, and perfect vertical and horizontal lines. That counts for something, although you're certainly right about the cons.


Wait what, "HD"? Sales pitch where? I don't think I ever watched traditional ads for monitors, it was always catalogs (with dry specs) and computer magazines / friends (which always mentioned latency). And if there were any resolution-related acronyms, it would be SVGA, XGA and other weird things no one took seriously.

Here is a period shopping catalog: https://archive.org/details/computer_shopper_2000-07/page/n5... (page 5). Note you had a choice between a PerfectFlat E2 CRT (1280x1024, 17" viewable) and a ViewPanel LCD with the same spec. At that time (2000), LCDs were 2x-3x more expensive than CRTs, so most people would still go with CRTs.

HD is a relatively new term which I think appeared when LCDs came to consumer TVs? By that time, LCD monitors had long since become prevalent with PCs.


This is the correct history. LCD was pushed with only 3 points:

- sharp image

- less power consumption

- less space taken and less weight


I remember the constant eye strain from years of CRT monitor use. Blurry and flickery pieces of shit. I dumped mine in the trash and never looked back. Good riddance.

> it'd be nice if in 2024 we had screens at least as capable as what we had several decades ago

You don't know what you're talking about. Try using a high end LCD that was manufactured in the past 10 years. The screens today are way better than the shit we were forced to use in the 90s.

https://www.apple.com/pro-display-xdr/


Please edit swipes (like "You don't know what you're talking about") out of your comments here.

This is in the site guidelines: https://news.ycombinator.com/newsguidelines.html.


OK, will do, but of course it's too late to edit this one now.


Oh that's not a problem - the only thing we care about is that you'll follow the rules in the future. I appreciate the good intention.


You don't know what you're talking about.

LCDs were completely unusable until the mid 2000s, and even then they were all 60Hz/75Hz with no black frame insertion, which made any animation whatsoever, even scrolling text, blur to crap. It took me years to figure out that this was why I suddenly couldn't read scrolling text in logs on terminals.

Yes, CRTs flicker; it's a feature, it prevents motion blur. You're meant to run them at 85Hz or so, and flicker becomes less of an issue. There are two different types of flicker: the flicker fusion threshold, which is when humans still see flicker while looking directly at a flickering light source, and general perceptible flicker, which is seen more easily at the corner of your eye and in my experience goes up to 100s of Hz. The latter is an issue for all CRTs, and basically all LCDs up until 2015 or so, when they started pushing flicker-free. Before 2015, basically all LCDs flickered, typically between 100Hz-300Hz, and it was always visible, and also left artifacts on the screen during image panning (such as in a 3D game or when scrolling a map).

Yes, low-end CRTs had lots of blur, but by the late 90s any mid-range model had somewhat acceptable levels of blur, and moreover much less blur than you get the moment an image pans across an LCD. As an experiment, if you simply move an image across the screen at 15 pixels per frame on a 60Hz LCD, it will already be blurred to hell.

Those Apple monitors have a terrible grainy coating, which some monitors choose to have for some reason. There's no tradeoff there, since there are plenty of LCDs without it. They also have bad viewing angles, like all IPS panels, contrary to marketing. The only advantages they have are high pixel density and a wide color gamut, which may or may not be functional or desirable, but that's another can of worms. A wide gamut on a monitor is not automatically good.


> You don't know what you're talking about.

I used CRT for years until 2024 and I've lost count of how many Dell Ultrasharp and Apple LCD screens I've owned.

> LCDs were completely unusable until the mid 2000s

We're talking about monitors today. 2024. I switched to LCD in the mid 2000's when they became usable.

> Yes, CRTs, flicker, it's a feature, it prevents motion blur. You're meant to put them up to 85Hz or so and flicker becomes less of an issue.

It's a feature that I hated and caused a lot of pain. Maximum mine could do was 75 Hz. An 85 Hz screen was very expensive back then. And the flicker would still be crap even at 85 Hz.

> As an experiment, if you simply move an image across the screen at 15pixels per frame on a 60Hz LCD, it will already be blurred to hell.

Talking about the blurry low resolution. LCD is sharp. Not talking about pixel response time. We have 144Hz+ LCD screens now. There were no 4K CRT displays, BTW. Pixel response time on gaming LCDs is 1ms nowadays.

> Those Apple monitors have terrible grainy coating

Apple displays are glossy, not matte. The matte coating is optional on external XDR displays. Typing this comment now on my Apple 16" XDR display which is glossy, like a CRT. No grainy matte coating on this screen.


This discussion made me want to refresh my understanding of current Apple monitors and if they have anything to offer.

I looked at the Apple Studio Display glossy and matte versions at the Apple store. The matte version uses "nano coating" instead of whatever regular matte LCDs use, and is a $200 addon despite being inferior.

-1 The reflections of the glossy screen are far worse than what you get with most (all?) CRTs. This is standard with LCDs for some reason, they are just perfect mirrors that reflect even a T-shirt in a well lit room.

+1 The pixel density is superb as expected. The difference from average (~100PPI) monitors is night and day. I don't know why for the last 20 years ~100PPI has been standard. It's terrible. For some reason there are almost no monitors with high pixel density other than this Apple and some Dells.

-2 Contrary to the IPS hype of the last decade, the viewing angle is terrible. At a wide angle, (like if you are just standing anywhere near the desk and not bobbing your head down to the level of where it would be if you're sitting) the image is completely washed out. Sitting in front of the white screen at normal distance (a few feet back), the edges are grey. Just moving your head 10 degrees in any direction causes abrupt color / luminance shift on any content. This is exactly the same as all IPS monitors, from $2-$3000. Nothing unexpected here, unless you thought the $1500 price would magically fix this.

+1 I correct myself on one point in previous discussions: The glossy screen is not obscured by grids like with most other glossy LCDs. Normally, with a glossy LCD you still have a very fine grid of black on top of the image, said to be the transistors blocking a small portion of each pixel. Perhaps the high pixel density here alleviates that issue.

+0.5 The color gamut impresses with high saturation. The accuracy and real (as opposed to claimed) coverage, however was not tested. Setting the color gamut to "legacy" modes in the Apple OS menus, like BT. 709 (basically sRGB, as I didn't see an "sRGB" option) made the image too dark and washed out, worse than what you'd get with a standard sRGB IPS LCD as well as disabled the brightness setting for some made up reason.

-1 The input lag was abysmal, as bad as TVs. Which could be due to the fact that the store only has wireless Apple mice, but could also be the monitor itself or the OS or many other things. I tested 4 Apples each hooked up to one of these monitors. Out of 75 monitors I have tested in recent years, only a few (such as the Dell 2407WFP, Dell 2048WFP, and NEC LCD2070NX) come anywhere close to as laggy as this. The lag is too high for even desktop use. For reference it's twice as bad as if you played a video game on 60Hz with vsync. Since the monitor has GBs of RAM I'm leaning toward the lag being from it and not the wireless mouse.

-1 It only does 60Hz which is terrible because that causes intense motion blur. Just scrolling in a web browser drops the resolution below 640x480 and increases the blur above even the cheapest oldest CRT from the 80s.

0 The "nano coating" is worse than average LCD matte coatings. With nano coating, all of the above results were the same, except the image is obscured by what looks like rainbow colored sand, that shifts color when you move your head, just like Dell 2408WFP or Dell 2007FP, from ~2007, or later VA crap from 2012 by BenQ. The grains of sand are finer than those older monitors, but still unignorably visible at normal distance (a few feet back) on a screen of one color or white or some pale color. The glossy version is better (and $200 less).

-1 For a monitor it has terrible unwanted features like a built-in ultra-high-res webcam and microphone, with built-in chips which are allegedly just copied from a phone, reportedly with 64GB RAM but "only 2GB is used at any given moment" (whatever that means). An absolute nightmare for security; hopefully you could just mod the monitor and make it take a signal straight to the panel without any of the massive and pointless chips and hidden OS in between. I review only based on image quality, but this is too much of an atrocity to ignore; even paying $1500 for hardly anything is not nearly as bad as this.

Rating: 5/10. OLED is better, and the poor image quality is not worth the pixel density increase. You're basically getting a premium calculator screen. You're better off just getting a 3840x2160 27" OLED, which at 160PPI is a small decrease from the 218PPI of the Apple Studio Display. LCD remains a dead-end tech, no matter how much premium "engineering" you add to it. High-end CRTs are still vastly superior to this, and were half the price; they are slightly blurry, but they have superior contrast, no viewing angle issues, no lag, and no motion blur, which makes them sharper any time you view a moving image (yep, I'm aware that you people don't understand this).


Wow. I'm impressed that you were crazy enough to come back and post this detailed review that no one else will ever read.

> and no motion blur which makes them sharper any time you view a moving image (yep I'm aware that you people don't understand this)

I understand they have motion blur. That wasn't my original point.

CRT made me sick. I was forced to use them for years and they fucked my eyesight. LCD resolved that issue.

Absolute zero fucks given to all of your points here. I have to use monitors for work and need something that doesn't torture my eyeballs. CRT is hell if I need to stare at it all day.

> You're a fucking dumbass.

No amount of autistic screeching bullshit will ever convince me to go back to using a CRT for daily work. Cheers.


Well, ironically CRT has less blur than LCD in practice unless you only ever look at still images. Both strain my eyes, but CRTs would have much better focus by now if they had kept being developed. And as I said and you didn't understand, flicker on CRTs can be reduced by raising the refresh rate to whatever you want, even 200Hz (although 85Hz should be enough) on a cheap 1999 model I have, and most LEDs in light bulbs and practically every appliance are already flickering at 120Hz at most, so you're basically just picking and choosing which things you think affect you.


When 85Hz CRTs were expensive, so were LCDs. CRTs were basically free by the time LCDs became remotely usable. There was basically no reason to ever buy an LCD until way after 2010. Everyone here who thinks otherwise has some dumb embarrassing reason like they thought they could finally get laid if they made their desk less cluttered.

> And the flicker would still be crap even at 85 Hz.

Oh yes, I'm sure, and you are just magically fine with 100Hz LCD flicker that existed in most of them until 2015+ and even then only on every odd model. You clearly know what you're talking about and not just assuming the values that matter align with your beliefs. Listen, you have zero clue what you're talking about. Please at least take the time to understand this next fact:

Practically all LEDs flicker, usually at some absurdly low level like 100Hz or even 60Hz in really cheap ones. This is on 99% of modern vehicle tail lights, light bulbs, shaver LEDs, dishwashers, microwaves, practically all electronics such as even an external hard drive bay or USB stick, laundry and every appliance. And it annoys me. Why? Because it's at the corner of my eye. A human's eyes are highly sensitive to flicker, but only for stuff on the sides of their head, not directly in front. Which brings me back to CRT. When I'm using CRT, I'm looking directly at it, and this makes the flicker imperceptible even as low as 75Hz or so. Only if I turned my head to the side such that the CRT was on the edge of my vision it would become noticeable. For this reason, outside of this backwards world, one would have non flickering lights everywhere, and the only thing that flickers would be their monitor (because this is needed to prevent motion blur, the only alternative is to have a huge refresh rate like 500Hz+ and software that can keep up). And you are implying things work otherwise. They don't.

> Talking about the blurry low resolution. LCD is sharp.

Which part did you not get? I said a panning object on LCD will not be sharp or anything remotely close to it, it will be worse than the worst CRT.

> Not talking about pixel response time.

As I said LCDs have motion blur. It has nothing to do with pixel response. We had 200Hz CRTs in 1999. 144Hz LCD is still badly blurry, and these "gamer" LCDs tend to have weird bugs adding more pixel response artifacts than there would be if the "gamers" didn't design the firmware.

> Pixel response time on gaming LCDs is 1ms nowadays.

No, they are not. Why did you even edit that in. Those marketing numbers are simply false. In reality you have a different pixel response for every pixel transition (0-255), (0-40), etc. And you continue to not know what you're talking about by thinking this would reduce the motion blur. The motion blur in LCD has absolutely nothing to do with pixel response.

> [i have the glossy apple]

Okay, and the glossy ones have terrible glare. Somehow glossy LCDs always have worse glare than most CRTs, probably because the vendor only cares about the wow factor of what can be achieved by just ripping off the anti glare coating of any $2 monitor. LCDs also have worse image clarity on a glossy screen than a CRT. A plain screen set to show one color on even a glossy LCD looks gritty, while on a CRT it looks clear.


[flagged]


Whoa - I don't mean to pile on but you broke the site guidelines egregiously here. We have to ban accounts that post like this, regardless of how wrong other commenters are or you feel they are, so please don't post like this again.

If you wouldn't mind reviewing https://news.ycombinator.com/newsguidelines.html and taking the intended spirit of the site more to heart, we'd be grateful.

Edit: fortunately it doesn't look like your account has been making a habit of this, so it should be easy to fix.


Please edit swipes (like "You don't know what you're talking about") out of your comments here. I realize the GP did it first, but that doesn't make it ok to break the guidelines yourself.

https://news.ycombinator.com/newsguidelines.html


Actually people who consistently spread misinformation that powers the corporate racket that normalizes low quality technology deserve to be shamed, you just naively value politeness over correctness, which I am very aware is a huge problem with sites like HN, Slashdot, and Reddit.

Where are you every time some stupid Apple or Tesla user explains how other people are just "poor" so they won't get why their overpriced garbage is so good? Nowhere. You're just an autistic moderator and will always be a bane to society with your social pedantry.


Happily you can make your correct points while still remaining within the site rules, so there's no conflict with correctness. If, however, you continue to break the rules, then we're going to have to ban you no matter how correct you are or feel you are.

https://news.ycombinator.com/newsguidelines.html


I fully admit that they have improved, but they still do not support everything CRTs did. I haven't tried that particular monitor. I'm unwilling to pay for the $1,000 monitor stand (sold separately), or the $20 proprietary Apple Polishing Cloths, but I really should stop into an Apple store sometime to see it in action for myself.


LOL, the $1k stand. It wasn't until I bought a $200 stand that I realized how they can justify $1k for something that works properly.

These days I'm using a Chinese "professional 4k" monitor that covers 98% of DCI-P3 color gamut and the entire monitor costs less than the Apple stand. Can't compare this to any CRT I used in the 90s.

https://www.innocnmonitor.com/products-category/4k-professio...


That's because expensive models, the only ones with any remotely decent picture (maybe scratch that, as the sandy coating is abysmal) like the $1000+ Dell 2408WFP, had miles of input lag, to the point where you couldn't even control your mouse on the desktop. Like most TVs now. All LCDs just do have high input lag; you're just not sensitive to it. On the more tame LCDs, like any random TN from 2005 on, it's just a minor annoyance, although it does make mouse-based 3D games unplayable.


LCDs were way better than the CRTs consumers had. Most desktop CRTs were absolutely massive and took up the entire desk. We had to install special drawers in our desks that slid out and held a keyboard, because there wasn't room otherwise on most desks. The LCDs that replaced them were far lighter (my grandparents could actually move them for once), took up maybe three inches of the desk, and were bigger diagonally too, with more I/O accepted.

For TVs, flat screens only really started replacing CRT and plasma when they were HD. Even a 720p TV looked a lot better than a CRT and, once again, was much lighter, was much bigger, and had way more I/O (early flatscreens usually had even more than today's). CRTs sucked; that's why everyone happily threw them to the curb.


LCDs were garbage at just about everything. It was so much worse when the screens were new, but even today I have yet to see an LCD screen that can display an image with even a single solid color accurately. Correctly representing an image on a screen is their one job and they still fail at it.

See my reply here (https://news.ycombinator.com/threads?id=autoexec#39670782) for a list of just some of the ways LCD screens were worse than CRTs


> even today I have yet to see an LCD screen that can display an image with even a single solid color accurately

Have you tried professional calibrated monitors? https://www.eizo.com/products/coloredge/


I doubt it. They aren't even terribly expensive.


Well those cost tens of thousands of dollars. But even a cheap (by comparison) Apple XDR display is properly calibrated and surpasses the color gamut of professional CRT monitors. (100% of DCI-P3 https://en.wikipedia.org/wiki/DCI-P3)


> Even today's TVs are inferior to CRTs in some ways

Get a high-end OLED. Totally exceeds the real and imagined advantages of CRT.


Motion blur is an area where impulse displays like CRTs still excel compared to sample-and-hold displays like LCD and OLED.


OLED can easily be implemented with black frame insertion. You could even do it yourself from the GPU/software level, given a sufficiently high refresh rate, if the monitor units are still too stupid to provide it as a working option. I'm more worried about the terrible spectrum of light they emit, as with seemingly all LED tech, and the viewing angle.
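A toy sketch of the software-level version, assuming a display that genuinely refreshes at 120 Hz with a vsync-locked flip (pygame is used for concreteness; it cannot guarantee vsync everywhere, so this is a sketch, not a recipe):

    # Software black-frame insertion: present content on even refreshes and
    # a black frame on odd ones, halving the hold time of 60 Hz content.
    import pygame

    def draw_content(surface, frame):
        # Stand-in for real rendering: a moving bar makes hold blur visible.
        surface.fill((0, 0, 0))
        x = (frame * 8) % surface.get_width()
        pygame.draw.rect(surface, (255, 255, 255), (x, 200, 40, 80))

    pygame.init()
    screen = pygame.display.set_mode((640, 480))
    clock = pygame.time.Clock()

    frame, running = 0, True
    while running:
        for event in pygame.event.get():
            if event.type == pygame.QUIT:
                running = False
        if frame % 2 == 0:
            draw_content(screen, frame // 2)  # the actual 60 Hz content
        else:
            screen.fill((0, 0, 0))            # inserted black frame, 50% duty
        pygame.display.flip()
        clock.tick(120)                       # two refreshes per content frame
        frame += 1
    pygame.quit()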


My Sony Bravia from 2020 is visible from any angle and can be configured to do black frame insertion.


In theory yes. You’d need a sufficiently high refresh rate (1000 Hz or better I would guess) and corresponding brightness.


The human eye is only sensitive up to about 250Hz, and even then only for brief flashes, as opposed to continuous motion. 1000Hz is overkill.


What are you even talking about? An OLED running at 100Hz with BFI would already have zero motion blur. I'm not sure whether or not that's implemented yet, but it's just a matter of time before the shitshow of monitor vendors figures that out.


You need pulses shorter than 10 or 5 milliseconds to get motion blur down to the level of CRTs. 100 Hz would only be sufficient if the black frame takes up > 90% of the time, instead of the usual 50%. What causes motion blur is the frame being held lit for a period of time; BFI merely cuts that time in half. On a CRT, each pixel is only lit for 100 microseconds or so.
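The back-of-the-envelope version, with an assumed pan speed: perceived smear while eye-tracking is roughly pan speed times the time each frame is lit.

    # Perceived smear while eye-tracking ~= pan speed * lit (hold) time.
    # The 960 px/s pan speed is an arbitrary example.
    PAN_SPEED = 960  # pixels per second
    for label, hold_s in [("60 Hz sample-and-hold", 1 / 60),
                          ("60 Hz + 50% BFI", 0.5 / 60),
                          ("60 Hz + 90% black", 0.1 / 60),
                          ("CRT phosphor (~100 us)", 100e-6)]:
        print(f"{label:24s} {PAN_SPEED * hold_s:6.2f} px of smear")
    # 16.00, 8.00, 1.60, 0.10 px: only a ~90% black duty cycle or a
    # CRT-style impulse gets the smear below a couple of pixels.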


Hmm, I hadn't thought this far into it; what probably matters is how long human persistence of vision lasts.

For the LG CX OLED (https://tftcentral.co.uk/reviews/lg_cx_oled) (oddly one of the only OLEDs that has BFI), we can see they have no problem pulsing the pixels on for only 3-4ms (https://tftcentral.co.uk/images/lg_cx_oled/bfi_120_high.png) (5ms divisions), resulting in the images below. I can't tell how representative of a human they are, because I don't know if the camera exposure matches human vision, and the camera is probably wobbling at such high speeds, adding more blur than a human would see, which is why these pursuit-camera pictures are always blurry to hell. (IIRC all those LCDs from the last 6 years with half-working BFI aren't actually blurry; they are just buggily implemented, so the top and/or bottom of the screen shows double images. Each image is crisp, just doubled.)

https://www.tftcentral.co.uk/images/lg_cx_oled/pursuit_120hz...

https://tftcentral.co.uk/images/lg_cx_oled/pursuit_60hz.jpg

If you just take a 240Hz OLED and blank 3 of every 4 frames to get back a 60Hz image, I'd be half surprised if that actually looked blurrier than a CRT.

EDIT: yup, just tested one of my garbage LCDs, even at 75Hz with BFI enabled in the monitor menu: the image is perfectly crisp, it's just that there are artifacts everywhere, mainly double images. That wouldn't happen with OLED with BFI (with BFI in the hardware/firmware, obviously, to minimize the on-time beyond just dividing the frame rate).


On second thought, I think the duration of persistence of vision (or afterimages) doesn't matter so much as the velocity of the object your eye is following and the angular "resolution" your eye can perceive. If your focus moves some distance while the object is lit, and that distance is more than the angular "resolution" your eye can resolve, it will appear as a smear behind the object.

You made me think about an interesting point I long avoided going into.


The problem is that the available market for CRT screens (and it does exist!) is well supplied for now by used and other older models.

Modern screens are better in many ways, but like CRTs, there are tradeoffs and many things are not yet designed around those tradeoffs.


Not well supplied at all. None of the three sub-segments is well provided for:

a) high-quality video monitors like the Sony PVM, great for gaming and old-school video

b) slow amber screens like mentioned above

c) large TV sets with good colors (and no semi-digital stuff messing with quality like 100Hz)

None of those can be easily found, especially not in working condition.

Oh, forgot about the substantial

d) Gaming CRT with insane refresh-rate and true blacks!

Of these, I think only monochrome sets and maaaaaaybe PVM-style sets have any inkling of a chance of new manufacture. Probably only monochrome monitors; they are much simpler to get right.


I turned on a CRT recently and it was so noisy I could hear it outside, through the walls. My cat ran and hid. Did we really all collectively just tolerate that noise back then, or do these things decay over time somehow and get louder?


If it was that loud then I'd expect something was wrong with it. I can no longer hear those frequencies, but I could when I was a kid, and I don't remember it being annoying at all. Just one of those things you only really notice after it stops, like when the fridge stops humming. From my memory, our cat didn't react at all either.


I tried to show off an Apple II to my then 5 y/o daughter using a CRT monitor. She heard the sound (presumably somewhere around 15kHz), was really annoyed by it, and clasped her hands to her ears. My wife and I were completely oblivious.

We didn't see if she could acclimate to it. The monitor works okay but it's early 1990s stock so, presumably, something might be wrong w/ it, too.

I definitely remember hearing CRTs as a kid and adults telling me I was making it up.


The horizontal refresh rate (scanlines per second, how often the beam retraces horizontally) of NTSC video is indeed 15750 Hz, and PAL is 15625. Just about every CRT will emit some amount of audio at that frequency, and yes that's just at the upper end of the audible range for some (mostly young) humans.
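The figures fall straight out of lines-per-frame times frames-per-second; strictly, 15750 Hz was the original monochrome NTSC rate, with color NTSC sitting slightly lower:

    # Horizontal scan rate = lines per frame * frames per second.
    print(525 * 30)                 # 15750 Hz: original monochrome NTSC
    print(525 * 30 * 1000 / 1001)   # ~15734.27 Hz: color NTSC
    print(625 * 25)                 # 15625 Hz: PAL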


My family's Apple //e monitor has always emitted a high-pitched whine, since I was a kid. (I can't hear it anymore, but my niece tells me it's still there.) What has always worked for us is to wedge a folded-up piece of paper underneath the bezel of the screen. I don't know why this stops the sound, but it does.


You couldn't hear it over the sound of your PC fans and hard drive seeking...


CRTs emit some slight noise when you provide certain video signals; as others said, it will usually be quieter than a CPU fan. Many LCDs will make the same kinds of sounds.


Slap it on the side. Coils may be loose due to age and glue drying out.


This is a pretty good emulation:

https://github.com/Swordfish90/cool-retro-term


It's absolutely worthwhile to click the comments link. [0]

[0] https://www.jmargolin.com/mail.htm


I used to have a 1948 Hallicrafters TV and it had yet another type of high voltage supply. There was a 100 kHz oscillator driving an air-core transformer in normal (not flyback) mode. The potting had gone bad on the transformer, and I couldn't find a suitable replacement, but I did find one that had the right secondary winding. Since it was air-core I just cut the secondary off that and the primary off the original and glued them together. Worked great.


Hallicrafters made TVs? I guess I shouldn't be surprised but I've only ever heard that name in the context of ham radio and shortwave listening.


It's a shame that when the SCART connector was conceived they didn't include X and Y signals along with the RGB ones. It would be cool to have every TV in Europe also be an X-Y monitor (and allow an extra cool free visualization mode when the TV is on but you are listening to music).


The deflection circuitry of a TV or raster-scan CRT monitor is not designed to be driven by arbitrary X-Y signals; it is optimized around the sawtooth waveforms of the raster scan. One issue is that driving the deflection coils of a large-ish CRT at typical raster-scan frequencies requires considerable power (a typical X-Y display cannot draw all the "pixels" each "frame"). Another is that there are a bunch of analog tricks (funky magnetics in the "output filters" and such) where various non-linearities of the system partially cancel each other out, which works only for the typical sawtooth-ish deflection waveforms.

Another thing is that both CRT monitors and "modern" CRT TVs synchronize both their main SMPS and the EHT flyback to the horizontal frequency of the displayed image (one is tempted to say that a typical CRT computer monitor is in fact one giant, overcomplicated SMPS). The reasoning is more or less the same as why the TV framerate is related to mains frequency: reduction of artifacts that move around the image. This is also why an out-of-spec signal can damage (old, which essentially means not microprocessor-controlled) CRT displays and TVs.


Always wondered how difficult it would be to hook up one of the old multicolor vector screens to an emulator and be able to play games like Star Wars and Asteroids in true vector presentation instead of a raster approximation. Going to guess the emulator would need a completely different rendering subsystem and the computer would need a custom video out to drive the vector monitor. Maybe without a good source of working multicolor vector monitors it wouldn't be worth the effort but it would be nice to have something closer to the real deal without needing to collect a large number of cabinets.


Back in the early '00s there was such a device - the Zector ZVG. It was driven via parallel port and used a custom version of MAME.

I'm sure somebody in the VAC scene started work on a replacement after these ended production.

I had a ZVG and it was pretty good. But now I just play my Star Wars cockpit.


Oscilloscopes are mentioned a few times, but not for the main reason I assumed they would be.

Many of them include an x/y mode to use them as vector displays, even some of the cheap ~$150 ones.
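For anyone curious to try it, a small sketch that writes a stereo WAV file; fed to a scope in X/Y mode (left channel to X, right to Y), it traces a 3:2 Lissajous figure. The frequencies, phase, and levels are arbitrary choices:

    # Write a stereo WAV whose L/R channels drive a scope's X and Y inputs.
    # 300:200 Hz = 3:2 ratio; frequencies and phase are arbitrary choices.
    import math, struct, wave

    RATE, SECONDS = 48000, 5
    FX, FY, PHASE = 300.0, 200.0, math.pi / 2

    with wave.open("lissajous.wav", "wb") as w:
        w.setnchannels(2)      # stereo: left -> X, right -> Y
        w.setsampwidth(2)      # 16-bit samples
        w.setframerate(RATE)
        for n in range(RATE * SECONDS):
            t = n / RATE
            x = int(32000 * math.sin(2 * math.pi * FX * t))
            y = int(32000 * math.sin(2 * math.pi * FY * t + PHASE))
            w.writeframes(struct.pack("<hh", x, y))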


TIL: even today, they still make analog oscilloscopes [0]

They are not cheap though: almost $1000 for a 30MHz model, which is super expensive, especially compared to digital models. I assume "$150" refers to the price of a used oscilloscope.

[0] https://www.bkprecision.com/products/data-acquisition-record...


$150 for a new, cheap/small digital oscope that supports an X/Y mode. Yes, it's not truly a vector display, but accepts vector x/y input and rasterizes it. Not all of these have an X/Y mode though, so check first.

But, yes, old oscopes with CRTs are an option too.


I just want to mention Oscilloscope Music [1]. You can find other artists on YouTube. It can be played on analog oscilloscopes with an XY mode. I also recently made a little player with a Raspberry Pi [2].

[1]: https://oscilloscopemusic.com/

[2]: https://www.xythobuz.de/osci_music_player.html


The glow of CRTs creates an atmosphere that's surely missed by a lot of those who enjoyed arcades.



