I think some younger people might have never really seen a CRT. And they're positively rare now. I encountered a CRT TV in the hospital waiting room recently and was a bit startled to see one. So for those only passingly familiar, if you get the opportunity, spend a bit of time experimenting with it visually. Jiggle your eyes, look away suddenly, and then back, and try oblique angles. Maybe you'll see what they mean about "you just can't recreate that glow".
It's hard to describe but the image is completely ephemeral. All display technologies involve sleight-of-hand that exploits visual illusion and persistence of vision to some degree, but the CRT is maybe the most illusory of the major technologies. It's almost entirely due to persistence of vision. With colour TV and fast phosphors the majority of the light energy is released within a few milliseconds of the spot being hit by the beam. If you had eyes that worked at electronic speeds, you would see a single point drawing the raster pattern while varying in brightness.
A bit of TEMPEST trivia: The instantaneous luminosity of a CRT is all you need to reconstruct the image. Even if it's reflected off a wall or through a translucent curtain. You need high bandwidth, at least a few megahertz, but a photodiode is all that's necessary. The resulting signal even has the horizontal and vertical blanking periods right where they should be. Only minor processing (even by old school analog standards) is required to produce something that can be piped right into another CRT to recreate the image. I'd bet it could be done entirely in DSP these days.
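To make that concrete, here's a rough sketch of the folding step in Python, assuming the photodiode output has already been digitised at a known sample rate and the target's line/frame timing is known (the NTSC-ish numbers below are picked purely for illustration); a real capture would need much more care:

    import numpy as np

    # Assumed capture parameters: an NTSC-like raster, photodiode sampled at 10 MHz.
    SAMPLE_RATE = 10_000_000        # ADC rate, Hz
    LINE_RATE = 15_734              # NTSC horizontal line frequency, Hz
    LINES_PER_FRAME = 525
    SAMPLES_PER_LINE = SAMPLE_RATE / LINE_RATE

    def reconstruct_frame(trace, width=640):
        """Fold a 1-D photodiode trace (at least two frames long) into a 2-D raster.

        The vertical blanking interval is the darkest stretch of the trace, so we
        align the fold to it rather than needing a separate sync signal.
        """
        samples_per_frame = int(round(SAMPLES_PER_LINE * LINES_PER_FRAME))
        blank_len = int(20 * SAMPLES_PER_LINE)                  # ~20 blanked lines
        # Crude vertical sync: slide a blanking-sized window over one frame's worth
        # of samples and pick the darkest position.
        energy = np.convolve(trace[:samples_per_frame], np.ones(blank_len), "valid")
        start = int(np.argmin(energy)) + blank_len              # first visible line
        frame = trace[start:start + samples_per_frame]
        # Cut the trace into scanlines and resample each one to a fixed width.
        image = np.empty((LINES_PER_FRAME, width))
        for row in range(LINES_PER_FRAME):
            a = int(row * SAMPLES_PER_LINE)
            line = frame[a:a + int(SAMPLES_PER_LINE)]
            image[row] = np.interp(np.linspace(0, len(line) - 1, width),
                                   np.arange(len(line)), line)
        return image

Feed it a couple of frames' worth of samples and you get something you can throw at imshow(); getting a stable, pretty picture (gain control, interlace handling, averaging over many frames) is where the real work is.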
Once we lost CRTs we began this insane race for all these duct-tape solutions: all sorts of anti-aliasing, lighting, etc, and a resolution race that can only be described as pathological. Try modern games in low resolution with minimal effects on a CRT. You'll be surprised how much better they look, but also how much better they run; you don't need such a powerful rig once you turn the graphical effects down for a CRT.
> all sorts of anti-aliasing, lighting, etc, and a resolution race that can only be described as pathological
I enjoy CRT nostalgia now and then, but modern high resolution games are absolutely amazing. The blurry, low resolution, low refresh rate CRT look is fun for old games, but playing in 4K at 100+ fps on a modern monitor is an amazing experience on its own.
They're blurry in brand new ways due to how sloppy AI upscaling is.
I got very sad when my CRT monitor died. I was using a Radeon RX 380X, partly because it was one of the few cards that still had analog output.
Then I went and played lots of recent games at lower resolutions, but could turn on lots of expensive effects even with such an underpowered card, because I could do low-res with anti-aliasing disabled and no scaling and still get decent results.
But the real pleasure was playing, for example, Crypt of the NecroDancer on that screen; the game felt so easy. I eventually stopped playing after that screen died. I could never nail the timing on modern screens; the response time is not the same.
The Slow Mo Guys on YouTube have a great video showing the CRT scanlines.
https://www.youtube.com/watch?v=3BJU2drrtCM
Also don’t forget to rub your palm across the screen to collect the fuzzies that built up.
Ah, I can feel it from just reading your comment! That’s a feeling I haven’t felt in a while!
How about the smell?
You gotta give it a few slaps too, when the image isn't very clear.
Pre-Gameboy, when I was a child, my grandfather had a television— the kind that was furniture. Sometimes it would eschew modern trappings like colour and v-sync, and I would employ my Classical Vaudevillian training to set it straight with a wallop.
Put a magnet by the screen as well.
I might miss the visual aspects of CRTs, but most of them had a coil whine or some kind of crackling sound. Maybe as TVs or screens for gaming consoles they were fun, but as monitors I don’t miss the heat burning my face.
I’ve been searching for years for the old ViewSonic Mac CRT monitor that used RGB inputs. (Might have been CMYK.) You plug the DIP head into the video out and attach each individual connector to the color connectors on the back. The thing was massive, easily over 24”. Beige plastic that we all love.
Growing up, my dad was a Mac guy and he had all kinds of Apple stuff: the weird page-sized monitor, a Performa 600, a trackball mouse, an ergonomic keyboard. Granted, my father was in software and this was the early 90s, but it would _definitely_ go on to define my initial passion for computers.
I’ve been looking for this monitor so that I can restore his setup. I have his Performa, peripherals, and restored those. I just need that giant monitor he used to use.
My father passed away last year. My world has been different ever since.
It'd've been RGB :)
But you've now got me mulling over the implications of a CMYK-based subtractive colour process 'monitor'. I'm guessing the refresh rate wouldn't be too hot..!
> It'd've been RGB :)
Yeah, sounds like one of those CRT monitors that has separate BNC connectors for each of RGBHV.
Sometimes I think about the bizarre path computer technology took.
For instance, long-term storage. It would stand to reason that we'd invent some kind of big electrical array, and that's the best we could hope for. But hard drive technology (which relies on crazy materials technology for the platter and magnets, crazy high-precision encoders, and crazy physics like flying a head on a tiny spring over the cushion of air created by the spinning platter) came in and blew all other technology away.
And, likewise, we had liquid crystal technology since the 70s, and probably could have invented it sooner, but there was no need, because Cathode Ray Tube technology appeared (a mini particle accelerator in your home! Plus the advanced materials science to bore the precision holes for the electron beams in the shadow mask, the phosphor coating, the unusual deflection coil winding topology, and leaded glass to reduce x-ray exposure for the viewers) and made all other forms of display unattractive by comparison.
It's amazing how far CRT technology got, given its disconnect from other technologies. The sophistication of the factories that created late-model "flat-screen" CRTs is truly impressive.
The switch to LCDs/LEDs was in a lot of ways a step back. Sure, we don't have huge 40lb boxes on our desks, but we lost the ultra-fast refresh rate enabled by the electron beam, not to mention the internal glow that made computers magical (maybe I'm just an old fuddy-duddy, like people in the 80s who swore that vinyl records "sounded better").
Someday, maybe given advances in robotics and automation, I hope to start a retro CRT manufacturing company. The problems are daunting, though: the entire supply chain is gone (you can't even buy an electron gun; it would have to be made from scratch), and there are environmental restrictions (leaded glass probably makes the EPA perk up and take notice).
> like people in the 80s who swore that vinyl records "sounded better"
I'm not one of those people who ever thought vinyl sounded better than a properly recorded and mastered digital version, and I've always believed a high-bandwidth digital audio signal chain can recreate the "warmth" and other artifacts of tube compressors well beyond the threshold of human perception. However, a broadcast-quality, high-definition CRT fed a pristine hi-def analog RGB signal can still create some visuals which current flat screens can't. This is only controversial because most people have never seen that kind of CRT; they were incredibly rare.
I got to see one of the broadcast production CRTs made to support NHK's analog high-definition video format in the 90s directly connected to HD broadcast studio cameras, and the image quality was simply sensational. It was so much better than even the best consumer CRT TVs that it was simply another thing entirely. Of course, it cost $40,000 and only a few dozen were ever made, but it was only that expensive because these were prototypes made years before digital hi-def would be standardized and begin mass production.
In fact, I think if it was A/B compared next to a current high-end consumer flat screen, a lot of people would say that CRT looks more pleasing and overall better. For natural imagery a CRT could render the full fidelity and sharpness of a 1080 image but without that over-crisp 'edginess' today's high-end flat screens get. And those "cathode rays" can render uniquely rich and deep colors vs diodes and crystals. Of course, for synthetic images like computer interfaces and high-dpi text, a flat screen can be better but for natural imagery, we lost something which hasn't yet been replaced. I'd love to see an ultra high-end CRT like that designed to display modern uncompressed 4K 12-bit HDR digital video.
I had a music teacher that insisted analog recordings were different.
One day she said there is a simple way to prove it. Certain stringed instruments have strings that start vibrating on their own at the right note if you put them near a source of a similar sound. If you put such an instrument in front of a speaker playing from an analog source, the strings move; play the exact same music from a digital source on the same speaker, and the strings stop moving, even if to most humans it sounds exactly the same.
Sadly I never had the gear to test this; I am not a professional musician and was learning from her as a hobby (she teaches professional musicians).
If you do ever test this, and do it rigorously (i.e. using analogue and digital versions of the same recording, with no pitch inaccuracies) you'll find the strings will resonate equally well with analogue and digital recordings, all other things (volume, tuning of the instrument, etc.) being equal.
The problem is that all other things are no longer equal, and have not been for quite some time.
Retuning digital audio to 440Hz equal temperament is an industry norm now, even for (say) re-issued 1970s stuff. You just won't get modern digital versions that are the same as the analogue versions, and the equal temperament stuff thus won't pass a resonance test unless the test instrument is also equal temperament, which most string instruments of course are not.
The far easier test for amateurs nowadays is not to buy a whole string instrument, but to use pitch monitoring applications, which all too readily show when a sound is bang-on the specific equal temperament frequencies.
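For the curious, that "easier test" is only a few lines of Python: estimate the dominant pitch of a clip and report how far it sits, in cents, from the nearest A440 equal-temperament note. This is a back-of-the-envelope sketch, not how any particular tuner app actually works:

    import numpy as np

    def cents_from_equal_temperament(samples, rate):
        """Deviation, in cents, of the strongest spectral peak from the
        nearest 12-tone equal-temperament pitch (A4 = 440 Hz)."""
        windowed = samples * np.hanning(len(samples))
        spectrum = np.abs(np.fft.rfft(windowed))
        freqs = np.fft.rfftfreq(len(samples), d=1.0 / rate)
        f0 = freqs[int(np.argmax(spectrum[1:])) + 1]   # loudest bin, skipping DC
        n = round(12 * np.log2(f0 / 440.0))            # nearest ET note index
        nearest = 440.0 * 2.0 ** (n / 12)
        return 1200.0 * np.log2(f0 / nearest)

(The loudest bin is a naive pitch estimate; real tuners do better.) A digitally retuned track should read close to 0 cents on sustained notes, while an untouched analogue-era transfer will often sit noticeably off.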
Obligatory recent Fil Henley:
* https://youtube.com/watch?v=0x5dfbqE5hE
I find this dubious, since the effect she was describing is driven by resonance at a particular frequency. Since, in the example provided, the source is an amplified speaker pushing air in both cases, the outcome should be the same. The more famous test of this principle is breaking a glass, and I would be surprised if this hadn't been done with digital signal inputs.
Have you looked at any high end OLEDs lately?
> It's amazing how far CRT technology got
And China is still building, today, brand new CRT boards for CRT TVs and monitors. You can buy them on AliExpress.
I don't know if CRTs themselves are still being built though.
I'm hanging on to my vintage arcade cab from the 80s with its still-working huge CRT screen. Hope I fail before that thing (and I hope it doesn't fail anytime soon!).
> The switch to LCDs/LEDs was in a lot of ways a step back. Sure, we don't have huge 40lb boxes on our desks, but we lost the ultra-fast refresh rate enabled by the electron beam, not to mention the internal glow that made computers magical (maybe I'm just an old fuddy-duddy, like people in the 80s who swore that vinyl records "sounded better").
CRTs don't have particularly good refresh rates. There is very little delay on the output scan, but 99% of the time the delays built into rendering make that irrelevant compared to fast screens using other technologies. And the time between scans doesn't go very low.
I have no idea what you mean by internal glow.
The heated filament in many old CRTs would glow orange.
My first HD TV was a tube. The picture was flat, but it did have overscan. Compared to today's TVs, it was small. (~30")
Even though there were some aspects of the tube that were nicer, I'm not going back. I don't want the giant box, and now that I have a 70" OLED, I don't want to go back to a tiny screen.
If in the Toronto, Canada area, the television museum may be worth a visit:
* https://mztv.com
Pre-WW2 televisions seem to be quite rare:
> To put this special set in some context, there are more 18th century Stradivarius violins in existence than pre-World War II TVs and, to make it that bit rarer, this TV has only had two owners. “I’ve handled 38 pre-war tellies and this is the finest and even comes with the original invoice,” said Bonhams specialist Laurence Fisher. “It cost a huge amount and the owner must have had wealth and means…It is a very rare thing and there are collectors who would love to have it.”
* https://newsfeed.time.com/2011/04/05/do-not-adjust-your-set-...
Some images to demonstrate how retro games look on CRT vs unfiltered on a modern display:
https://x.com/ruuupu1
https://old.reddit.com/r/crtgaming/comments/owdtpu/thats_why...
https://old.reddit.com/r/gaming/comments/anwgxf/here_is_an_e...
Modern emulators have post-processing filters to simulate the look, which is great. But it's not quite the same as the real thing.
This helps validate my memories of SNES and PS1 games looking so much better when I was a kid than on an emulator today.
With 25% scanlines on PC CRTs they looked pretty close to TVs. On LCDs, forget it. Not even close, even with CRT filters.
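The basic look is trivial to approximate in software, for what it's worth: nearest-neighbour upscale, then dim the row that falls between source lines by 25%. A toy sketch (my own illustration, not any emulator's actual filter):

    import numpy as np

    def scanline_filter(img, scale=3, darken=0.25):
        """Nearest-neighbour upscale, then dim the row between source lines.

        img: (h, w) or (h, w, 3) array with values in 0..1;
        darken=0.25 gives the '25% scanlines' look mentioned above.
        """
        big = np.asarray(img, dtype=float).repeat(scale, axis=0).repeat(scale, axis=1)
        big[scale - 1::scale] *= 1.0 - darken          # every scale-th row dimmed
        return big

Real CRT shaders go much further (phosphor masks, bloom, curvature), which is part of why the LCD versions still don't fully convince.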
I played SNES and PS1 games on a CRT. I played them on LCD and OLED TVs. I can’t tell the difference.
I mean, I can tell that HDMI cables never introduce chromatic aberration, which was quite common on those old TVs when the SCART cables I used got old, and I never had an LCD screen catch fire, something which happened to me twice with aging CRT screens.
I really don’t get the nostalgia, or whatever it's called when some of the people who think it was better back then weren’t even born at the time.
Blowing things up to that size is not representative.
Back when I first started playing things on emulators we were using 12" to 20" CRTs or LCDs with much higher resolution than a TV, so whether CRT or LCD the pixels were chunkier.
None of the nostalgia is how I remember it at all.
The average CRT TV had crap color and poor brightness, and going from that, and from the flicker of 1-to-1 size NTSC on a 20-something-inch TV, to an emulated "chunkier pixel" rendition on a progressive-scan 72+ Hz 1024x768-or-higher CRT or an LCD looked way better.
Take the side-by-side pictures and zoom WAY out on a high-res screen, or go stand several feet away from your monitor so that they're the size they were designed and expected to be seen at, and the vast majority of the perceived improvement from making the CRT subpixels visible goes away. And then put them into motion - especially vertical motion - and those lines in between, and losing half the lines each frame, become more noticeable and distracting.
The 4th image there, of the yellow monster, is a good example. Even zooming to 50% on my high-res display makes the "bad" version suddenly look way sharper and more detailed, as the size starts to show how frequently "rounded dots with gaps between them" just looks like fuzziness instead of "better".
And these comparisons tend to cherry-pick and not show examples of things that lose clarity as a result of the subpixels and scanlines rather than gain it.
I'm the same way. The scanlined, subpixeled versions just look terrible to me.
The article concerns 'PVMs', not a phrase I remember in period, even though we had hundreds of Sony D1 monitors, which were the pinnacle of 'professional digital monitors'.
These were different beasts to civilian TVs, even top-of-the-line Trinitrons. They had none of the RF circuitry of a regular TV, and the inputs were typically component or, in the late nineties, digital, though not the digital we know today; that signal came down one BNC connector.
We had an outside broadcast company which had massive trucks full of screens for televising sports, concerts and public events. A new boss decided to outfit the new trucks with domestic TVs rather than the expected mega-expensive D1s. The trucks did not last long, much to the amusement of the crew. The TVs rattled themselves to pieces before they made it to their first event.
Unlike the civilian TVs, the Sony D1 monitors were designed to be repaired. We had people for that, and you could often see the innards of one of them if you went to see the engineers in their den. They generally did not need to be repaired, but if you have hundreds of the things, the odds are that a few will need a little bit of servicing.
In the studio environment they were rack mounted with air conditioning and extremely neat cabling to some type of desk where you had the buttons to choose what camera, VT or other source went to the screen. Lighting in the gallery was also controlled, so the picture you saw was the definitive picture, with no fiddling of brightness or contrast necessary. The blacks were black, which flat screens were only really able to achieve decades later with AMOLED.
In the basement with the DigiBeta tape machines we had smaller D1s in the racks, often with an adjacent oscilloscope. You could tell if the content was 'adult material' by the oscilloscope, which I always found amusing.
The magic of TV in that era was the analog nature of the CRT. The studio set was usually very battered and yet you could put a few tens of thousands of watts of lighting onto it for the cameras to show something beautiful on the D1 monitors. The advent of HD was problematic in this regard as every dent and scratch would show, along with every wrinkle and blemish on the presenter's faces.
Video games of the era were designed around the hardware; in Europe this meant 720 x 576 PAL, with lots of that image as 'overscan'. Note that JPEG was also designed for the magic of analog, with slow CPUs. You can change the lookup table in JPEG to make it work for digital and fast CPUs, but only MozJPEG does that.
You mention flickering, and most CRTs would be flickery; think of electrical shops of the era and what you would see out of the corner of your eye. Clearly you would not want this in a studio gallery lest anyone collapse with an epileptic fit. In Europe we had 50Hz rather than 60Hz, so, even with interlacing, flicker was a thing, but only in the electrical shop, not in the studio gallery. This had more to do with genlock (for analog) than phosphor persistence trickery.
Regarding the article, I am pleased that the D1 monitors of old have found a new fan base that truly appreciate them. In period we put a lot of work into our setups and, to this day, I struggle to come to terms with all of that expertise and expense having gone forever.
In broadcasting there has always been an 'old guard' that can remember the 'good old days'. I now feel like one of those fuddy duddies!!!
Absolutely. I love playing Atari 2600 games, and it seems sacrilegious to play on anything but an old-school CRT TV.
Also, I’ve heard a CRT is required for NES light-gun games like Duck Hunt. Anyone know if this is true? I don’t have an NES, and if I did, I’d hook it up to my CRT, so I still wouldn’t know the answer :)
The NES light gun works with the properties the CRT provides... Roughly what happens is ... When you pull the trigger, the next frame is all black, and then one frame per target with a white square for the target. If you're on target, the photodetector (photodiode? photoresistor?) will make a step change when the beam hits the white square, and the game code is looping to detect that. If the light comes late, it won't count; if it's not a big enough change, it won't count. If the screen was too bright during the black frame (or you were pointing at a light the whole time), it won't count.
Most modern displays are going to show the square too late, some might not be bright enough.
If you have an LED matrix and the right driving circuitry, you could probably replicate the timing, and that might work too, but I've not seen it done.
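A toy model of that loop, just to make the timing logic concrete (my own simplification in Python, not actual NES code; the two callbacks are hypothetical stand-ins for the console flipping frames and for the Zapper's photodiode):

    def zapper_hit(read_light, next_frame, num_targets,
                   threshold=0.5, polls_per_frame=240):
        """Return the index of the target the gun sees, or None.

        read_light()  -> current photodiode level, assumed 0..1
        next_frame()  -> advance the display by one frame: first all black,
                         then one frame per target with only that target white
        """
        next_frame()                          # frame 1: everything black
        dark = read_light()
        if dark > threshold:                  # pointed at a lamp, or screen too bright
            return None
        for target in range(num_targets):     # one white-square frame per target
            next_frame()
            for _ in range(polls_per_frame):  # poll while the beam scans down
                if read_light() - dark > threshold:
                    return target             # step change arrived in time: a hit
        return None

On a modern display the white square shows up a frame or two late (or not bright enough), the polling window has already passed, and the same logic reports a miss.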
More details and options for LCDs https://www.retrorgb.com/yes-you-can-use-lightguns-on-lcds-s...
Yes, light guns/light pens actually relied on vertical/horizontal sync of the CRT screen to identify the position you pointed at, so they won't work on a modern screen.
> But it's not quite the same as the real thing.
To be fair, with modern "retina" HDR displays, it should be very very close.
The most important element of the CRT look is the fast phosphor decay. This is why CRTs have so little sample-and-hold blur. No other hardware can simulate it perfectly, but a 480Hz OLED display comes close:
https://blurbusters.com/crt-simulation-in-a-gpu-shader-looks...
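As I understand it, the trick there is a rolling scan: each 60 Hz source frame is spread across several fast output refreshes with only a moving band of rows lit (plus a dim trail behind it), so every pixel is dark most of the time, like a phosphor. A simplified sketch of the idea in NumPy, my own paraphrase rather than the Blur Busters shader:

    import numpy as np

    def rolling_scan_subframes(frame, n_sub=8, decay=0.3):
        """Split one source frame into n_sub high-refresh output frames.

        Each subframe lights only a horizontal band (the 'beam'), plus a dimmer
        copy of the previous band as crude phosphor persistence. Brightness is
        boosted by n_sub so the time-average roughly matches the source; on a
        real display that boost needs HDR headroom, here it is simply clipped.
        """
        frame = np.asarray(frame, dtype=float)       # values assumed in 0..1
        h = frame.shape[0]
        band = -(-h // n_sub)                        # ceil(h / n_sub)
        subframes = []
        for i in range(n_sub):
            sub = np.zeros_like(frame)
            top = i * band
            sub[top:top + band] = frame[top:top + band] * n_sub
            prev = top - band
            if prev >= 0:                            # fading trail behind the beam
                sub[prev:prev + band] = frame[prev:prev + band] * n_sub * decay
            subframes.append(np.clip(sub, 0.0, 1.0))
        return subframes

At 480 Hz that's eight subframes per 60 Hz source frame, which is why a 480 Hz OLED can get close.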
> it should be very very close
It should. It isn't. For some obscure reason, VGA colours look different on every modern LCD.
Most modern displays are calibrated, to some reasonable level, and can easily accommodate the very limited gamut of an old CRT, especially anything supporting HDR10. I suspect this is more a case of "they need to be fudged so they're wrong" than anything.
I don't think old CRT gamut is "very limited". Only plasma screens were as good.
Plasma has great contrast and a slightly wider gamut than a CRT. Neither one has a particularly good gamut unless you're comparing to sRGB. Many current screens can do much better.
My favorite TV of all time was a 65" Panasonic plasma.
4k OLED looks really good in dark scenes, but something about turning that plasma up to its full 800 watt+ vivid mode is a completely different universe.
Practically unusable during the summer in Texas though. The efficiency is really bad, and you essentially need a custom HVAC solution if you want to put it in a proper "theater" room. The noises it made in high-brightness scenes were also a bit distracting. It had to move a lot of power very quickly.
I host monthly retro gaming meetups and we use CRT TVs that we store at the local library. Luckily, more than enough are still given away for free here. Right now we have around 25.
They're just surprisingly good when paired with a good signal and an old gaming console. I still would love to have a professional monitor but they are too pricey for me. I also need to get myself a CRT monitor.
I regret taking all my old tube monitors to Goodwill back in the mid-2000s. I saved a Commodore 1942, at least, but I sent all the rest away to die.
I appreciate the CRT modeling in emulators, but a hardware device that passes through a display signal and provides sub-frame CRT artifacting and phosphor modeling (particularly if it supported 240p) would be bitchin'.
FPGA-based devices that can do this, and quite well, do exist; they're just expensive. The RetroTINK 4K Pro is the top of the line as of this writing, but it's a $750 converter.
My son who is 12 has gotten into collecting old Nintendo. Just the other day, I managed to find one of the consumer CRT holy grails on our local equivalent of Craigslist. A B&O MX4200. It's one of the last CRTs that B&O made from the mid 00s. It has an excellent picture, nice onscreen menus for calibrating the tube, and supports NTSC over composite so he can connect his early 80s Famicom that's been modded for composite out.
Back in the 80s, as the home computer revolution got going, computers were typically wired up to small, cheap, portable TVs as a display device. These TVs used shadow masks, and the computer video output was typically modulated to a TV signal, and the TV was 'tuned' to the computer. All of this added large amounts of blur and distortion even before the signal was displayed on the TV.
By the mid 80s, it was maybe more typical to buy a dedicated CRT monitor, and the computer connected via composite, or maybe even an RGB feed to the monitor, allowing higher resolution and much improved quality.
For the well-heeled, this route also led to the holy grail, a Trinitron tube!
At each of these changes, the aesthetic of the display technology changed, but probably the best memories come from the original blurry stuff as the magical moment of actually getting something out of a home computer.
I’ve got a 27” CRT right next to my 65” LG OLED C9 (which is starting to feel ancient, too).
It sits in a cabinet that currently holds an NES, SNES, N64, GameCube, and PS2.
It doesn’t get a ton of playtime, but when my now 21- and 18-year-old sons were young, I’d play on them quite a bit (they were already retro even then), and as they got older, they would too.
My oldest is particularly fond of the retro consoles and playing on the CRT, so he’ll hop on it when he gets the itch for something retro.
I feel like there’s a charm that will never fade, not only with retro consoles, but also playing them on a CRT.
I’ll never get rid of our CRT.
My oldest son wouldn’t let me, even if I wanted to.
Those aren't old TVs.
When I was a kid, I'd go to the TV repair shops and take old "unrepairable" vacuum tube TVs (no transistors!) off their hands. At home I tried to fix them. I had no idea how to fix them. But I had a lot of fun trying.
One of the fun things was to randomly swap around the vacuum tubes and see what would happen. Very entertaining! I used to have a box full of scavenged tubes. Sadly, I eventually tossed them out, never realizing how valuable they'd be in a few years.
My mom was convinced I was going to electrocute myself, and finally made me get rid of the sets.
Your mom was probably thinking along the right lines haha.
But don't worry, TV tubes are practically worthless. I've been to swap meets for an antique radio club in my area. Sometimes, there will be a few boxes of tubes from TV sets there (400-500 tubes) and people literally can't give them away. Tubes from audio equipment are the only ones people are after.
Most TV tubes aren't too valuable. Now if your TV was made by Telefunken, that might be a different story.
Oh, I did understand what resistors, capacitors (then called condensers), and inductors did, but how they worked together to make a TV work utterly baffled me. I couldn't understand how the vacuum tubes worked, either. I didn't know anyone who had any idea how electronics worked.
Have a look at https://github.com/mausimus/ShaderGlass; it seems promising at emulating the CRT. I don't run Windows so I have not tested it myself.
You may enjoy this. My daughter is 10 years old; a couple of months ago we went to an Airbnb which had a CRT TV and some VHS tapes. She looks at a VHS tape and asks, "What is this?" She had a look on her face as if she had just found a dinosaur egg.
I collect CRT TVs. I try to limit myself to only special ones. I own a clear RCA SecureView TV from a jail cell, and several Sony Professional Video Monitors.
I started this hobby JUST as it started to become a little more popular. Terrible timing.
One of the best for shooter games because of the high refresh rate. There is virtually no lag for quick-scoping or fast play; there is very little computing happening inside.
I have a 55" Panasonic LCD. No apps or wifi. From 2003. Still works.
We have a Roku, which at some point in the last 4 years went rogue and auto-updated itself for who knows what telemetry. We almost never use it now.
I plug my laptop via HDMI and the possibilities are still there.
> I have a 55" Panasonic LCD. No apps or wifi. From 2003. Still works.
Most probably a plasma, not LCD. LCDs from that time had extremely bad colors.
If you're looking for an upgrade, try to find a Panasonic Viera VT/ST/GT50 or *60, but check for burn-in before you buy.
> I plug my laptop via HDMI and the possibilities are still there.
If the TV has HDMI CEC, then get yourself a RaspberryPi 4 (lowest memory model is enough) and LibreELEC. You'll thank me later.
My Sony Bravia XBR6 from 2008 is perfectly fine for my living room screen.
(I even programmed an old Sony remote to kludge sending the equivalent of the PS5 controller logo button, for the PS5 that the TV is plugged into, for streaming and gaming. And found the trick to get the TV to go to standby when the PS5's HDMI signal disappears, which isn't a standard feature, though waking is.)
I'll probably only upgrade if I relocate cross-country, and have Bay Area levels of money to spend on a much more expensive non-'smart' setup.
At that age, are you sure it's not plasma?
Guilty. I just love analog TVs.
I’ve hunted down a couple of old-school big screen TVs; the Fresnel lenses are awesome toys. You can melt just about anything using them as solar collectors.
I'm not really a CRT fan tbh but my neighbor was throwing away a working 24" Sony Trinitron and you don't just let one of those hit the dumpster lol
Hooked up my spare PS2 and got a light gun for it. Wish I had a way to play Duck Hunt though.
I have a 32" Sony something or other. One of the very last ones - HDMI port and does 1080i or 720p. What a find! 30 dollars, and about killed myself getting it in the house.
The plan is to have all the light gun systems hooked up to it. Being able to get 1080i out of a PS2 while having the light gun work is a challenge I have not yet surmounted.
Unfortunately, you might have trouble with light gun games. Some of the HDMI CRTs do some video processing that adds video delay. :(
If you're US based, you would probably use component video (YPbPr) to connect to your 1080i display; you can hook a ps2 light gun to the Y cable (usually green) and it should work if everything else is cool.
Wow, do I feel old right now.