On CRT vs LED monitors in Playing Analog Video Games

This document is now out of date. There exist commercial HDMI adapters that tap into the GC digital port directly, which means no artifacting and no lag. There is also now a code that tweaks input polling to help compensate for monitor lag. The landscape has changed for sure. Still, I hope you enjoy the read.

 

Note: while this is specifically about SSBM, this applies (with some minor modifications) to most analog gaming, especially fighting games, shmups, or other reaction time critical games. 

 

tl;dr / ELI5 courtesy of Jib: “It feels different cuz it is different, even if it’s not laggy, and if you practice something under specific circumstances for a long time a difference in feel can really throw you off.”

 

A small quote from kadano on playing on CRTs vs LCDs: “LCD monitors have at least a few thousand times as much signal delay as CRT monitors and no real blacks because of backlight bleeding. GC and Wii don’t have anti-aliasing, the resulting jagged edges are more apparent on LCDs than on CRTs because the way CRTs output light naturally makes the picture smoother. Apart from weight/depth, power consumption and, in some places, availability, there aren’t any reasons I can think of not to go with a CRT for Melee purposes.”

 

To start off, let's get some terminology out of the way. I'm assuming 60fps, so one frame (1f) is 16.666(6-repeating) milliseconds. I'm frequently going to round that to 17ms. I'm going to say ms a lot btw, which means millisecond.
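For concreteness, here's the frame math as a quick sketch (nothing beyond the arithmetic above):

```python
# One frame at 60 fps lasts 1000/60 milliseconds.
FPS = 60
frame_ms = 1000 / FPS

# 16.666... ms, which rounds to the 17 ms used throughout this essay.
print(f"1 frame = {frame_ms:.4f} ms (~{round(frame_ms)} ms)")
```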

CRT = cathode ray tube powered monitor. The big box TVs we know and love. While some late-model CRTs do lag on analog signals, we will only be discussing “lagless CRTs”.
LCD = Liquid Crystal Display powered monitors, a matrix of liquid crystals backlit by LEDs (light emitting diodes). Flatscreen gaming monitors, including the nice new ones from Asus or BenQ.

Input lag = the lag the game takes in processing your input and updating the game state. It has been measured by others, and for a gamecube/wii it is constant regardless of what display you choose to plug your console into. A WiiU modded to play gamecube games adds some additional input lag. Your shitty netplay pc with a usb-chip from 06 might also add some. All irrelevant for our purposes here. We will strictly be speaking about playing on gamecubes/wiis, where input lag is fixed (input polling glitch aside).

Display lag = the lag it takes from when the console puts the video signal out the back to when it shows up on your monitor. This is what is pertinent to this discussion. Ever plugged a gamecube into a fancy giant flatscreen and felt like you were playing underwater? That’s display lag.

Okay, with that out of the way we can start talking about why we’re here: playing on CRT vs LCD monitors, since technology has allowed us to play on both with almost the same amount of display lag.

To start with, let’s go over how the two different types of displays work (DISCLAIMER: i will be making many oversimplifications here. If you know enough to see where I’ve simplified, you are not the target audience. Feedback on where I’m straight up wrong is appreciated though. PLEASE STOP TELLING ME THAT THE HUMAN EYE DOESN’T CAP AT 60HZ. I KNOW. I’M TRYING TO ILLUSTRATE A POINT TO AN AUDIENCE THAT ISN’T FAMILIAR WITH THE DETAILS OF PERCEPTION AT HIGH REFRESH RATES. ALSO, ALL THE GAMING WE DO IS AT 60FPS, SO YES, YOU ARE CORRECT, BUT IT’S IRRELEVANT STFU AND JUST READ THE ESSAY). 

 

Anyways, a CRT has a cathode ray tube (duh) that accelerates electrons to near the speed of light, which strike the phosphor coating on the front of the monitor. The phosphor responds to getting hit by an electron by producing a photon. In scientific terms, this is really really really cool. You have a small particle accelerator inside your tv. Actually you have three electron guns, one each for R, G, and B. That's why CRT monitors are so "boxy", and why they are such power-hogs. It takes space to fit these long tubes inside and have them aim correctly. Anyway, back to "the beam". This beam of accelerated electrons scans across the screen, line by line, to produce the image. There's then a brief period of time (VBlank) where the beams take time to get back to the top of the screen. Then it has to do it all over again. All of this occurs in 17ms. If you're using interlaced video, the beam scans across the even lines on one frame, then the odd lines on the next. If you're lucky enough to have a progressive scan CRT, then the beam updates every line every frame. Regardless, the mechanism is that the beam hits the phosphor, which creates an intense glow that gradually fades over the next 17ms, and then the beam hits it again for the next frame (or in 2 frames if interlaced).
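To make the scanning concrete, here's a toy model of when the beam reaches a given line in a progressive-scan frame. The VBlank fraction and line count here are illustrative assumptions, not measurements of any particular TV:

```python
# Toy model of CRT raster timing (progressive scan, 480 visible lines).
# The ~8% VBlank fraction is an assumption in the right ballpark for 480p;
# real timings vary by video mode and set.
FRAME_MS = 1000 / 60                      # ~16.67 ms per frame
VBLANK_FRACTION = 0.08                    # assumed, illustrative only
ACTIVE_MS = FRAME_MS * (1 - VBLANK_FRACTION)
LINES = 480

def line_draw_time_ms(line: int) -> float:
    """Milliseconds after frame start when the beam reaches a given line."""
    return ACTIVE_MS * line / LINES

print(f"top line:    {line_draw_time_ms(0):.2f} ms into the frame")
print(f"middle line: {line_draw_time_ms(240):.2f} ms into the frame")
print(f"bottom line: {line_draw_time_ms(480):.2f} ms into the frame")
```

The takeaway: the top of the screen is drawn almost immediately, while the bottom isn't drawn until nearly a full frame later.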

An LCD based monitor is a very different beast. It is essentially a matrix of liquid crystals, each of which is sent a signal each frame to emit a color. They are backlit by LEDs. If you're curious how diodes produce photons, look up LEDs. There are some rather major differences in how this plays out on the ms-level timeline:

* There is no such thing as interlacing. All pixels are updated simultaneously on each new frame. Interlaced signals must be deinterlaced before they can be displayed on a flatscreen monitor. This usually incurs a small (1f) delay, but lucky for us, melee can produce progressive scan video, so we don't need to worry about that.
* While the pixels are updated in a similar order to a CRT (from feedback from several smart people, it turns out the "draw style" depends on the monitor, with some drawing line by line, some drawing from top-left to bottom-right, and some doing it essentially all at once), there are still differences. When you hear monitor marketing talk about "response time" (omg benq has 1ms response time that means no lag right?!?), they are referring to how fast an individual pixel (LC + LED) can emit its color and get back to neutral (roughly, how fast it can turn on and off). Response time is cool, but has nothing to do with input/display lag.

This introduces some interesting effects when we talk about perceptual processing of motion/video. Remember, what we're used to thinking of as continuous video is really a series of still pictures shown fast enough to create the effect of continuous motion. Human perception of how fast these pictures need to be fed through changes how the video "feels" (now we get to subjective perception, the best part). Movies are/were shot at 24fps to give a certain "feel". Human vision caps out at about 60 fps (plz stop messaging me about this, I know it's an oversimplification, I used to work in a perceptual psychology lab. My point is, different frame-rates and different ways of drawing things give different perceptual feels), meaning we can't distinguish (or rather, we get diminishing returns on distinguishing) any additional smoothness past this threshold. At about 60 revolutions per second, helicopter blades turn into a continuous whirr. Dogs can't see video on CRTs, but they can on flatscreens. Etc etc. There's a whole field that studies how humans (and animals) perceive "video" if you find this interesting. All the gaming we do is 60fps, so we're going to work on that scale.


So, even before we consider the adapters (and oh boy, they're a thing too), video on a CRT at 480i (interlaced) vs a CRT at 480p (progressive) vs an LCD at 480p all *feel* different. This is not a placebo effect. In one model, we have a gradual sweep through the screen, with the lines (or every other line) being drawn left-to-right, lighting up, then gradually fading, then repeat. In the other, each pixel lights up, then changes to the next color for the next frame, repeat. This causes the displays to feel different. Consider that if we were to freeze in the middle of displaying a frame, the CRT would only have drawn half the lines, with the other half very faded, while the LCD would have its updated pixels in their new state, with the not-yet-updated ones still at their previous state/intensity (roughly; this also depends on things like response time and is monitor dependent). For me personally, spending too much time looking at LCD monitors gives me a headache. I can stare at my 480p CRTs for days. YMMV.

 

Alright, now let's discuss the "modern flatscreen setup". For reasons that are dubious to say the least, Nintendo, in their infinitesimal wisdom, decided to only give first model gamecubes the ability to output digital video, barely made the cables to use that port, and then ditched the idea completely for the Wii. In many circles, this has been regarded as a bad move(™). So, in order to play laglessly on a flatscreen monitor, we will need to convert the analog video the console outputs to a digital format the monitors accept. This conversion is performed by a chip called an ADC (analog->digital converter). While many LCD monitors feature the ability to take component video and convert it internally, they do so with a fairly noticeable delay. Obviously, as fighting game purists, display lag cannot stand. And so, *goldblum voice* melee finds a way. And so we did with the introduction of what I'll be calling "fastADCs". The most notable of these is the sewell wii2hdmi, although there are others of various degrees of quality/price (notably, the Framemeister XRGB-Mini was a popular, although exorbitantly expensive, choice for retrogamers in the past). You can find some extensive testing on these done by Smashgg's own Fizzi, in his article on MIOM (http://www.meleeitonme.com/this-tv-lags-a-guide-on-input-and-display-lag/) and in the follow up (https://www.reddit.com/r/smashbros/comments/26got5/quick_followup_to_miom_lag_article/). Kadano has also done some fairly extensive lag testing on different monitors/converters (here are some seriously in depth google sheets: https://docs.google.com/spreadsheets/d/1pz8j58iCBf7iK1-H6luMvQ6JKq-K9I5nQAhGmyXEE-w/edit#gid=0 and https://docs.google.com/spreadsheets/d/1wV_DkgPmzaOwfgQZn2TPAWxFfbxNPdvZQ4FziZXfTq4/edit#gid=48074256 ). But the important thing to note here is that these "fastADC + fast gaming monitor" setups have about 7-8ms of display lag (that's the total: fastADC plus monitor. The point is it's about half a frame. Kadano has told me that it's now less than 8ms, probably closer to a quarter of a frame: 4ms on a great monitor, 6ms on an average gaming monitor). Now, at first you might say, I AM SHOCKED, THIS IS UNACCEPTABLE (and I would ask you to turn your caps lock off). CRTs have display lag in the nanosecond range; 7-8ms?!?!! that's too much man(™)! But, as astute readers have picked up already, this is an unfair comparison. A CRT takes almost the full 17ms to draw the image (spending the remaining time in VBlank, moving the beams back to the top of the screen), line by line. So in reality, if we were to freeze a CRT mid-frame, we'd only have half the image updated, with the undrawn half having faded substantially. The LCD monitor would have half the lines still on the old frame, and half on the new frame, still at full intensity (or somewhere in between, which is encapsulated in the somewhat nebulous concept of "response time"). In other words, if we converted CRT "lag" into the same system used to test LCD monitor lag (which averages the time it takes to write to the top-left and bottom-right), we'd calculate that a CRT has half a frame of lag. So what does this mean?

  • About half a frame of display lag in the ADC+monitor setup will produce the same cognitive feel of lag as playing on a lagless CRT.
  • As stated and explained before, there is an empirically demonstrable difference in the *feel* of playing on an LCD vs. a lagless CRT. An LCD lights up a pixel, then quickly switches it to the color needed for the next frame. A CRT's "pixels" (which don't technically exist) reach peak intensity when the beam hits them, and then gradually fade.
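The apples-to-apples comparison above can be sketched numerically. The 7.5ms figure is just an assumed midpoint of the 7-8ms range cited earlier, and the CRT model here ignores VBlank for simplicity:

```python
# Comparing "display lag" measured the same way for both setups.
# LCD lag testers typically average the delay at the top-left and
# bottom-right of the screen; here we apply that convention to a CRT.
FRAME_MS = 1000 / 60

# CRT: top-left drawn almost immediately, bottom-right near the end of
# the frame (simplification: we ignore VBlank).
crt_top_left_ms = 0.0
crt_bottom_right_ms = FRAME_MS
crt_effective_lag_ms = (crt_top_left_ms + crt_bottom_right_ms) / 2

# fastADC + gaming monitor: roughly constant across the screen.
# 7.5 ms is an assumed midpoint of the 7-8 ms range cited above.
fast_adc_setup_lag_ms = 7.5

print(f"CRT 'effective' lag:   {crt_effective_lag_ms:.1f} ms")
print(f"fastADC + monitor lag: {fast_adc_setup_lag_ms:.1f} ms")
```

Measured by the same yardstick, the two setups come out within a millisecond or so of each other, which is the whole point of the bullet above.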

 

Now, you may be asking, how is it that these magic fastADCs are able to do what monitors can't (at least with the built-in ADCs in most gaming monitors)? A good question, my inquisitive reader! The way they do it is... they cheat. They don't care about video quality. This is not a "dig" at the makers of these converters; it is a necessity in order to provide conversion at a speed fast enough that we feel it is lagless. You can have a perfect-quality ADC, or you can have a fast ADC, but you can't have both (yet anyway). This results in several artifacts that video nerds and discerning players may see or feel.

 

So, I hope by now I have convinced you that, despite a "modern flatscreen melee setup" being no laggier (or, by a very weird way of looking at it, even less laggy) than a normal lagless CRT setup, there are objective differences in the way they will be perceived. And perception is funny. It's inherently subjective. We often can't express differences in perception with more granularity than something "feels off". But it is there. And here is where it interferes with melee (and other fighting games). I main sheik in melee. Against certain characters, she needs to reaction tech-chase, and doing so is something that is right around the limits of human reaction times. As a player, I have probably put a couple thousand hours into reaction tech-chasing on CRTs. When I play on a "modern lagless flatscreen setup", I agree it is lagless, but my reaction tech-chasing sucks. Something in the neural networks ("muscle memory") that I've been training to process video as it comes off a CRT is getting thrown off by either the LCD or the fastADC (or the combination of both), and so my reaction tech-chasing game suffers. An anecdote: I was at a vegas local tournament before an evo, at their local venue (dope venue btw). They had a mix of flatscreen and CRT setups. I agreed all were lagless. Playing with a good falcon player on the flatscreen setup, I could not seem to tech-chase him. In tournament, I insisted on playing on a CRT, and won the set. This is just an anecdote. I am not particularly good at reaction tech-chasing in tournament, as my nerves usually get to me and I start thinking about my childhood and all the people I've let down, and then I'm offstage getting killed.
Nevertheless, when tech-chasing machines like Swedish Delight (who prompted me to write this essay after playing on flatscreens and saying something felt off) profess that they feel like something's off on the flatscreen, and it's not lag, I am forced to conclude, in light of the evidence, that they are almost certainly onto something. It is a perceptual/feel issue. If we had been training on flatscreens and fastADCs for 15 years, we would likely be fine. But that is not our history. Almost all our training comes on CRTs. And so, our brains have created sophisticated neural networks (also known as muscle memory, or in scientific terms, "learning") to process this information and react quickly. When we change the means by which we provide this information to our brains, it changes the input our neural networks get. So no, you are not crazy or placeboing because the flatscreen *feels wrong*. And no, it's not that it's lagging. It's just that it's different. And that affects your play. And that's perfectly reasonable.

 

A parting thought: the gamecube (original models only) does in fact have a digital video out. Nintendo only made cables to produce component (or d-terminal, a port only Japanese TVs have) output using this port. These cables have a DAC in them (in contrast with ADC, DAC can be done extremely quickly, on the order of nanoseconds rather than milliseconds; if you are wondering why, ask yourself why you need a capture card in the first place, or how a console outputs normal analog video). It is entirely possible (and in fact schematics exist on github) to produce DVI/HDMI cables that plug into the gamecube's digital video port and produce an "un-artifacted" digital signal, which can be fed into a suitably fast gaming monitor. No, this does not eliminate the inherent differences between viewing video on an LCD vs a CRT display, but it *would* eliminate the various artifacts that fastADCs introduce. This may be a desirable middle ground between a gaming flatscreen with a fastADC and a lagless CRT with good old fashioned analog cables. I cannot vouch for the "feel" of this, as I've never played on one. If you are capable of constructing such a cable, hmu. Apparently they are in fact available here: https://zzblogs.wixsite.com/home/purchase-a-plug-n-play-2-0 and I've just added myself to the waitlist.

Update: these are now available commercially: https://www.amazon.com/GCHD-Gamecube-HDMI-Adapter-Nintendo/dp/B078ZLMQH9/ref=sr_1_1?ie=UTF8&qid=1532549994&sr=8-1&keywords=gamecube+hdmi

 

Peace, love, and empathy

 
