Hey hey, folks! Lumpz the Clown here, and as both a console and PC gamer, I have begun to notice both spirited and downright nasty debates online regarding which frame rates look best and how consistent they are in certain games. For example, some PC purists denounced Mario Kart 8 shortly after its release, stating that it didn’t hold a consistent 60 fps during play. Because of this, some wouldn’t even give the game a try! As for me, since I had been following Mario Kart 8 from its announcement by Nintendo and was fully aware of its amazing visuals and fast-paced action, I couldn’t believe what I was hearing. Denouncing a game simply because it didn’t output a consistent 60 fps?!
Well, instead of getting into a flame war with these people online, I decided to look into frame rate and why it is so important to some of us. I came from an era where frame rate was not a concern when getting your game on, so what changed? As technology advanced over the years, televisions transitioned from analog to digital signals, allowing for crisper and more detailed displays. Video game systems from the Atari to the PS4 ran the gamut from RCA to HDMI outputs, resulting in cleaner and more detailed audio and video signals. But is it an obsessively consistent high frame rate that makes a game great? We’ll get to that!
Where It All Began
Remember me talking about those awesome days at my grandparents’ house, where I would spend the majority of my time on their Atari 2600? They even had it hooked up to an old box TV from the ’70s with one of those converter boxes that screwed into the back! Scoff if you must, new-gen gamers, but this is in fact the best way to enjoy the Atari 2600 in all its intended glory! But what about those frames? The Atari 2600 outputs its audio and video signal through a single RCA cable that runs out of the back of the system. As for video, the Atari 2600’s output has a resolution of 160×192 at 60 Hz (NTSC). In PAL regions, the output is 160×228 at 50 Hz. If the Atari 2600 is played on a modern TV with progressive scan (a feature not found on older sets), you can end up with missing graphics or colors. Try playing Asteroids or Space Invaders on your 52-inch high-definition TV and let me know how it goes!
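For the curious, those two video standards can be compared with a bit of arithmetic. This is just a back-of-the-envelope sketch in Python using the figures quoted above; exact visible-line counts vary depending on how you measure them:

```python
# Compare the Atari 2600's NTSC and PAL video modes using the numbers above:
# 160x192 at 60 Hz (NTSC) versus 160x228 at 50 Hz (PAL).

def frame_period_ms(refresh_hz):
    """How long a single frame stays on screen, in milliseconds."""
    return 1000.0 / refresh_hz

ntsc_pixels = 160 * 192  # visible pixels per NTSC frame
pal_pixels = 160 * 228   # visible pixels per PAL frame

print(f"NTSC: {ntsc_pixels} px per frame, a new frame every {frame_period_ms(60):.2f} ms")
print(f"PAL:  {pal_pixels} px per frame, a new frame every {frame_period_ms(50):.2f} ms")
```

PAL trades a slower refresh for more vertical resolution, which is part of why the same cartridge could look and feel slightly different between regions.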
However, the Atari 2600 went down in history as one of the most successful and influential video game consoles. Many developers to this day still laud the 2600 as the console that inspired them to start developing in the first place. But again, was frame rate important back then? Even setting aside the technical limitations of the day, the Atari 2600 was the platform for some of the most monstrous hits in gaming history, including Space Invaders, Defender, Berzerk and Yars’ Revenge. I haven’t found any evidence that the 2600’s frame rate fluctuated significantly from 60 Hz; its output may look like garbage on a modern display, but it looks amazing on an older TV set. And when the concept of playing video games at home was brand-new and infectious, did it really matter? People could now play their favorite games at home without spending their hard-earned money at the arcades, which were ALSO a new and promising concept back then!
Even if the 2600 doesn’t play well with newer displays, it still remains one of my favorite consoles of all time. For anyone looking to get their feet wet with the 2600, I always recommend picking up an old CRT (cathode ray tube) television set along with an F-type adapter that converts the 2600’s RCA plug to a coaxial connection for the TV’s antenna input. As I sit here now, I have my Light Sixer connected via the coax input and my Atari Flashback 2 connected to the video inputs on the same HD television that I use for my computer, and I haven’t had any graphical issues to speak of. My TV is an Emerson model (Model #LC220EM2) with both NTSC and ATSC tuners built in, which may explain why it’s able to display the Atari consoles as they were intended, without any missing layers or incorrect colors. The way I have them connected may also allow me to bypass my monitor’s comb filter, which has been the main beef for many users trying to play on an HD monitor. If I tried to connect the Atari Flashback 2 or any of my other plug & play devices to the 52″ Vizio that I keep in my living room, they’d look like shit! I haven’t tried to connect my 2600 to that TV via coax, but if I did, I might be able to bypass the comb filter. The Flashback 2, on the other hand, is restricted to its RCA outputs (yellow and white), so it falls victim to the comb filter and isn’t even worth playing on that display. On the smaller display that I use for my computer, I connect it the same way, but it isn’t negatively impacted by the comb filter. It’s a conundrum, folks, and it all depends on the model of display that you are using, but I digress. Now, back to frame rate and the stir it’s creating in the world of more modern gaming!
Go Forth Into the Future
Fast forward a few years: Steam has taken the world by storm with its digital downloads, and PC gamers have evolved from the casual days of LucasArts into an extremely prevalent force in the gaming world that takes itself very seriously. The loudest argument that I have heard between console and PC fanboys is…frame rate! I didn’t throw myself into these tirades, but remained an interested observer, curious as to why frame rate would either make or break a game. The argument was most obvious around the time of the release of the so-called “next-gen” consoles, when PS4 and XBox One owners were ripping at each other’s throats, but would unite in their shared hatred for the Wii U. Some even claimed that it wasn’t a viable contender in the world of “competitive gaming.” The argument picked up traction afresh around the release of Mario Kart 8, when the Internet exploded with a metric ton of articles dismissing Mario Kart 8 over its supposed inability to maintain a constant 60 fps. It was also claimed that the PS4 had a technical advantage over the XBox One, which drew the ire of Mike Ybarra, a Microsoft studio manager, who responded sharply to a jab from a user asking “when will the Xbone be able to hit 1080p“.
Call me naive if you must, but I never understood why fixating on frame rate has been elevated to the point of making or breaking a game. Then again, many users are utilizing more than one display in some cases, and others are using displays that tower over 60 inches! I have had an opportunity to speak with a few hardcore PC gamers who utilize multiple displays, and they all had one thing in common: they covet frame rates! Ybarra’s response to the so-called “troll” was that Forza runs at 1080p/60fps, and Harvey Eagle in Microsoft’s UK marketing division even “defied users to see the difference,” unless they were using a display over 60 inches. Even though many of the PC gamers I spoke with utilized multiple displays, all of them also stated that they use standard-sized monitors, as a larger display would hurt their frame rate and strain their resources as their system struggled to fill it. So in effect, is this whole frame rate argument all for naught when approached objectively? Are the loudest detractors simply trying to accomplish what is not feasible or even reasonable?
How Frame Rate Impacts Us
The science community hasn’t pinned down a hard upper limit on how many frames per second the human eye can perceive, so there is arguably always room for improvement. The NTSC and ATSC broadcast frame rate standard sits at 29.97 fps, while the PAL region sits at 25 fps. According to Ybarra, Forza for the XBox One runs at 1080p/60fps, which, of course, gets along better with modern displays. But what are the benefits of an increased frame rate? Below, I will discuss what the human eye detects and examples from my own setup.
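That 29.97 figure isn’t a typo, by the way: NTSC color video runs at exactly 30000/1001 frames per second. Here’s a quick sketch of what those standards mean in per-frame terms:

```python
from fractions import Fraction

# Exact broadcast frame rates: NTSC color is 30000/1001 fps (~29.97),
# PAL is an even 25 fps; 60 fps is the target gamers argue about.
ntsc_fps = Fraction(30000, 1001)

def frame_ms(fps):
    """Duration of one frame in milliseconds at the given frame rate."""
    return 1000.0 / float(fps)

print(f"NTSC:   {float(ntsc_fps):.2f} fps -> {frame_ms(ntsc_fps):.2f} ms per frame")
print(f"PAL:    25.00 fps -> {frame_ms(25):.2f} ms per frame")
print(f"Target: 60.00 fps -> {frame_ms(60):.2f} ms per frame")
```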
On average, the human eye can detect a moment of darkness lasting as little as 16 ms when looking at a lighted display, and can detect a single image lasting as little as 13 ms in a series of different images. Returning to Eagle’s earlier argument, users who play the XBox One on a display smaller than 60 inches would not see a difference unless they ran two XBox Ones side-by-side, with one connected to a 60+ inch display and the other to a smaller display. As the XBox One attempts to fill the entire large display, some users may experience a drop in frame rate, whereas the user on the smaller display would not have that same issue. This would make sense, seeing how my previous interviews with hardcore PC gamers reflected their desire to play competitively while using smaller displays. Yeah, it would be awesome to play an HD game on a larger-than-life display, but after weighing all of the factors, it hardly seems worth it. I would rather play a game flawlessly on a smaller display than on a larger one. I can only imagine that as display size increases, graphical anomalies and slowdown only get worse, which makes a stronger case for smaller displays if it’s a consistent frame rate you are looking for. Many of us have our consoles hooked up in our living rooms, where the largest TV set in the house generally rests, which may also be the problem. The only consoles I have hooked up to my 52″ Vizio are my PS3, XBox 360 and Wii U, all of which play very nicely with it and have not resulted in any graphical issues. All of my older consoles are hooked up to a 32″ CRT TV (with the exception of my Atari consoles and Wii, which are connected to my PC’s HDTV display), and none of them have any significant graphical issues or frame rate problems to speak of.
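Putting those thresholds next to actual frame times makes the stakes clearer. The sketch below is my own back-of-the-envelope arithmetic, not from any of the sources above: at a steady 60 fps, each frame lasts roughly as long as the 16 ms darkness threshold, and a single dropped frame doubles that gap.

```python
# Perception thresholds quoted above, in milliseconds.
DARKNESS_THRESHOLD_MS = 16.0  # shortest detectable flash of darkness
IMAGE_THRESHOLD_MS = 13.0     # shortest detectable single image in a stream

def lingering_frame_ms(fps, dropped_frames=1):
    """If the renderer misses `dropped_frames` consecutive frames, the last
    good frame lingers this long instead of the usual 1000/fps ms."""
    return (1000.0 / fps) * (1 + dropped_frames)

print(f"steady 60 fps frame: {1000.0 / 60:.1f} ms")
print(f"one drop at 60 fps:  {lingering_frame_ms(60):.1f} ms")
print(f"one drop at 30 fps:  {lingering_frame_ms(30):.1f} ms")
```

By this rough math, even one dropped frame at 60 fps leaves the previous frame on screen for about 33 ms, comfortably above both thresholds, which may be part of why frame drops bother people more than a lower-but-steady rate.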
In the movie industry, standards have changed over time to accommodate not only newer displays, but also the director’s vision for the film. From the 1920s on, 24p was the de facto standard for recording footage, and it could be converted to the NTSC and PAL standards where needed without damning artifacts (NTSC conversion uses a pulldown technique, while PAL conversion typically involves a slight speed-up). However, 24p suffers in that it cannot capture fast camera motion very well, showing up as either strobing or choppy motion. Advancements in technology gave way to 48p, which was the standard used by Peter Jackson for The Hobbit trilogy. This frame rate can capture fast camera motion with twice the temporal detail of 24p, but it uses more data storage space and bandwidth and requires more lighting. The last point stems from the fact that the camera has half as much time to expose each frame as it does at 24p; without adequate lighting, the footage will appear darker than if the same scene were shot at 24p. I’ve seen this with my Logitech HD Webcam, where inadequate lighting has resulted in dropped frames and pixelation in the past. If you plan to shoot in HD, make sure that you have plenty of lighting to achieve the best result.
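The lighting trade-off can be made concrete with a little arithmetic. The sketch below assumes a 180-degree shutter (the shutter is open for half of each frame period, a common film convention, and purely my assumption here since shutter angle isn’t specified above):

```python
def exposure_ms(fps, shutter_angle_deg=180.0):
    """Per-frame exposure time for a given shutter angle.
    At 180 degrees, the shutter is open for half of each frame period."""
    return (shutter_angle_deg / 360.0) * (1000.0 / fps)

# Doubling the frame rate halves the exposure time, so each frame
# gathers half as much light -- hence the need for brighter sets at 48p.
print(f"24p: {exposure_ms(24):.2f} ms of exposure per frame")
print(f"48p: {exposure_ms(48):.2f} ms of exposure per frame")
```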
Even though it frustrates me at times, my Elgato Video Capture has done a tremendous job of capturing all of my retro console gameplay. The Elgato captures video in either 4:3 or 16:9 and can detect which standard to use, whether it be NTSC, PAL or SECAM. Audio is captured at 48 kHz, and video is output in either H.264 or MPEG-4 (Mac only) format. Since none of my retro consoles exceed the Elgato’s 640-pixel capture width, the results are superb. When capturing gameplay from my NES or SNES, I record using the 4:3 standard, but for a slight bit of flash, I record my PS2 or original XBox using the 16:9 standard for the best result. In the end, it’s all a matter of knowing your equipment and finding ways to cope with its limitations in ways that won’t detract from your experience or that of your viewers. Even with an SD capture option, I’m still able to generate videos that are consistently complimented for their quality! No magic involved, just knowledge!
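For reference, a 640-pixel-wide capture works out to the following frame heights at those two aspect ratios. This is simple ratio math on my part, not a spec sheet from Elgato:

```python
def height_for_width(width_px, aspect_w, aspect_h):
    """Frame height that matches the given aspect ratio at a fixed width."""
    return width_px * aspect_h // aspect_w

print(f"4:3  at 640 wide -> 640x{height_for_width(640, 4, 3)}")
print(f"16:9 at 640 wide -> 640x{height_for_width(640, 16, 9)}")
```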
We shouldn’t forget about the aspects that make a game fun, nor should we fault a game because we’re using equipment that impedes its performance. Games were designed to entertain and allow the user to escape everyday life, and when we begin sweating the small stuff, we are missing the point. If an indie game could do with some brushing up, I would encourage you to get proactive and volunteer as a tester to help fix the bugs. If your game plays or looks like shit on your large display, try a smaller one and see if it improves. 9 times out of 10, it will. My own aged plug & plays look amazing on my CRT, but I also have the common sense not to hook them up to my Vizio if I want to experience them in the best way possible. Bottom line: do your homework, figure out which setup works best with your console, and keep everything in perspective, lest you miss the entire reason why video games were created in the first place! Lumpz the Clown OUT!