Introduction

Hey hey, folks! Lumpz the Clown here, and as both a console and PC gamer, I have begun to notice both spirited and downright nasty debates online over which games look best and hold a consistent frame rate. For example, some PC purists were denouncing Mario Kart 8 shortly after its release, stating that it didn't hold a consistent 60 fps during play. Because of this, some wouldn't even give the game a try! As for me, having followed Mario Kart 8 since Nintendo announced it and being fully aware of its amazing visuals and fast-paced action, I couldn't believe what I was hearing. Denouncing a game simply because it didn't output a consistent 60 fps?!

THAT’S CRAZY, MAN!

Well, instead of getting into a flame war with these people online, I decided to look into frame rate and why it is so important to some of us. I came from an era where frame rate was not a concern when getting your game on, so what changed? As technology advanced over the years, televisions transitioned from analog to digital signals, allowing for crisper and more detailed displays. Video game systems from the Atari to the PS4 ran the gamut from RCA to HDMI outputs, resulting in cleaner and more detailed audio and video signals. But is it an anally consistent high frame rate that makes a game great? We'll get to that!

Where It All Began

Remember me talking about those awesome days at my grandparents' house, where I would spend the majority of my time on their Atari 2600? They even had it hooked up to an old box TV from the '70s with one of those converter boxes that screwed into the back! Scoff if you must, new-gen gamers, but this is in fact the best way to enjoy the Atari 2600 in all its intended glory! But what about those frames? The Atari 2600 outputs its audio and video signal through a single RCA cable that runs out of the back of the system. As for video, the Atari 2600's output has a resolution of 160×192 at 60 Hz (NTSC). In PAL regions, the output is 160×228 at 50 Hz. Playing the Atari 2600 on a modern progressive-scan TV (a feature not found on older sets) can result in missing graphics or colors. Try playing Asteroids or Space Invaders on your 52-inch, high-definition TV and let me know how it goes!

However, the Atari 2600 went down in history as one of the most successful and relevant video game consoles. Many developers to this day still laud the 2600 as the console that inspired them to begin developing in the first place. But again, was frame rate important back then? Even without taking into account the technical limitations of the day, the Atari 2600 was the platform of some of the most monstrous hits in gaming history, including Space Invaders, Defender, Berzerk and Yars' Revenge. I haven't found any evidence to indicate that the 2600's frame rate fluctuated any significant amount from 60 Hz; its output may look like garbage on a modern display, but it looks amazing on an older TV set. Although, when the concept of playing video games at home was brand-new and infectious, did it really matter? People could now play their favorite games at home without spending their hard-earned money at the arcades, which were ALSO a new and promising concept back then!


Hell Yeah! Space Invaders, bitches!

Even if the 2600 doesn't play well with newer displays, it still remains one of my favorite consoles of all time. For anyone looking to get their feet wet with the 2600, I always recommend picking up an old CRT (cathode ray tube) television set along with an F-type adapter that converts the 2600's RCA connection to a coaxial output. As I sit here now, I have my Light Sixer connected via the coax input and my Atari Flashback 2 connected to the video inputs on the same HD television that I use for my computer, and I haven't had any graphical issues to speak of. My TV is an Emerson model (Model #LC220EM2) with both NTSC and ATSC tuners built in, which may explain why it's able to display the Atari consoles as they were intended, without any missing layers or incorrect colors. The way I have them connected may also allow me to bypass my monitor's comb filter, which has been the main beef of many users trying to play on an HD monitor.

If I tried to connect the Atari Flashback 2 or any of my other plug & play devices to the 52″ Vizio that I keep in my living room, they'd look like shit! I haven't tried to connect my 2600 to this TV via coax, but if I did, I might be able to bypass the comb filter. On the other hand, since the Flashback 2 is restricted to its RCA outputs (yellow and white), it falls victim to the comb filter and isn't even worth playing on this display. On the smaller display that I use for my computer, I connect it the same way, but it isn't negatively impacted by the comb filter. It's a conundrum, folks, and is all dependent on the model of display that you are using, but I digress. Now, back to frame rate and the stir it's creating in the world of more modern gaming!

Go Forth Into the Future

Fast-forward a few years: Steam has taken the world by storm with its digital downloads, and PC gamers have evolved from the casual days of LucasArts into an extremely prevalent force in the gaming world that takes itself very seriously. The loudest argument that I have heard between console and PC fanboys is…frame rate! I didn't throw myself into these tirades, but remained an interested observer. I became curious as to why frame rate would either make or break a game. This argument was most obvious around the time of the release of the so-called “next-gen” consoles, when PS4 and XBox One owners were ripping at each other's throats but would unite in their shared hatred for the Wii U. Some claimed that it wasn't even a viable contender in the world of “competitive gaming.” The argument picked up traction afresh around the release of Mario Kart 8, when the Internet exploded with a metric ton of articles dismissing Mario Kart 8 for its supposed inability to maintain a constant 60 fps. It was also stated that the PS4 had a technical advantage over the XBox One, which drew the ire of Mike Ybarra, a Microsoft studio manager, who responded sharply to a jab from a user asking “when will the Xbone be able to hit 1080p“.

Call me naive if you must, but I never understood why anal fixation on frame rate has been elevated to the point of making or breaking a game. Then again, many users are utilizing more than one display in some cases, and others are using displays that tower over 60 inches! I have had an opportunity to speak with a few hardcore PC gamers who utilize multiple displays, and they all had one thing in common: they covet high frame rates! Ybarra's response to the so-called “troll” was that Forza runs at 1080p/60fps, and Harvey Eagle in Microsoft's UK marketing division even “defied users to see the difference” unless they were using a display over 60 inches. Even though many of the PC gamers I spoke with utilized multiple displays, all of them also stated that they use standard-sized monitors, as a larger display would hurt their frame rate and spike their resource usage while their system struggled to fill a massive screen. So in effect, is this whole frame rate argument all for naught when approached objectively? Are the loudest detractors simply trying to accomplish what is not feasible or even reasonable?


Does baby need a baba?!

How Frame Rate Impacts Us

The scientific community hasn't been able to pin down a hard threshold for how fast the human eye can perceive images, so there's arguably always room for improvement with that in mind. The NTSC broadcast standard sits at 29.97 fps, while PAL regions sit at 25 fps. According to Ybarra, Forza for the XBox One runs at 1080p/60fps, which, of course, gets along better with modern displays. But what are the benefits of an increased frame rate? Below, I will discuss what the human eye can detect and give examples from my own setup.

On average, the human eye can detect a moment of darkness lasting as little as 16 ms when looking at a lighted display, and can detect a single image lasting as little as 13 ms in a series of different images. Returning to Eagle's earlier argument, users playing the XBox One on a display smaller than 60 inches would not see a difference unless they ran two XBox Ones side by side, one connected to a 60+ inch display and the other to a smaller display. As the XBox One attempts to fill the entire large display, some users may experience a drop in frame rate, whereas the user on the smaller display would not have that same issue. This would make sense, seeing how my previous interviews with hardcore PC gamers reflected their desire to play competitively while using smaller displays. Yeah, it would be awesome to play an HD game on a larger-than-life display, but after weighing all of the factors, it hardly seems worth it. I would rather play a game flawlessly on a smaller display than on a larger one myself. I can only imagine that as the display size increases, graphical anomalies and slowdown only get worse, which seems to make a stronger case for smaller displays if it's a consistent frame rate you are looking for.

Many of us have our consoles hooked up in our living rooms, where generally the largest TV set in the house is resting, which may also be the problem. The only consoles I have hooked up to my 52″ Vizio are my PS3, XBox 360 and Wii U, which all play very nicely with it and have not resulted in any graphical issues. All of my older consoles are hooked up to a 32″ CRT TV (with the exception of my Atari consoles and Wii, which are connected to my PC's HDTV display), and none of them have any significant graphical issues or frame rate problems to speak of.
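For the numerically inclined, here's a quick back-of-the-napkin Python sketch of how long a single frame actually sits on screen at common frame rates, set next to the rough perception figures cited above. The 13 ms and 16 ms numbers are just the averages mentioned in this article, not hard scientific limits, so treat this as an illustration rather than proof.

```python
# Rough illustration: how long one frame stays on screen at common frame
# rates, next to the approximate perception figures cited in this article
# (~16 ms for a moment of darkness, ~13 ms for a single image in a series).

PERCEPTION_FIGURES_MS = {"moment of darkness": 16.0, "single image": 13.0}

for fps in (25, 29.97, 30, 50, 60):
    frame_time_ms = 1000.0 / fps  # duration of a single displayed frame
    print(f"{fps:>6} fps -> {frame_time_ms:5.1f} ms per frame")

# At 60 fps a frame lasts roughly 16.7 ms, hovering right around those
# figures, which is one way to see why 60 fps is such a popular target.
```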


I present to you…The Clown Cave!  THIS is where it all happens! :-)
Many of the systems that you see above do not output anywhere close to HD, and I don't delude myself into thinking that they would look right on my relatively ginormous 52″ Vizio downstairs. Somehow, I'm still able to capture them all beautifully with my Elgato Video Capture or OBS (with the exception of my Flashback 2, but that's an entirely different discussion). So, short of being a completely anal videophile, I really don't see why frame rate would either make or break a game for any console. From a technical standpoint, if these people really are turned off by drops in frame rate, I would recommend that they move to a smaller display, end of story. My PS3 plays very nicely with my HD monitor, but it also plays nicely with my Vizio, so for the sake of total immersion without sacrificing frame rate, I connected it to the Vizio with stunning results. If your output is subpar, I would encourage you to try a different display. Don't be that guy (or girl) who tries to play their NES on an overblown display; respect the console's limitations and make adjustments accordingly.

In the movie industry, standards have changed over time to accommodate not only newer displays, but also the director's vision for the film. In the 1920s, 24p became the de facto standard for recording footage and was easy to convert to NTSC and PAL standards where needed without any damning artifacts or noticeable speed-up. However, 24p suffers in that it cannot pick up fast camera motion very well, which shows up as either strobing or choppy motion. Advancements in technology gave way to 48p, the standard Peter Jackson used for The Hobbit trilogy. This frame rate picks up fast camera motion with twice the temporal detail of 24p, but it uses more storage space and bandwidth and requires more lighting. The last point comes down to exposure: at 48p the camera has half as much time to capture each frame as it does at 24p, so without adequate lighting the footage appears darker than the same scene shot at 24p. This has proven true with my Logitech HD Webcam, where inadequate lighting has resulted in dropped frames and pixelation in the past. If you plan to shoot in HD, make sure that you have plenty of lighting to achieve the best result.
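If you want to see that exposure trade-off in numbers, here's a minimal Python sketch. It assumes the simplest possible case, where the shutter can stay open for at most the full frame interval (real cameras complicate this with shutter angles and other tricks), so take it as a rough illustration rather than cinematography gospel.

```python
# Rough illustration of why higher frame rates demand more light:
# each frame gets at most 1/fps seconds of exposure, so doubling the
# frame rate roughly halves the light gathered per frame.

def max_exposure_ms(fps: float) -> float:
    """Upper bound on how long the shutter can stay open for one frame."""
    return 1000.0 / fps

for fps in (24, 48, 60):
    print(f"{fps} fps -> at most {max_exposure_ms(fps):.1f} ms of light per frame")

# 24 fps -> ~41.7 ms, 48 fps -> ~20.8 ms: half the exposure time per frame,
# so a scene shot at 48p needs roughly twice the light to look as bright.
```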

Even though it frustrates me at times, my Elgato Video Capture has done a tremendous job of capturing all of my retro console gameplay. The Elgato captures video in either 4:3 or 16:9 and can detect which broadcast standard to use, whether it be NTSC, PAL or SECAM. Audio is captured at 48 kHz, and video is output as H.264 or MPEG-4 (Mac only). Since none of these consoles exceed the Elgato's 640-pixel capture width, the results are superb. When capturing gameplay from my NES or SNES, I record using the 4:3 standard, but for a slight bit of flash, I record my PS2 or original XBox using the 16:9 standard for the best result. In the end, it's all a matter of knowing your equipment and finding ways to cope with its limitations that won't detract from either your experience or that of your viewers. Even with an SD capture option, I'm still able to generate videos that are consistently complimented for their quality! No magic involved, just knowledge!
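For anyone curious what those two modes work out to at the Elgato's 640-pixel width, here's a tiny Python sketch of the aspect-ratio math. The 640 figure comes from above; the resulting heights are just my own arithmetic, not something pulled from Elgato's documentation.

```python
# Simple aspect-ratio arithmetic for a 640-pixel-wide capture.
# The heights below are computed, not quoted from any Elgato spec.

from fractions import Fraction

CAPTURE_WIDTH = 640  # maximum capture width mentioned above

def capture_height(aspect_w: int, aspect_h: int, width: int = CAPTURE_WIDTH) -> int:
    """Height that preserves the given aspect ratio at the capture width."""
    return int(width * Fraction(aspect_h, aspect_w))

print("4:3  ->", CAPTURE_WIDTH, "x", capture_height(4, 3))   # 640 x 480
print("16:9 ->", CAPTURE_WIDTH, "x", capture_height(16, 9))  # 640 x 360
```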

In Conclusion


Video Game High School Said it Best: It’s All About the Game.
In conclusion, from an objective and technical standpoint, I'm still at a loss to see what the big deal is surrounding consistent frame rate if a game is otherwise good. Granted, there have been instances of bad frame rates and technical gaffes in the past (see Silent Hill: Downpour), and they can certainly detract from the experience. And that's the key word here: consistent. I have sunk about 5 hours total into Mario Kart 8 and haven't noticed any frame rate drops or spikes at all! If they do exist, they are so infinitesimal that they don't matter. To return to a game that rightfully deserves detraction, Silent Hill: Downpour suffered greatly from frame drops and freezing. Why did this suck so much for the users who complained about it? Silent Hill is a horror franchise where fluidity helps generate the unsettling feeling of isolation and being stalked. When the game is stuttering around like a drunken buffoon and tripping over its own two feet, it robs the user of that experience. The franchise began with Konami's 1999 title, Silent Hill. Some may say that this title has not aged well, but there's one thing it has that Downpour doesn't: fluidity. The player was never robbed of a terrifying moment by a drop in frame rate or other technical issues while playing Silent Hill. If there's an argument to be had regarding frame rate, it should be directed at games where the problem is most prevalent and damning, not at sweating the 1.67% frame drop (roughly one frame out of every 60) experienced in Mario Kart 8 while playing it on a cinema-sized display.

We shouldn't forget about the aspects that make a game fun, nor should we fault a game for the shortcomings of equipment that impedes its performance. Games were designed to entertain and allow the user to escape everyday life, and when we begin sweating the small stuff, we are missing the point. If an indie game could do with some brushing up, I would encourage you to get proactive and volunteer as a tester to help fix the bugs. If your game plays/looks like shit on your large display, try a smaller one and see if it improves. 9 times out of 10, it will. My own aged plug & plays look amazing on my CRT, but I also have the common sense not to hook them up to my Vizio if I want to experience them in the best way possible. Bottom line: do your homework, figure out which setup will work best with your console, and keep everything in perspective, lest you miss the entire reason why video games were created in the first place! Lumpz the Clown OUT!


About The Author

Lumpz the Clown
Contributor
Google+

Lumpz the Clown is an avid lover of horror, vidya games and ninja tactics! Completely random at times, Lumpz always loves a good laugh! Retro and Indie Games are his bag! Gaming Rebellion RULES! Be sure to check out his website for more Clowny Gamer goodness!

Comments

  • Nice write-up, Lumpz. Personally, I don’t care if the frame rate is 60 fps or 24, and I can’t tell a fluctuating 50-60 fps from a locked 60. I think it’s mostly just marketing. And I guess it works if people would not buy Mario Kart, but would buy an HD-er remake of the already-HD Last of Us from 2013.

  • Good fuckin’ article, Lumpz!! I’m in pretty much the same boat as you. People were ragging on Mario Kart 8 for dropping to like, 58 or 59 frames per second every so often, and I remember reading somewhere that it wasn’t even because of a technical issue. Rather, it had to do with how the footage was recorded for the MKTV feature. So it’s not that the system was unable to do a consistent 60 fps for Mario Kart, but rather that Nintendo decided the best way to implement MKTV (which is an AWESOME feature in and of itself) was to sacrifice a frame or two every so often. And you know what? I’m damn glad they did that!

    However, I can also see things from the point of view where people WANT a consistent framerate for certain games. For example, in a fighting game requiring split-second reactions to your opponent and quick inputs, I can see how a fluctuating framerate would be irritating. I myself am eagerly anticipating the release of the new Smash Bros., and fluidity in a game like that is always appreciated. If something like Smash Bros. or Street Fighter occasionally dipped to 30 fps from 60, it would throw a player off. BUT if the game were consistently 30 fps, I would actually be fine with that! So I’m not a stickler for HIGH framerates or anything, but having a consistent experience is certainly important for a number of different genres. You bring up an excellent point about Silent Hill: Downpour. I really liked that game, but there are certainly points where it starts chugging like a rusty train! When you’re so drawn into an experience by the atmosphere, harsh stutters can ruin it and bring you back out. ‘Oh yeah, this is just a game,’ your brain subconsciously thinks.

    That all being said, a change in framerate needs to be SEVERELY debilitating for me to be taken out of the game. For example, Dead Space for the XBox 360 chugs every so often, but those are small blips that really don’t affect my experience overall because the atmosphere and the tension are just so damn heavy. As you said, it all depends on the game. I believe that having grown up with video game systems that struggled to run with 4 sprites on screen at once has really affected classic gamers, and they’re generally much more tolerant of small hitches in video games. Conversely, lots of people that are into the current gaming generation go nuts when a game is 1080p/60fps like it’s the end-all, be-all, but the fact of the matter is that if a game is good at 720p/30fps or 480i, it’s still a good game. We shouldn’t be yipping at each other and starting flame wars over a lost frame or two.

    I feel bad for anyone who won’t play a game that runs any lower than 60 fps…they’re missing out on so many incredible experiences that constitute the pillars of video gaming.

  • @Arcade Android Haha! Right? Good example! It seems that great gameplay is overlooked in many cases and that the relatively young console pushers (Sony and Microsoft) have resorted to fps dick measuring…sad…just sad! Thanks for reading tho, buddy! :-)

  • @ReplayAbility I totally agree, Adam! They are missing out on some great experiences due to snobbery, but if they want to live blind, by all means, I won’t stop them! Some games can get away with VSync and dips in fps, but I couldn’t BEGIN to imagine such issues happening in, say, V.S. Super Mario Bros or even Lost Levels! :-/

    I guess when you are raised on a certain standard, it can be difficult to “go backwards” and play an older console. Easy for us veterans tho, huh? :-) Yay, emotionally unavailable parents for crafting me into the gamer that I am today with a deep appreciation for the roots of my passion! :-) Thanks for reading, Adam, and enjoy the rest of your evening!