Video Games and Movie-Quality CGI
I was playing a relatively new FPS the other day and I was amazed at how far video game graphics *haven't* come since I first played Doom over a decade ago. I'm a programmer by trade, and I'm surprised that no one has yet wedded CGI software from the movie industry with video gaming. Okay, yeah, CGI uses lots of CPU. Big expensive render farms. Yadda yadda. But we've been hearing that story for years. Hasn't hardware performance improved? Hasn't the software become more efficient? Nowadays you can link together a couple of off-the-shelf motherboards to create a mini-supercomputer for just a few thousand dollars. Why isn't anyone exploiting that fact to sell movie-quality gaming platforms to high-end users? It isn't like they lack for customers.
I'm just amazed that movie producers routinely generate CGI footage that is indistinguishable from reality, and yet the crap on my TV, showing just 525 lines, looks more realistic than anything on my 19-inch LCD monitor with its insanely high resolution. WTF?
You have to remember that rendered CGI is just that: pre-rendered CGI. A game has to finish multiple calculations before the next frame can even be rendered. If you ever watch previews of upcoming games, claiming that everything is running on the console using its built-in physics and other engines, it looks much better than the final product you play. That's because the shipped game also has to handle the near-infinite possibilities of user input. So to sum this up: with CGI, you have highly detailed frames rendered with all of the CPU power devoted to that one purpose. With gaming, human input gets channeled into multiple mathematical models, then rendered into an image that includes the new changes, then transferred to your screen. All of this occurs anywhere from 30 to 60 times per second, and in the case of a game like Forza 2, which took its physics engine very seriously, the game iterated through its physics model 360 times per second while still drawing 60 frames per second.
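The input-physics-render pipeline described above is usually structured as a fixed-timestep loop. Here is a minimal sketch in Python (the `World` class and its methods are illustrative stand-ins, not any real engine's API), assuming a 60 Hz render rate with six physics substeps per frame, which works out to the 360 physics ticks per second claimed for Forza 2:

```python
RENDER_HZ = 60
PHYSICS_STEPS_PER_FRAME = 6                     # 6 * 60 = 360 physics ticks/sec
PHYSICS_DT = 1.0 / (RENDER_HZ * PHYSICS_STEPS_PER_FRAME)

class World:
    """Toy stand-in for a game's simulation state."""
    def __init__(self):
        self.time = 0.0
        self.physics_steps = 0

    def apply_input(self, player_input):
        pass  # a real game folds controller/network state into the model here

    def step(self, dt):
        # One physics tick: advance the simulation by a small fixed interval.
        self.time += dt
        self.physics_steps += 1

    def render(self):
        return f"frame at t={self.time:.4f}s"

def run_frame(world, player_input=None):
    # 1. Apply user input before simulating.
    world.apply_input(player_input)
    # 2. Step the physics model several times per rendered frame.
    for _ in range(PHYSICS_STEPS_PER_FRAME):
        world.step(PHYSICS_DT)
    # 3. Only then render the resulting state.
    return world.render()
```

Running the physics at a fixed, higher rate than the renderer keeps the simulation stable and deterministic even when frame times vary, which is the point the poster is making: all of that work has to finish before a single frame can be drawn.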
Quote:
I think you are underestimating the CPU cycles involved here. Yes, of course hardware has improved. But the scenes you are referring to in movies may take 1,000 CPUs 12 HOURS to render! So even if you built a $5,000, 10-CPU parallel system of some kind, it still would not be able to render in real time like the movies. The movies don't have to react every 1/60 of a second to user input (from 32+ players in some networked games), render the changes, and paint the screen! It just comes down to horsepower. Plus, for a super high-end system to be viable, you have to have games, and to get games, you have to have momentum. No gaming studio is going to commit millions of man-hours to a platform with a very limited audience. BTW, are you REALLY comparing the original DOOM to today's games? ...Come on, man... Check out the new Crytek game, Crysis. Their new engine is insane. http://en.wikipedia.org/wiki/Crysis
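A quick back-of-envelope check shows just how wide the horsepower gap in that post is. All numbers here are illustrative assumptions: take the post's figure of 1,000 CPUs running for 12 hours, and further assume (hypothetically) that the scene is a 1,000-frame sequence:

```python
# Movie-side cost, using the figures quoted in the post above.
MOVIE_CPUS = 1000
RENDER_HOURS = 12
SEQUENCE_FRAMES = 1000          # assumed length of the rendered sequence

cpu_seconds_per_frame = MOVIE_CPUS * RENDER_HOURS * 3600 / SEQUENCE_FRAMES
# = 43,200 CPU-seconds for a single movie frame

# Game-side budget: the hypothetical $5,000, 10-CPU home rig must
# deliver a frame every 1/60 of a second.
GAME_CPUS = 10
budget_per_frame = GAME_CPUS * (1 / 60)   # ~0.167 CPU-seconds per frame

shortfall = cpu_seconds_per_frame / budget_per_frame
print(f"The home rig is {shortfall:,.0f}x short of movie quality")
```

Under these assumptions the 10-CPU system falls short by a factor of roughly 260,000, which is the "it just comes down to horsepower" argument in concrete terms.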
Quote:
However, Age of Conan, due to be released next year, looks seriously, seriously good. May just cause me to build a mighty beefy PC to play it...
Crysis Demo
Such complaining... I remember hitting the Go button on my render farm of 20 AMKLY 386s (it cost me over $100,000) and waiting three WEEKS for it to spit out a two-minute 3D Studio animation. The stuff is awesome these days... and all on a $300 console!
Copyright 2025 Pelican Parts, LLC - Posts may be archived for display on the Pelican Parts Website