So I was sometimes getting a bad frame rate...

#1 Jan 16 2011 at 9:45 PM Rating: Excellent
While traveling around Cataclysm, I would occasionally see a bad frame rate in certain locations. By bad, I mean 4 FPS in Dire Maul East. I would also see momentary drops as I flew between zones. I didn't really believe it was hardware: I'm using an i5 desktop with 8 GB of RAM and a Radeon HD 5770 video card.

I looked for a good testing spot and settled on the stairs in front of the Scholomance instance. There I consistently got 10 FPS. I turned off all my addons (I use around 30) and my frame rate jumped to over 100 FPS. Now I was on to something.

A little Google searching later, I found people complaining about the Archy addon dropping frame rate. As suggested, I updated to the current beta of Archy and gained 50 FPS. With a little trial and error, I found that Carbonite was the other culprit, eating up another 50 FPS at that specific location.

So now my problem is solved. I get to keep Archy (the beta version), and I'll just do without Carbonite for now. I hopped back into Dire Maul, and everything was fine.

If you have a frame rate issue, you may want to try the standard Blizzard tip, "disable all addons", as you troubleshoot. I still love playing with addons, but I have to be realistic when specific ones severely impact gameplay.
#2 Jan 16 2011 at 11:05 PM Rating: Decent
If you have 8 GB of RAM, I assume you're running 64-bit Windows, which does have issues with software not written for it.
#3 Jan 16 2011 at 11:35 PM Rating: Decent
That seems like an odd blanket statement, since I use 64 bit W7 with better performance in every conceivable way than the 32 bit W7 I replaced.
#4 Jan 17 2011 at 6:10 AM Rating: Good
jaysgsl wrote:
That seems like an odd blanket statement, since I use 64 bit W7 with better performance in every conceivable way than the 32 bit W7 I replaced.


In 99.9% of cases, yes, W7-64 is better than W7-32.

However, there's always that annoying 0.1% of software/situations where you get glitches/bugs/etc. That's the way it always is with computers, really.
#5 Jan 17 2011 at 6:33 AM Rating: Excellent
ViralVD wrote:
If you have 8 GB of RAM, I assume you're running 64-bit Windows, which does have issues with software not written for it.


Yes, it's 64-bit Windows 7. My goal is to avoid the severe swings in frame rate, and so far I think the addons were the culprit. Perhaps someone else will be prompted to check theirs if they're having any issues.
#6 Jan 17 2011 at 6:34 AM Rating: Excellent
Ghost in the Machine
Fight the system, man. Shouldn't have to settle for 50 fps when you can get 100.

Even if the eye can't register that much.

Fight the system! FREEEEEDOOOOOM!

/braveheart
____________________________
Please "talk up" if your comprehension white-shifts. I will use simple-happy language-words to help you understand.
#7 Jan 17 2011 at 6:38 AM Rating: Good
***
3,441 posts
The Honorable dadanox wrote:
ViralVD wrote:
If you have 8 GB of RAM, I assume you're running 64-bit Windows, which does have issues with software not written for it.


Yes, it's 64-bit Windows 7. My goal is to avoid the severe swings in frame rate, and so far I think the addons were the culprit. Perhaps someone else will be prompted to check theirs if they're having any issues.


This is partly why I've always been minimalist with addons.

I run Armory, Titan Panel, and GoGoMount.

I'm about to nix Armory soon, because it doesn't even freaking work properly anyway. Half the time it doesn't correctly keep track of what you have or what you're getting, and displaying gear on other characters doesn't work (it claimed a character I had logged on an hour ago was naked, even though I know he's not), etc., etc., etc.

And Titan Panel (I think it's Titan Panel doing this; I don't see why Armory or GoGoMount would) is causing Hyjal elementals to show up as skinnable instead of minable.
#8 Jan 17 2011 at 6:43 AM Rating: Excellent
jaysgsl wrote:
That seems like an odd blanket statement, since I use 64 bit W7 with better performance in every conceivable way than the 32 bit W7 I replaced.

You're jumping on someone for making a blanket statement?
#9 Jan 17 2011 at 9:30 AM Rating: Decent

Now, assuming you're using a 60 Hz LCD, you should cap your frame rate so it stays at a steady 60 :)
#10 Jan 17 2011 at 9:30 AM Rating: Decent
EbanySalamonderiel wrote:
jaysgsl wrote:
That seems like an odd blanket statement, since I use 64 bit W7 with better performance in every conceivable way than the 32 bit W7 I replaced.

You're jumping on someone for making a blanket statement?


But all Mac users really are sheep!!!

I rated you up for a very well-placed jab...

#11 Jan 17 2011 at 9:31 AM Rating: Default
Mazra wrote:
Even if the eye can't register that much.


Hah! Yeah, not even close to that frame rate. I guess the thought is that if you boost it up higher, then when it dips it will still be faster than the eye can notice? Oh well.

As for the 32 vs. 64-bit debate, nowadays good hardware can make a 64-bit system emulating 32-bit run smoother than a native 32-bit system. Out-and-out better. As for software that refuses to work, that's just part of the progression of technology.

I cannot run TIE Fighter on my computer anymore =( but that's the price you pay for upgrading past Windows 98!
#12 Jan 17 2011 at 9:35 AM Rating: Good

I don't buy the "your eyes can only see 27 fps" argument.

I can clearly tell the difference between a game playing at 30 FPS and a game playing at 60. I cannot make out individual frames, but I can clearly perceive the smoother motion. Back in the days of CRTs, I could even tell the difference between 60 FPS and 150 FPS. I could never really tell the difference between 60 and 80 or so, though, so I concluded that the higher the frame rate, the larger the difference has to be in order to be detectable.

There is another potential benefit to being able to manage super-high frame rates. 120 Hz displays allow for 3D by alternating images on each refresh, essentially giving 60 FPS to each eye. If your computer can't handle a smooth 120 FPS, it can't handle 3D. Personally, I don't care for 3D in its current form, but it's worth mentioning.
#13 Jan 17 2011 at 9:54 AM Rating: Good
Eye frame rate is complicated. In darkness with one source of direct light, the eye is sometimes as slow as 4 FPS. I did a research paper on it a long time ago.

Regardless, ~30 FPS has been an industry standard for a long time, and it has only recently started improving because technology has reached the point where it is ridiculously easy to surpass that number. Most movies in the theater are still shown at 24 FPS, though.

The point is not that the eye doesn't feel a difference between 24 and 100 FPS, but that the brain fills in the missing frames, and after a split second of viewing the human brain can no longer discern each frame. For games, I do believe, as you said, that 60 FPS is optimal, because anything less gives much less fluid motion, and in the case of 3D your point is definitely valid.
#14 Jan 17 2011 at 1:21 PM Rating: Default
iSheep statement aside....
The 30 FPS ******** is just misconstrued information. It's something oft chirped by people who won't put the effort into actually thinking about what they're saying.
Yes, the eyes have a 'natural' refresh rate somewhere between 20 and 45 FPS, assuming 'average' viewing conditions.
However, a display showing new images at 60 Hz/FPS/whatever will indeed look smoother than one displaying at 30, 40, or 59 FPS.
Why? Common sense, man! Your eyes don't have a magical 'sync' tool that locks them to the refresh rate of your monitor. It's VERY likely that frames are drawn 'between' refreshes. This causes you to see jerky motion, aka choppy animations.
On a more primal level, if the screen draws the animation 120 times, but your eye is only refreshing 60 times, that's an animation that looks twice as smooth to your eyes.
#15 Jan 17 2011 at 1:23 PM Rating: Sub-Default
Now, iSheep, there's a big difference between stating that '64-bit operating systems have trouble with 32-bit programs' and 'Macs don't suck when comparing features and price to other brands.'
#16 Jan 17 2011 at 1:28 PM Rating: Decent
Well, another thing, Jay: when talking about film versus gaming, few games use motion blur the correct way. Very few games do it at all; with the exception of a few Xbox games, I cannot even think of any, actually.

If you freeze-frame a 24 FPS movie, a single frame literally is just that: 1/24 of a second's worth of light, so any fast movement creates a blur. Our brains identify that blur based on context clues and previous frames. This fake fluidity helps our brains piece the frames together as one solid, rolling motion.

If you freeze-frame a game, it is almost always a perfect still. It isn't an exposure over time that your computer re-renders; it's an instantaneous snapshot, complete with weird aliasing, sharp edges, and static lifelessness.

While I understand most of the workings of computers, I still don't understand how this concept eludes game design. I imagine it either has something to do with the inability to predict future movements (resulting in too much lag between input and output), or it is somehow just too much of a memory hog.

#17 Jan 17 2011 at 4:22 PM Rating: Good
VERY VERY VERY VERY LONG POST EXPLAINING MOTION BLUR!
TL;DR: Games don't have motion blur because they don't use cameras.



You're over-thinking it.
You actually mentioned WHY games aren't usually 'motion blurred'; you just don't realise it.
See, a video game is a LOT different than a film.
You have 'actors' which are nothing but groups of polygons (also called 'models'). Each thing you see in the game, each and every little tiny thing, is an 'actor.'
I'll use a simple 3D modeling exercise to try to help explaining why motion blur in games is different than in movies.
You take a 2D plane, make it stretch to infinite proportions. This is the 'ground.' You take a simple sphere and put it 'on' the 'ground.' The apostrophes are very important.
Now, you add a keyframe at frame 0, then move to frame 100 and move the sphere 5,000 units on the positive X axis. The program will interpolate the steps to get from 0 to 5,000, so each of those 100 frames will have the sphere moving 50 units.
Now, you render it. Rendering is the act of taking simple polygons and applying scene lighting, anisotropy, anti-aliasing, and textures, as well as camera effects.
What happens is the computer 'draws' the picture 100 times, once for each frame. It draws a separate picture for every frame. This is how EVERY computer-generated image works: drawing one frame at a time. Think of a cartoon or a flip-book if you're not computer-art inclined.

OK, let's take a break from computers and talk cameras. A camera operates by opening its 'eye', aka the lens, and collecting data about the light it sees. Every object you can see reflects (bounces) and refracts (bends) light. If an object absorbs red light, you won't see it as red.
The camera's eye records EVERY bit of light it sees. If you have a stationary object, nothing changes the entire time the eye is open. If the object is moving, the eye records the light bouncing off it at every position it passes through. So, in our case, we're rolling a ball on the ground: if the eye is open for one second, the picture it produces will show EVERYTHING that was there in that second. Most point-and-shoot cameras are open for 1/250th of a second. If there's movement, you'll get a 'blur'.
In film, this turns into what you know and love as motion blur.

On the computer, on the other hand, the 'eye' of the 'camera' (the camera, in 3D terms, being the point from which the user's view originates) is only open a bajillionth of a second. This isn't something you can change; it's an instant 'snapshot' of what you're looking at.
Motion blur, in a game, is done through post-processing after-effects. This is something the program draws AFTER it renders the frame. A good example that most games have is the old 'bloom' lighting, which is just an exposure and contrast filter.
SOME games with lower acceptable FPS, such as Xbox and PS3 games (console games TEND to run at 45 FPS when possible, despite the PS3 being able to display up to 120; the exception is racing games), are able to process the after-effect without affecting FPS.
There's also an older, simpler method of post-processing where the image isn't fully cleared between frames, which you can see if you pick up a copy of Need for Speed: Underground 2. It turns out looking terrible and doesn't add the illusion you're seeking.
#18 Jan 17 2011 at 4:54 PM Rating: Decent
Yes, I understand. However, I'm confused by your thinking that the way 3D environments are rendered is somehow distinct from film. It isn't. If you watch a computer-generated scene from a movie, or any professionally pre-rendered scene for TV/film/computer/whatever use, you will find motion blur in every frame, because it is what our eyes prefer. 3D rendering is just a new medium to work in, but anything past a logo requires motion blur.

Good movie editing software has tools for purposefully adding motion blur. Good GIF animators and even good Flash effects use motion blur as well. The common denominator is that these are all pre-rendered.

I have used 3D rendering programs before; when you do a final render, setting the blur rate is an integral part of getting good results. It simply depends on how clean you want it to look. Anything meant to resemble real life incorporates blur. Setting up planes/polys/etc. is beside the point. You mentioned cartoons, but many of them incorporate 'blur' as well, through distortion of moving objects.

Rendering a ball moving across a plane at 50 units per frame is a great example, because a good render will incorporate blur so that each frame depicts a slight amount of movement for fluidity. It will make that one frame resemble ten more frames, which has been my point all along. Motion blur enhances the effectiveness of a single frame, which is why 24 FPS works for films. You CAN shoot a movie with a super-high-definition, high-FPS camera, then go through and cut frames until it's down to 24 FPS. It will look awful.

Video games are one of the only multimedia mediums that do not correctly integrate motion blur, which is why frame rates higher than the eye can process actually can help; I'd suppose your eyes blur it for you. It must be that games are not powerful enough to render blur in real time (part of the reason rendering a scene in 3DSM/Maya takes many seconds per frame), and so they prefer to take the hit on graphics to boost performance. I'm not disagreeing with you, I'm just saying.
#19 Jan 17 2011 at 6:17 PM Rating: Decent
You've dabbled, and you certainly know your film, but your take on the 3D world, though not horrendous, is a little lax.
As I said, all 'blurs' are done post-process, aka as filters (thank the gods I can stop using layman's terms).
3D is how I make the money I make, so I know a bit about it. IBF "Oh noes, they know what I do, now they can stalks me."
Again, just look at NFSU2 to see what a bad post-processing blur looks like in a game.
Now, look at NFS: Shift to see what a decent blur looks like in a game. It really is done separately from the rendering process; it's not quite as simple as you think.
When you go into Max (who uses Maya now that it's the same company?) and render a scene, you have your V-Ray filters you can put on the camera for post-processing. You can move a ball 10,000 leagues in 2 seconds and not get any rendered blur.
When you render your ball at our optimal 50 GUPF (generic units per frame), you'll see the perfectly crisp animations you don't like.
If you add a 'motion blur' filter, you'll notice something. Each frame goes through the lighting process, then the scene is drawn, THEN the camera filters are applied. That last step, post-processing, is where your favorite magic happens.
It's not very game-friendly, tbh, especially when a game has a LOT of very simple models in it.
#20 Jan 17 2011 at 6:46 PM Rating: Good
Ghost in the Machine
I like pie.

Mmm, pie.
#21 Jan 17 2011 at 6:57 PM Rating: Good
Mazra wrote:
I like pie.

Mmm, pie.


This thread is now about favorite types of pie.

I like chocolate and apple.
#22 Jan 17 2011 at 7:27 PM Rating: Decent
I prefer a good Cream Pie.

<.<
>,>
....
#23 Jan 17 2011 at 9:27 PM Rating: Decent
Your take on Cream Pie was not backed by evidence and is therefore void!

I, on the other hand, love Cream Pie.
#24 Jan 17 2011 at 10:53 PM Rating: Good
Ghost in the Machine
This thread just took a slightly disturbing turn.
#25 Jan 17 2011 at 11:02 PM Rating: Decent
Isn't disturbing normal to you fun Scandinavian types?
#26 Jan 17 2011 at 11:17 PM Rating: Good
Ghost in the Machine
Yes, but I thought that someone might find it disturbing, and I wanted to beat that someone to the second derail.