Tremulous Forum
General => Troubleshooting => Topic started by: Deathrender on March 01, 2010, 10:40:26 pm
-
I know it's lag because I maintain a steady rate of 90 fps. It drops to 80 when this happens.
Video card: nVidia 9100m G
Sound card: RealTek audio
Antivirus: AVG 9.0 and Spybot Search and Destroy
Windows version: Windows 7 home premium
Laptop.
I changed the quality to the lowest I could get, closed all other programs, and I'm still laggy.
Ran as admin and in XP compatibility mode.
-
Well honestly you shouldn't be too worried, because 80 fps isn't bad lol
And it's not lag, it's FPS lag, meaning it's just the framerate dropping. I might be wrong, so open the console [~ key], type /cg_lagometer 1, and if the lagometer changes from what it normally shows, then it's network lag.
-
To the human eye there isn't a difference between 80 fps and 90, but it sounded like he was saying his fps drops AND he lags. I think the OP needs to clarify.
-
Oh I see... uhh... hmm, try setting cl_maxpackets to 125, make sure snaps is at 20, and also make sure rate is at 25000. Not sure if that will help, but when someone told me to do that it sure helped me.
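If it helps, this is roughly what that looks like dropped into autoexec.cfg -- just the values suggested here, nothing official (in the console you'd type the same lines without the comments):
  seta rate 25000          // max bytes per second the server may send you
  seta snaps 20            // snapshots per second requested from the server
  seta cl_maxpackets 125   // cap on command packets the client sends per second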
-
The default rate is 3000, set for 56k modems. Setting it to 25000 makes a massive difference, especially when a lot is going on.
And he specifically stated it wasn't an FPS problem >_>.
-
I never knew internet lag would affect FPS? I can get 400 ping on some servers and still maintain 90 fps, but on transit with 24 players my FPS sometimes drops from 90 to 80. I have 4 GB RAM, a 3 GHz dual core, and a GeForce 260 GTX... I think everyone gets FPS lag at SOME point in trem. I wouldn't worry about it.
-
I play with the cg_lagometer and fps displays on, and I occasionally get strobe-light motion (people jumping at 10 or fewer fps) while the graphics card is running at 90 fps and the lagometer isn't freaking out (packet delay is stable at 100-150 ms). So there is something going on with the server, not the client or the network, that can cause stop-action motion in crowded fights. The motion is a repeating cadence for a few seconds, so it's not something that random packet loss/delay jitter would cause.
-
I'm talking about high ping...
When multiple people are fighting (3v3 or more) my ping spikes to at least 150, but it feels more like 250.
-
The engine is not very good when many entities are around. In the following screenshot the renderer (rf+bk) needs only 2-3 ms and the GPU 8 ms (gl), so ~100 fps would be possible, but the client/cgame needs ~50 ms per frame, resulting in only 20 fps (1000 ms / ~50 ms ≈ 20). :'(
(http://img26.imageshack.us/img26/9207/shot0000a.jpg)
-
It is your network. I had that problem too. Then I bought more bandwidth.
-
The default rate is 3000, set for 56k modems. Setting it to 25000 makes a massive difference, especially when a lot is going on.
And he specifically stated it wasn't an FPS problem >_>.
I accidentally skipped that part >.<
-
Local companies suck.
And parents are stupid for using them.
Chances are I'll have to stick with the bandwidth I've got.
-
Deathrender: options -> system -> net & sound -> net datarate -> LAN/CABLE/xDSL
-
I don't fully understand your first post, but here are some general tips to help with network lag (writing this since there are many erroneous posts; a rough example config follows the list):
- rate 25000 is fine
- snaps 20 is fine (and also the default afaik); if you find a server which supports a higher sv_fps you can increase it, but those servers are very rare
- com_maxfps should be as steady as possible, since a steady framerate minimizes lag (if fps drops are frequent you might consider lowering it)
- cl_maxpackets and com_maxfps should ultimately be set to the same value, or com_maxfps should be a multiple of cl_maxpackets
- cl_maxpackets should never be above 125
Setting cl_maxpackets higher than your framerate is completely useless and will be ignored.
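Something like this in autoexec.cfg would match the above; the exact com_maxfps value is just an example, pick whatever your machine can hold without drops:
  seta rate 25000          // fine per the tips above
  seta snaps 20            // the default; only worth raising on a higher sv_fps server
  seta com_maxfps 125      // a framerate your machine can hold steadily
  seta cl_maxpackets 125   // same value as com_maxfps (or a value that divides it evenly)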
But the human eye can't tell the difference for anything more than 30 fps or something like it.
Wrong wrong wrong. I would argue against it, but this neat site already does it for me! http://www.100fps.com/how_many_frames_can_humans_see.htm
But my LCD is only 60 Hz anyway, a framerate above that doesn't make sense
Depends: if you have vertical sync enabled it's true, but then you can't set it higher anyway.
Otherwise it does make a difference. Because of the way your LCD works it doesn't draw the entire image at once; it draws it in horizontal slices.
So while you will effectively see only 60 full images per second, some slices will be more up to date than if you were running at 60 fps.
So I should enable vertical sync? I find the slices annoying and the tearing looks ugly
No! It introduces a huge lag, since it waits for the monitor to finish drawing the current frame before updating the screen.
1000 ms / 60 Hz ≈ 16.7 ms is the extra lag you will always have, at minimum, with vsync enabled on a 60 Hz monitor.
16 ms might not sound like a lot, but it's actually really noticeable and will decrease your ability to play well.
I went a bit off topic but there you go :)
-
Setting cl_maxpackets higher than your framerate is completely useless and will be ignored.
this is incorrect, cl_maxpackets higher than your fps will cause a packet to be sent every frame, which is generally the goal.
Otherwise it does make a difference. Because of the way your LCD works it doesn't draw the entire image at once; it draws it in horizontal slices.
So while you will effectively see only 60 full images per second, some slices will be more up to date than if you were running at 60 fps.
[...]
No! It introduces a huge lag, since it waits for the monitor to finish drawing the current frame before updating the screen.
that's also more or less wrong. the reason to have higher fps than your refresh rate is that the physics are dependent on fps; the magic numbers you want to hit are 41, 76, or 125 (333 is the next one, but it can actually leave you in the air long enough to be dangerous). if you have a 75 or 120 Hz monitor, and your box can handle that fps without ever dropping to 37/60 (respectively), then you will probably benefit from using vsync at those refresh rates.
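(for what it's worth, I believe those magic numbers come from the engine timing frames in whole milliseconds -- 1000/125 = 8 ms, 1000/76 ≈ 13 ms, 1000/41 ≈ 24 ms, 1000/333 = 3 ms -- combined with the per-frame rounding in the player physics; that's how the Quake 3 engine family behaves in general, I haven't checked the Tremulous source specifically)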
-
this is incorrect, cl_maxpackets higher than your fps will cause a packet to be sent every frame, which is generally the goal.
That's actually what I meant.
that's also more or less wrong. the reason to have higher fps than your refresh rate is that the physics are dependent on fps; the magic numbers you want to hit are 41, 76, or 125 (333 is the next one, but it can actually leave you in the air long enough to be dangerous). if you have a 75 or 120 Hz monitor, and your box can handle that fps without ever dropping to 37/60 (respectively), then you will probably benefit from using vsync at those refresh rates.
You never benefit from using vsync because of the huge delay it introduces. The only thing it does is wait for the entire frame to be drawn on the monitor before flippin' the buffer.
(Granted, at 120 Hz the delay is half of what it is at 60 Hz.)
The reason why you have a higher framerate than your monitor's refresh rate is that it appears more fluid.
-
partial frames a few ms faster aren't helpful, and 8ms (in the case of 120hz) is insignificant.
-
partial frames a few ms faster aren't helpful, and 8ms (in the case of 120hz) is insignificant.
By itself 8 ms would be almost insignificant for most people, but you have several sources that accumulate latency, and in the end it's a lot more.
Why have more latency when you can have less?
-
because tearing is worse than latency.
-
because tearing is worse than latency.
I used to argue for using vsync too a couple of years ago.
Then I tried playing without it once.