Thread: Determining the rate/speed settings for your server

    AdamR (Administrator)
    Join Date: Mar 2004
    Cardiff, South Wales [UK]



    Drek over at HLDS forums has written a very comprehensive guide on how you should determine the following:
    • sv_minrate - minimum amount of data to send for each player (in bytes/second)
    • sv_maxrate - maximum amount of data to send for each player (in bytes/second)
    • sv_minupdaterate - minimum number of event updates to send per second to each player
    • sv_maxupdaterate - maximum number of event updates to send per second to each player

    You can find his guide here: setting rates for small home servers (formerly running a lag free server).

    I've also come up with a calculator that processes his rules according to the connection speed you have and how many player slots you intend to host on it. You can find this here: Drek's HLDS / SRCDS rate calculator (mirror).
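    To give a feel for the kind of arithmetic such a calculator does, here is a minimal sketch. This is NOT Drek's exact rule set (see his guide and the calculator above for those); it just assumes the simple rule that each player slot gets an equal share of your upload bandwidth, capped at the engine's 25000 bytes/second:

    ```python
    # Rough sketch of a rate calculation. NOT Drek's exact rules; it assumes
    # each player can be sent at most an equal share of your upload bandwidth.

    def suggest_maxrate(upload_kbit_per_sec, player_slots, cap=25000):
        """Suggest an sv_maxrate value in bytes/second."""
        upload_bytes = upload_kbit_per_sec * 1000 / 8  # kbit/s -> bytes/s
        per_player = upload_bytes / player_slots       # equal share per slot
        return int(min(per_player, cap))               # engine cap of 25000

    # Example: a 2048 kbit/s (2 Mbit) upload shared across 12 slots.
    print(suggest_maxrate(2048, 12))   # prints 21333
    ```

    A fast connection with few slots will simply hit the 25000 cap, which is why small servers on good lines can just set sv_maxrate to the maximum.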

    If you're running a LAN server (sv_lan is 1) then the server has control over client rates too. Every player is forced to use the server rate defined in sv_lan_rate. This should always be at the maximum 25000, unless you're running a LAN server over Hamachi or something.


    Next I'd like to tell you a bit about how quickly your server will run. You define this yourself with a cvar called sys_ticrate. Hold on there Timmy, don't go changing it just yet. It's important you understand what kind of impact the value of this will have not only on the speed of your game server, but also on how it will consume resources on your system, and on what happens when your system inevitably becomes overloaded.

    sys_ticrate is pretty much the server-side equivalent of the fps_max setting you have client-side. It determines how many times the event queue is processed within 1 second. Quite simply, the higher the tick rate, the faster your server will appear to run, because the delay between each run of the event queue is much shorter.

    As this diagram shows, the queue is handled at each line along the bar.

    • With sys_ticrate set to 10, this divides 1 second into 10 timeslots creating a delay of at least 100ms between handling the event queue.
    • With sys_ticrate set to 100, this divides 1 second into 100 timeslots creating a delay of at least 10ms between handling the event queue.
    • With sys_ticrate set to 1000, this divides 1 second into 1000 timeslots creating a delay of at least 1ms between handling the event queue.
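    The arithmetic behind those bullets is simply one second divided into sys_ticrate timeslots; a quick sketch:

    ```python
    # Minimum event queue delay implied by a given sys_ticrate:
    # 1 second divided into sys_ticrate timeslots.

    def min_delay_ms(sys_ticrate):
        """Minimum delay in milliseconds between event queue runs."""
        return 1000.0 / sys_ticrate

    for rate in (10, 100, 1000):
        print(rate, min_delay_ms(rate))  # 100.0ms, 10.0ms, 1.0ms respectively
    ```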

    You can set your tick rate anywhere from 16(ish) up to 1000! It normally defaults to 100, but may have trouble passing 60. See the bottom of this post to find out why.

    As you increase sys_ticrate, you will see the effect of this simply by looking at your ping on the scoreboard. Even the calculation of your ping is handled through the same event queue. So at a tick rate of 100, the minimum ping you could possibly have is 10ms, as that is the minimum time in which your ping calculation will be handled. Likewise at a tick rate of 1000, the minimum ping you could have is 1ms -- although you will rarely see this even on a server connected directly via Gigabit Ethernet.

    You can also see this server side by using the stats command, and looking at the FPS value. Spam the command 5 times or so within a second to get a rough idea of how well your server is actually performing at that moment in time.

    So in short, a higher tick rate means a faster server.

    Why you should not run your server as fast as possible

    Before you go setting all your servers to 1000, you need to consider these impacts.
    • Doubling the tick rate means double work for the server processor - even when nothing is going on
    • Higher tick rate means more data to update each player with, straining your Internet connection further
    • The big one: Players will notice heavy lag when your server processor is maxed out
    • The bigger one: A tick rate too high will severely screw up physics calculations in the game engine

    Let me explain that 3rd point in detail. So, you have a server happily running at say a tick rate of 500. A few players join at the start of a map, not much happening, still running at 500. The problem: more players start to join, players reach busy parts of the map, the server processor gets busy. The server processor can no longer keep up to run at a tick rate of 500, so it has to slow down to a tick rate of 50.

    The problem is not the fact that the server has to slow down, it's the fact that players will notice the server slowing down.

    If you were only running your server at a tick rate of 100 in the first place, there would only be a 50 tick drop in performance. This is quite small: the event queue delay goes from 10ms to 20ms, so players would only incur around a 10ms ping increase during the busy period of the map. However, as players were comfortable with your 500 tick rate performance, a drop to 50 raises the delay from 2ms to 20ms. That is a tenfold slowdown, far more noticeable because players were used to such snappy performance. This is when your players will complain about how laggy your server is, and often play elsewhere instead.
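    To put numbers on this, a small sketch comparing the extra event queue delay players see when the server drops to 50 ticks under load, assuming the delay is simply 1000 / ticrate milliseconds:

    ```python
    # Extra event queue delay when the server drops from its configured
    # tick rate down to 50 under load (values in milliseconds).

    def delay_ms(ticrate):
        return 1000.0 / ticrate

    def extra_delay_under_load(configured, degraded=50):
        """How much longer each tick takes once the server slows down."""
        return delay_ms(degraded) - delay_ms(configured)

    print(extra_delay_under_load(100))  # 10.0 ms extra: a 2x slowdown
    print(extra_delay_under_load(500))  # 18.0 ms extra: a 10x slowdown players feel
    ```

    The absolute delays are small either way; it is the relative slowdown from what players are used to that gets noticed.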

    This is why it is ideal to set a tick rate that is both fast, but also sustainable most of the time. I rarely exceed a tick rate of 200, as a 5ms event queue delay is hardly noticeable, and players only get a very small spike when the server processor gets maxed out at times. It is also up to you to benchmark your server performance during busy parts of maps with plenty of players connected - preferably full. Benchmarking your server while it is empty is like building a fresh new road without knowing how much traffic will be using it.

    Now to elaborate on the 4th point in detail.

    Not all of the game physics handled by the GoldSrc engine run synchronously; different calculations appear to run at different intervals. One side effect noticed during testing is that moving brush entities will appear to move more frequently than point entities if the server tick rate is high enough. This can result in point entities ending up inside a brush entity, for example a player ending up inside a lift floor. This will usually result in a moving brush entity killing players and NPCs just by "looking at them funny".

    Evidence of this can be easily reproduced on the map "Turret Fortress". Start it up, change the tick rate to 1000, and start the waves. Nearly every NPC will be crushed by the lift moving them up to the walkway, even though nothing is on top of them to cause any crush damage.

    This is why you should not really go over 200 frames/second.

    Why your server won't run faster than 60 FPS

    This mainly applies to Windows servers...

    The timeslots that HLDS can provide are dependent on the timer resolution of the operating system it runs on. Windows has a default timer resolution of 60 ticks/second. There are plenty of ways to change the timer resolution though.

    I've not tested or seen evidence of this, but I've heard that the default timer resolution is the same as your monitor refresh rate. Needs confirmation.

    The oldest trick is to load up Windows Media Player, open a video inside it, but don't play it. Leave the video stopped, and minimize the player. This will give you a maximum timer resolution of 1000, also allowing a tick rate of 1000. However Windows Media Player can take up a fair chunk of your server memory.

    Secondly there are various plug-ins for Metamod (known as "ping boosters") that can also adjust the timer resolution of the operating system. These however have been somewhat unstable compared to other solutions, and take up more memory than the simplest solution available.

    The simplest solution is to download a small tool called "srcdsfpsboost". I've put up a copy here. It's a tiny EXE file that you simply run, then leave in the background. If you would like to run it as a service so you never have to know it's there, here's how:
    1. Extract "srcdsfpsboost.exe" to somewhere like "C:\"
    2. Open up the Command Prompt
    3. Run this command: sc create srcdsfpsboost binPath= "C:\srcdsfpsboost.exe" start= auto DisplayName= "SRCDS FPS Booster"

    You may also need to start the service after you create it with the command "sc start srcdsfpsboost".

    Why your server won't run faster than 500 FPS

    Again, this section only applies to Windows servers...

    This is a limitation in both Windows and your hardware. You can get up to 1000 FPS by running your server on an Intel processor on a motherboard with certain Intel chipsets. Otherwise, be happy with 500 FPS.

    Also as I explained earlier, it's not a good idea to exceed 200 FPS anyway.
    Attached Images
    Last edited by AdamR; 11-02-2012 at 11:19 PM.
    Adam "Adambean" Reece
    Sven Co-op team

    Also on: Steam | Facebook | Twitter | YouTube | Twitch
    Released AMXX plug-ins: Bind number slots | NextMap with Sven Co-op fix | Sven Co-op administrator icons
