ADVANCED: Details about what the stat net command shows
Some interesting info on stat net. This was copied from the NUB clan website (Nasty Unreal Butchas). They posted a UT info page, but this part applies regardless of whether you're using UT or Unreal 1. I found it informative; some of it may apply to your situation, some may not. As with anything, the right server tickrate and maxclientrate settings depend on your own PC's CPU, memory, etc., and on the bandwidth you have from your provider.
Stat net offers more information than just your ping and pl. Let's see what we have there - from top to bottom:
|Ping / PL||Should be self-explanatory.|
|Channels||the number of actors on the server that are currently relevant to you.|
All of the following are given for both directions - IN means from server to you, OUT means from you to server.
|Unordered/sec||number of packets that arrived in a different order than they were sent. If this is not constantly 0, something about your connection is seriously fucked up.|
|packet loss||percentage of packets that didn't reach the target. Should be 0 all the time; if it isn't, this is almost always caused by a connection problem.|
|packets/sec||number of packets received/sent per second. See below for more info.|
|bunches/sec||number of actor updates received/sent per second.|
|bytes/sec/netspeed||number of bytes received/sent per second, alongside your current netspeed setting.|
From a pure player perspective, the only interesting value here (other than ping and pl, of course) is bytes/sec. It will always be capped by your netspeed (with very few rare exceptions). On the OUT side it should actually NEVER exceed netspeed - that is what the frame cap is for. If it reaches your netspeed on the IN side, you might start missing information. If that happens only rarely, in extreme situations, it's no reason to worry; however, if it's a permanent condition, you may miss important data the server can't send you because your bandwidth is saturated. This usually happens due to unreasonably high tickrates on the server side.
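The saturation check described above can be sketched in a few lines. This is purely illustrative - UT doesn't expose these values programmatically, so the function name and the 95% margin are assumptions; you'd read bytes/sec off the stat net display yourself.

```python
# Hypothetical sketch: is the IN side of the connection saturated?
# Inputs correspond to the bytes/sec value shown by stat net and the
# client's configured netspeed. The 0.95 margin is an assumed threshold.

def is_saturated(bytes_per_sec: float, netspeed: int, margin: float = 0.95) -> bool:
    """True when observed traffic sits at (or above) the netspeed cap."""
    return bytes_per_sec >= netspeed * margin

# A client with netspeed 5000 receiving 4900 bytes/sec is effectively capped:
print(is_saturated(4900, 5000))  # → True
print(is_saturated(3000, 5000))  # → False
```

If the IN value sits at the cap permanently, lowering the server tickrate (or raising the client's netspeed, if the line can take it) is the usual fix.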
Tickrate is essentially the "fps" the server runs at, and it is the most important variable a server admin can change. There is a tradeoff here - a higher tickrate means more CPU load and more server->client traffic, but also better pings (both displayed and effective) and generally a more precise simulation on the server.
The default tickrate (20 for internet servers) is what causes the horrible ping in the stat net display - when the server runs at 20 ticks a second, it can take up to 1/20th of a second before it can acknowledge a ping from a client - that means 50 ms! This also affects gameplay: on a server running at tickrate 20, player commands can wait up to 50 ms in the worst case before they actually affect the game. Aiming becomes less accurate, too - just think of playing at 20 fps. If you move your mouse fast, there are large "jumps" in your view rotation at such low rates, and that is exactly what the server does with your aiming when it runs at such a low tickrate, causing "gaps" in your aiming movement. This is also the cause of the seemingly increasing damage of the continuously firing weapons (pulse secondary and minigun) - aiming simply gets more precise, so more of the individual shots hit the target.
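The arithmetic above is just the tick interval. A tiny sketch of it, so you can plug in other tickrates:

```python
# Worst-case extra delay a server adds purely from its tickrate: it can
# only process and acknowledge client input once per tick, so the delay
# is at most one full tick interval.

def worst_case_tick_delay_ms(tickrate: int) -> float:
    """Tick interval in milliseconds (the worst-case added delay)."""
    return 1000.0 / tickrate

print(worst_case_tick_delay_ms(20))   # → 50.0 (the default internet tickrate)
print(worst_case_tick_delay_ms(100))  # → 10.0
```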
In the past, server admins seem to have discovered this on a large scale. However, some of them went too far when fixing that misconfigured default - you can find servers out there running at tickrate 100. While that is fine for people with a lot of bandwidth (as long as the server's CPU and network connection can sustain it), it is NOT fine for the average ISDN player.
Server->client traffic increases at nearly the same rate as the tickrate, and a tickrate of 100 generates too much traffic for a 64 kbit ISDN line. The result is either packet loss (if maxclientrate on the server is high enough and the client's netspeed is set higher than the line can actually handle in order to increase fps - see above) or the client missing game information, ranging from decals to really important stuff like player movement.
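A rough back-of-the-envelope version of that bandwidth argument, assuming (as the text says) that server->client traffic grows nearly linearly with tickrate. The per-tickrate-unit cost here is an invented illustrative figure, not a measured one:

```python
# Linear traffic model (assumption): bytes/sec ~ tickrate * per-unit cost.
# A 64 kbit/s ISDN line carries about 8000 bytes/s.

ISDN_BYTES_PER_SEC = 64_000 / 8  # 8000.0 bytes/s

def estimated_traffic(tickrate: int, bytes_per_tickrate_unit: float = 100.0) -> float:
    """Estimated server->client bytes/sec under the assumed linear model."""
    return tickrate * bytes_per_tickrate_unit

for tr in (20, 100):
    t = estimated_traffic(tr)
    verdict = "fits ISDN" if t <= ISDN_BYTES_PER_SEC else "exceeds ISDN"
    print(f"tickrate {tr}: ~{t:.0f} bytes/s - {verdict}")
```

Under this model, tickrate 20 stays comfortably inside the line while tickrate 100 blows past it - which is exactly the pl-or-missing-data situation described above.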
On a sidenote - the reason for the "F1 ping" being higher than the stat net ping is simple: most clients run at more fps than the server. That means the server gets an acknowledgement to a ping request faster than the client gets one from the server - the client only has to wait up to 1/fps seconds (plus the actual network round trip, of course). For some strange reason (probably an attempt to filter out these dependencies) the client subtracts half its frame time (1/fps) from its own ping - that is why you will always see a lower ping on your own machine than others will see for you.
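The subtraction described above can be written as a small formula. The function name is made up for illustration; the relation (displayed ping = measured round trip minus half the client's frame time) is what the text describes:

```python
# Client-side ping display per the description above: the client shaves
# half of its own frame time (1/fps, in ms) off the measured round trip.

def displayed_ping_ms(measured_rtt_ms: float, client_fps: float) -> float:
    """Ping as shown on the client's own machine."""
    half_frame_ms = (1000.0 / client_fps) / 2.0
    return measured_rtt_ms - half_frame_ms

# At 60 fps the client shaves ~8.3 ms off what others would see for it:
print(displayed_ping_ms(100.0, 60.0))  # ≈ 91.7
```

The higher your fps, the smaller the difference - which fits the observation that the gap between F1 ping and stat net ping shrinks on fast machines.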
There is one thing left to mention about packets/sec - if your fps drops below this value, you will most probably experience lag; this is the situation called "invisible packet loss", because it feels exactly like pl but no pl is shown. I have yet to see the phenomenon myself, but it seems UT has problems handling two waiting packets in one frame. No solution is known - I'd suggest tweaking your UT for more fps. If you're willing to give up the looks, there are many ways to improve your fps; try one of the many UT tweaking guides out there, or buy a faster machine.
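As a quick sanity check for that condition, you can compare the two numbers yourself - your framerate against the IN packets/sec from stat net. Again, a hypothetical helper, not anything built into the game:

```python
# "Invisible packet loss" risk check (per the description above): trouble
# is likely when the client renders fewer frames per second than the
# server sends packets, leaving more than one packet waiting per frame.

def invisible_pl_risk(client_fps: float, packets_per_sec: float) -> bool:
    """True when fps can't keep up with the incoming packet rate."""
    return client_fps < packets_per_sec

print(invisible_pl_risk(30, 40))  # → True: 30 fps vs 40 packets/sec
print(invisible_pl_risk(90, 40))  # → False
```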