Actually it does, but it's a bit more complicated to predict how much network traffic a single player really consumes.
First of all, the bandwidth used is not a static value: it depends on several factors and ends up somewhere between the hard limits set by your server's configuration.
One of these factors is the number of players currently connected to the server. In a nutshell: the more players there are, the more data each individual player has to receive, because that data describes the movements of all the other players. At first glance this sounds as if the required bandwidth exploded with the number of connected clients (roughly quadratically, since every client gets updates about every other client). But luckily it is (by far) not that bad in practice, because the netcode is heavily optimized.
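To make that scaling argument concrete, here's a rough back-of-the-envelope sketch of the naive model (no delta compression, no visibility culling). The tickrate and per-update byte counts are made-up illustrative numbers, not values from any real netcode:

```python
# Illustrative estimate of traffic in a naive netcode where every client
# receives a full update about every other player on every tick.
# The byte counts are invented for illustration; real engines send far less.

def naive_client_download_bps(players, tickrate=30, bytes_per_update=40):
    """Each client receives one update per tick about every *other* player."""
    return (players - 1) * bytes_per_update * tickrate

def naive_server_upload_bps(players, tickrate=30, bytes_per_update=40):
    """The server sends that stream to every client, so total upload
    grows roughly with players * (players - 1), i.e. quadratically."""
    return players * naive_client_download_bps(players, tickrate, bytes_per_update)

if __name__ == "__main__":
    for n in (4, 8, 16, 32):
        up = naive_server_upload_bps(n)
        print(f"{n:2d} players: ~{up * 8 / 1000:.0f} kbit/s server upload (naive model)")
```

Doubling the player count roughly quadruples the naive total, which is exactly the growth the optimized netcode is there to avoid.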
Another factor is, for example, the rate at which players fire their weapons. If there's a lot of shooting going on, more bandwidth is required. And it's still really hard to predict how much the gunfire actually affects your line, because... well, I could go on and on, but to keep it short I'll skip straight to the conclusion.
If you want a reliable figure for the bandwidth your server is actually eating, just measure it yourself for a couple of days and draw your own conclusions.
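If your host doesn't give you a traffic graph, a quick way to measure is to sample the interface counters yourself. A minimal sketch for a Linux box, assuming "eth0" is the interface your server uses and that the game server is responsible for most of the traffic on it:

```python
# Minimal traffic sampler for Linux: reads the TX byte counter for one
# interface from /proc/net/dev and prints the average upload rate.
import time

def tx_bytes(interface="eth0"):
    with open("/proc/net/dev") as f:
        for line in f:
            if line.strip().startswith(interface + ":"):
                fields = line.split(":", 1)[1].split()
                return int(fields[8])  # 9th field = transmitted bytes
    raise ValueError(f"interface {interface!r} not found")

if __name__ == "__main__":
    interval = 60  # seconds between samples
    previous = tx_bytes()
    while True:
        time.sleep(interval)
        current = tx_bytes()
        rate = (current - previous) / interval
        print(f"average upload over last {interval}s: {rate * 8 / 1000:.1f} kbit/s")
        previous = current
```

Let something like this run over a few busy evenings and you'll have a far better number than any theoretical estimate.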
If you just want to be on the safe side, you can use the server's maxrate setting for clients to estimate the maximum bandwidth your server would need in the worst case (even though that value will most likely be far higher than the actual traffic).
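The worst-case arithmetic is simply maxrate times the number of client slots. A sketch, assuming the maxrate is given in bytes per second (as in Source/GoldSrc-style servers); check which unit your engine actually uses:

```python
# Worst-case upstream estimate: every client slot filled and every client
# pulling the full configured rate. Assumes maxrate is in bytes/second;
# adjust the conversion if your engine uses a different unit.

def worst_case_upload_mbit(maxrate_bytes_per_s, max_players):
    total_bytes_per_s = maxrate_bytes_per_s * max_players
    return total_bytes_per_s * 8 / 1_000_000

if __name__ == "__main__":
    # example: maxrate 25000 with 20 slots -> about 4 Mbit/s upstream
    print(f"~{worst_case_upload_mbit(25000, 20):.1f} Mbit/s worst-case upload")
```

If your line can handle that ceiling, the real traffic will fit with plenty of headroom.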
P.S.: Gunfire also has a significant impact on your server's appetite for CPU cycles, but that's kind of off-topic here.