By Benjamin Hodgson, LucidView
Online gaming is arguably one of the most enjoyable and immersive experiences available. Truly, there is no better way to pass the time than to sign in, join the rest of the squad in a raid, defend your Ancient or even try to ace that chicken dinner. While there are many online options available to suit all types of gamers, we are all subject to the same avenue of frustration – ping.
Anyone who regularly frequents the online realm has undoubtedly experienced a fluctuation in their latency, resulting in misplaced item drops, headshots that definitely should have landed, or in the worst case, a series of screenshots detailing your last few living moments. Sometimes the problem is evident, like someone else trying to watch that extremely important YouTube video, or even trying to FaceTime the friend they saw three hours ago. Other times, inexplicable packet loss and slow ping times to the server in your own country will incessantly plague you, even after you’ve reset the router for the second time.
As so many gamers are aware, there is a plethora of factors that can affect your online experience. While some of them are out of your control, a thorough understanding of these variables can greatly help you build the best possible setup and give yourself the best chance of a winning experience.
The two concepts most often discussed with regard to an effective online presence are bandwidth and latency. Latency – also called ping or delay – is without doubt the defining value of a good online experience.
Most online multiplayer games are real-time experiences and require information to be transferred to and from the server as quickly as possible to create a fluid, enjoyable experience.
If latency is the defining factor, why is it so often recommended that we "increase the bandwidth" to solve whatever online issues we experience? In some cases this may alleviate obvious problems, particularly on a network with many users. In other cases, however, increasing your bandwidth may generate problems further upstream that weren't present before.
More bandwidth does not necessarily mean more speed. More bandwidth increases the capacity of the connection, which means that more content can be delivered at once. Using the analogy of a busy highway, bandwidth is like the number of lanes the highway has. All vehicles on this highway travel at the same speed limit. More bandwidth means more lanes, which means that more vehicles can reach their destination at the same time. The vehicles are all still travelling at the same speed, so technically, transport is not "faster", just "bigger".
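The highway analogy can be put into rough numbers. Total delivery time for a packet is the propagation delay (latency) plus the serialisation time (size divided by bandwidth). The figures below are purely illustrative, but they show why, for the tiny packets games send, extra bandwidth barely moves the needle:

```python
def transfer_time_ms(packet_bytes, bandwidth_mbps, latency_ms):
    """Total one-way delivery time: propagation delay plus serialisation time."""
    serialisation_ms = (packet_bytes * 8) / (bandwidth_mbps * 1_000_000) * 1000
    return latency_ms + serialisation_ms

# A game-state packet is tiny (assume ~100 bytes) on a 30 ms route:
print(transfer_time_ms(100, 10, 30))   # 10 Mbps link
print(transfer_time_ms(100, 100, 30))  # 100 Mbps link: almost identical
```

Ten times the bandwidth shaves off a fraction of a millisecond; the 30 ms of latency dominates either way.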
At this point it is important to remember that Internet Service Providers only have a limited amount of bandwidth supplied to them by their Tier 1 supplier. This introduces a new factor into your internet connection – contention ratios.
The contention ratio describes the number of users that share the same bandwidth link. Remember the highway from before? It has just been made public. Even though you may have access to a 100 Mbps link, there are now 19 other users sharing that same link, resulting in a contention ratio of 20:1. More vehicles on this shared highway mean fewer lanes for your vehicles, and therefore slower throughput for your data. This type of delay introduces inconsistent ping times on your connection.
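The arithmetic behind that 20:1 example is simple: at full contention, every subscriber gets an equal share of the link. A quick sketch:

```python
def worst_case_rate_mbps(link_mbps, contention_ratio):
    """At full contention every subscriber gets an equal share of the link."""
    return link_mbps / contention_ratio

# The 100 Mbps link at 20:1 from the example above:
print(worst_case_rate_mbps(100, 20))  # 5.0 Mbps when all 20 users are active
```

In practice not everyone is active at once, so your real-world share floats somewhere between 5 and 100 Mbps – which is exactly why contended links feel inconsistent.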
A general rule of thumb for subscriber or contention ratios is that the higher the bandwidth, the higher the ratio of sharing. This makes sense, as ISPs are always trying to maximise the usage of their own connections. Essentially, the more you take, the more you will have to share. Contention ratios should not be confused with shaping or throttling techniques, which are different concepts altogether.
Latency is defined as the time taken (in milliseconds) for a data packet to travel along a route, to its destination. Unfortunately, this value is mostly affected by factors such as the physical medium of transport, the distance between the two points, and in some cases, even the weather.
LTE, 5G and other wireless transport media are often alluring for their massive bandwidth capabilities, but they can be extremely vulnerable to the whims of Mother Nature. While the weather is not usually a problem, wireless connections are on average not as consistent as a wired connection in terms of latency.
Introduced latency on a network is often an insurmountable issue: after all, other people have the same right to access the internet and use it for their own purposes.
Often, bandwidth-intensive applications are what adversely impact time-sensitive applications. Managing the bandwidth available on a local network can ensure that latency-sensitive applications always achieve their throughput as quickly as physically possible. Dividing the available bandwidth into designated lanes for each connected user ensures that each user's traffic is delivered within the confines of their own lane – a great benefit to any time-sensitive application. Never has this task been easier to achieve than with the LucidView Enforcer and its FairShare™ technology. One simply specifies the maximum value to be shared amongst each detected user (remember, less is best), and the rest is handled automatically. It is extremely easy to configure and can be turned on and off at any time.
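The idea of per-user lanes can be sketched as a simple equal split of the uplink. To be clear, this is only an illustration of the concept, not how the Enforcer's FairShare™ is actually implemented; the device names are made up:

```python
def fair_share(total_mbps, users):
    """Give every detected user an equal, isolated slice of the uplink."""
    per_user = total_mbps / len(users)
    return {user: per_user for user in users}

# Hypothetical household with a 50 Mbps uplink and four devices:
print(fair_share(50, ["desktop", "console", "phone", "smart-tv"]))
```

Each device here is capped at 12.5 Mbps, so one heavy stream can never starve the others – and the gaming console's lane stays clear regardless of what the smart TV is doing.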
In summary, bigger bandwidth does not guarantee better latency for time-sensitive applications such as gaming. Smaller bandwidth may mean that streaming or downloading is adversely impacted, while ping times have never looked so good. But, as in everything else, it is all about balance, and accepting the trade-offs for the benefits. The LucidView Enforcer, with its FairShare™, has made managing these trade-offs uniquely simple.