Say there are 2 sites with 2 remote desktop users each, running at 2Mbps, and 1 site with 4 users at 4Mbps. So we have 8 users and a 'total' bandwidth of 8Mbps.
So, if there's one hosted server connected to the Internet at 100Mbps with 300ms latency to the client sites, and one at 2Mbps but with 50ms latency, which setup would give the best performance for the users?
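A back-of-envelope sketch of why latency dominates here: each user interaction costs at least a round trip plus the transfer time for the screen update. The 50KB payload size is an assumed figure for illustration, not a measured RDP value.

```python
# Rough model: time for one RDP screen update = round trip + serialisation delay.
# payload_kb is an assumed per-update size, purely for illustration.

def response_time_ms(latency_ms, link_mbps, payload_kb=50):
    """Approximate time for one screen update in milliseconds."""
    transfer_ms = (payload_kb * 8) / (link_mbps * 1000) * 1000
    return 2 * latency_ms + transfer_ms

fast_fat_pipe = response_time_ms(300, 100)  # 100Mbps link, 300ms latency
slow_thin_pipe = response_time_ms(50, 2)    # 2Mbps link, 50ms latency

print(f"100Mbps / 300ms: ~{fast_fat_pipe:.0f}ms per update")
print(f"  2Mbps /  50ms: ~{slow_thin_pipe:.0f}ms per update")
```

Under these assumptions the 2Mbps/50ms link comes out roughly twice as responsive, because the round trip swamps the transfer time for small interactive updates.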
With RDP sessions latency is really going to piss you off, so the 50ms site would perform much, much better.
The chances of the connections ending up like that are pretty slim, though, unless one of them is on a modem/satlink.
more pipes
It depends a lot on the RDP configuration (which can significantly affect the data usage of each session) and other factors like packet loss, but with that low a user count I'd say the 50ms link would give the best performance in most situations.
[i]The chances of the connections ending up like that are pretty slim, though, unless one of them is on a modem/satlink.[/i]
That's what I thought. Apparently, for the second solution, bandwidth is about £1000/Mbps per year. Now I'm being asked what the minimum is they can get away with...
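For a rough minimum, you could work from a per-session figure. The 150kbps per user below is an assumed rule-of-thumb for light office work, not a measurement - actual usage depends heavily on the RDP settings discussed above.

```python
# Hedged estimate of the minimum link size and its annual cost.
# kbps_per_user is an assumption; tune it after measuring real sessions.

COST_PER_MBPS_PER_YEAR = 1000  # GBP, from the quote in the thread

def min_link_mbps(users, kbps_per_user=150):
    """Minimum aggregate bandwidth in Mbps for the given user count."""
    return users * kbps_per_user / 1000

users = 8
link = min_link_mbps(users)
cost = link * COST_PER_MBPS_PER_YEAR
print(f"{users} users -> ~{link:.1f}Mbps minimum, ~£{cost:.0f}/year")
```

On those assumptions 8 users need only about 1.2Mbps, which is why the latency, not the pipe size, is the thing to shop for.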
These are VPNs though, aren't they?
You're not going to tell me they're connecting to RDP sessions over the internet are you?
I'm now wondering if it'd be cheaper for them to have multiple hosted RDP servers, each running at the 'included' 1Mbps, than to boost the available speed on one of them.
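A quick way to compare the two options once you have real numbers. The per-server hosting cost here is a made-up placeholder (the thread never states it); only the £1000/Mbps figure comes from the posts above.

```python
import math

# SERVER_COST_PER_YEAR is hypothetical - substitute the real hosting quote.
SERVER_COST_PER_YEAR = 600   # GBP per hosted server per year (assumed)
EXTRA_MBPS_COST = 1000       # GBP per Mbps per year, from the thread

def multi_server_cost(needed_mbps):
    """N servers, each with the 'included' 1Mbps."""
    servers = math.ceil(needed_mbps)
    return servers * SERVER_COST_PER_YEAR

def single_server_cost(needed_mbps):
    """One server with the first 1Mbps included, extra bought per Mbps."""
    extra = max(0, needed_mbps - 1)
    return SERVER_COST_PER_YEAR + extra * EXTRA_MBPS_COST

for mbps in (2, 4, 8):
    print(f"{mbps}Mbps: multi=£{multi_server_cost(mbps)}, "
          f"single=£{single_server_cost(mbps)}")
```

With these placeholder numbers the multi-server route wins, but the break-even point shifts entirely with the real per-server price, and splitting users across servers adds its own admin overhead.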
Latency is important for some applications, less so for others. RDP is one of those where it is important 🙂
As above, you can tune RDP for lower- or higher-bandwidth connections.
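For anyone wanting specifics, the usual knobs live in the client's `.rdp` file (or the matching Group Policy settings). A sketch of a low-bandwidth profile, using standard `.rdp` fields:

```
session bpp:i:16
compression:i:1
bitmapcachepersistenable:i:1
disable wallpaper:i:1
disable full window drag:i:1
disable menu anims:i:1
disable themes:i:1
audiomode:i:2
```

That drops colour depth to 16-bit, enables compression and persistent bitmap caching, strips the visual frills, and disables audio redirection, which together cut per-session bandwidth considerably.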
Samuri (sorry for the delay, been to the dentist!) - can you use Windows Server 2012 Essentials to set up a VPN that users then log in to RDP through over the internet?
