Behind the scenes of WoW's bandwidth
We heard a little while back that it's AT&T that provides data center hosting to Blizzard and this gigantic game (and in fact, we've had outage problems before due to maintenance on AT&T's end), but our friend Tamara Chuang of the Orange County Register went straight to the source and spoke with the big bandwidth provider about just what it takes to keep the servers up. There's some good information in there, especially if you're interested in all of the motherboards and wires that run the World of Warcraft. MMOs are apparently AT&T's biggest gaming customers, and they run the wires for companies like Blizzard as well as Konami and Turbine. They originally helped run Battle.net, and when Blizzard wanted to expand with World of Warcraft, AT&T's gaming division expanded with them.

Unfortunately, there are a lot of secrets here -- given that they're selling a service, AT&T doesn't speak too frankly about how much downtime they're really responsible for, and of course, as a trade secret, they can't give any numbers on how much bandwidth is passing through or where it's all going. But they will say that they've got latency levels down to milliseconds (in their testing, I'm sure -- lots of players would probably suggest it's a little worse, depending on which ISP you use), and that they offer services like Synaptic Hosting: during times of heavy usage, Blizzard can ask (for a price, of course) to open the floodgates and make sure there's enough bandwidth to go around.
Fascinating stuff. It's too bad we can't get a really unbiased view of just how big Blizzard's operation is -- right now, with so much money in the system, everyone's got a financial interest to keep the details of how it all works to themselves. But maybe someday, when the game is shut down and Mike Morhaime writes his autobiography, we can get a nice wide look at all of the technology behind keeping Azeroth running.
Filed under: Analysis / Opinion, Virtual selves, Blizzard, Economy, Hardware

Reader Comments (Page 1 of 2)
Dightkuz Mar 24th 2009 9:05AM
Come on now, Britt-Marie
Tumleren Mar 24th 2009 9:10AM
Okay, what?
rekked Mar 24th 2009 9:25AM
Exactly!
Amritrao Mar 24th 2009 9:33AM
Who is Britt-Marie?
Snakeman Mar 24th 2009 9:27AM
Quote: "...someday, when the game is shut down ..."
That day will never come. NEVAH!!!
Malvolius Mar 24th 2009 9:27AM
I would love to visit their server farms someday, or even get a job maintaining and working on them. The hum of the air conditioning units, the circulatory system of wires and cables, and the symphony of green and yellow blinking lights...it's an IT guy's dream, really.
Obnixus Mar 24th 2009 9:33AM
You and me both, dude. 7 years doing IT work for Uncle Sam, 5 of that from a combat zone, and the best I can do is a low-paying county job 'cause I went straight from HS to the sandbox and am just now starting on my degree...
...damn I miss large-scale IT operations....
Faidwen Mar 24th 2009 9:34AM
Until you experience an outage... :) Then it's an IT guy's nightmare... although, just like experiencing an in-game event for the first time, figuring out the strategy to handle a situation is 110% of the fun!!!
Erogroth Mar 24th 2009 12:35PM
I work in a decent-size data center as a hardware tech. I run lots of cables and rack and repair servers. Although my company does not do MMOs, we do another large internet industry (use your imagination). The first time I ever walked into the data center my jaw hit the floor. It was the coolest thing I have ever seen.
As far as maintenance, to be honest, once it's up and running there is very little you have to do. We schedule maintenance windows for downtime and that's about it. As for that "hum of the A/C" you mentioned, you should hear how quiet it gets when the power goes out. Of course the battery backup systems kick in right away, followed by the gas generators, which can run for nearly 2 weeks without needing refueling and can be refueled while running. The whole system is really neat. Now my company, which rents space from data center companies, is building its own data center. It's gonna be so cool. Can't wait. I would love for an MMO to host with us.
Rihahn Mar 24th 2009 4:41PM
One thing working in a bigger server room will do is really let you know how much the world stinks.
My office is in an environmentally stabilized, humidity-controlled room, electrostatically filtered down to some tiny micron level, with air purer than the Swiss Alps.
When I hit the parking lot of an evening I'm shocked at how bad Denver smells...
Barra Mar 24th 2009 9:34AM
So Darksorrow-EU is not located near the coffee machine at Blizzard? Pff (give us better rats to run!)
Pantyraider Mar 24th 2009 9:49AM
I work in a data center. And honestly, it's not really exciting at all. It's loud and cold.
Rihahn Mar 24th 2009 4:55PM
But you have to admit that the biometric hand scanners, RFID ID cards, and air locks are kinda keen. :)
Well, and having 500Mbit to my desk is kinda nice too. ;)
Disco Ball Mar 24th 2009 9:58AM
I am pretty sure they are running EMC storage arrays. Probably Symmetrix arrays to store all the data.
sadric Mar 24th 2009 10:45AM
EMC, lol, no -- they are probably using high-end SAN devices like Fujitsu. EMC is not all that. Hell, NetApp has better devices than EMC does.
Litex Mar 24th 2009 11:21AM
I'd be much more interested in the server/database/storage architecture behind it all, rather than how big of a pipe they have to the Internet. How they handle redundancy, node outages, replication, backups, upgrades...
rihahn Mar 24th 2009 4:35PM
TMS RamSan - it's what all the AAA titles use. ;)
BulletzBill Mar 24th 2009 10:03AM
I would just love to know the specifics of what they are actually doing during the weekly downtime.
rihahn Mar 24th 2009 4:37PM
Letting Oracle take a shower and get a change of clothes, and pushing DB updates to 38 datacenters then running consistency checks.
If I had to guess.
nick Mar 24th 2009 10:14AM
The funny thing about bandwidth: it's misleading to think a bigger pipe will always net better performance. The distance, or RTT, has a huge impact on throughput. So you will always have inconsistencies in performance depending on where you are in relation to the datacenter. You can optimize your traffic somewhat by increasing your default window size, which allows larger, more efficient chunks of data to be in flight at once. Decent wiki article on bandwidth-delay product (BDP) and throughput: http://en.wikipedia.org/wiki/Bandwidth-delay_product
Another link on window scaling for better performance:
Microsoft XP window scaling
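The throughput ceiling nick describes can be sketched in a few lines of arithmetic. The figures below (a 20 Mbit/s link and an 80 ms round trip) are purely hypothetical examples, chosen only to show how a classic unscaled 64 KB TCP window caps throughput no matter how big the pipe is:

```python
def bdp_bytes(bandwidth_bps: float, rtt_s: float) -> float:
    """Bandwidth-delay product: bytes that must be 'in flight' to keep the pipe full."""
    return bandwidth_bps / 8 * rtt_s  # bits/s -> bytes/s, times round-trip time

def max_throughput_bps(window_bytes: float, rtt_s: float) -> float:
    """Throughput ceiling imposed by the receive window: window / RTT."""
    return window_bytes * 8 / rtt_s

# Hypothetical link: 20 Mbit/s with an 80 ms RTT to a distant datacenter
bdp = bdp_bytes(20_000_000, 0.080)        # bytes needed in flight to fill the pipe
cap = max_throughput_bps(65_535, 0.080)   # classic unscaled 64 KB window

print(f"BDP: {bdp:,.0f} bytes")                          # → 200,000 bytes
print(f"64 KB window ceiling: {cap / 1e6:.2f} Mbit/s")   # → 6.55 Mbit/s
```

Despite the 20 Mbit pipe, the unscaled window caps the connection at roughly 6.5 Mbit/s; TCP window scaling exists precisely to lift that limit on high-BDP paths.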