I try to keep my cellphone on “vibrate” during meetings; no one needs to know my taste in ringtones. But more importantly, I keep it on so that I can use its browser and web-search capability, which I wind up doing almost as often as I check my email.
Yesterday was a good example of the utility of ever-handy web search. My team and I had brought an international group of government officials to Microsoft Research’s annual TechFest. Tuesday was the “Public Day,” open to the media and outside visitors, so we spent that day at the Microsoft Convention Center with Craig Mundie, Rick Rashid, and a hundred others. On Wednesday morning we offered the group a series of side briefings at Microsoft’s Executive Briefing Center, on topics of interest to large government and defense agencies. Topic one: data centers.
The briefing provided an overview of Microsoft’s data center management, discussing, for example, the scale of our planned Chicago data center. It’s massive, nearly 500,000 square feet, and yet the briefer noted that, like our other large new centers, it is being designed to run optimally with few on-site personnel. Naturally enough, the folks around the table wanted to know exactly how few, but the briefer couldn’t recall the number.
But I remembered that when news of the center came out last fall, I had read at least one account that cited the personnel figure. A quick Live Search, filtered to “News” sources, found the right article, so within the same conversational thread I was able to point out that “published reports” said only 30 employees would be required to run it. They were astounded by the skeletal staffing. Ask yourself how many people a government-staffed facility of that size would take; as a taxpayer, ask yourself whether we can’t do better.
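For a sense of scale, here is a quick back-of-the-envelope calculation using the two figures cited above (nearly 500,000 square feet, about 30 on-site employees):

```python
# Back-of-the-envelope staffing density for the Chicago data center,
# using the figures cited in the published reports above.
FLOOR_AREA_SQFT = 500_000  # approximate floor area
ONSITE_STAFF = 30          # on-site employees, per published reports

sqft_per_employee = FLOOR_AREA_SQFT / ONSITE_STAFF
print(f"{sqft_per_employee:,.0f} sq ft per on-site employee")
# Roughly 16,667 square feet of data center floor per person.
```

That works out to each employee tending a floor area larger than several average houses combined, which is what so astonished the group.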
That sparked another round of discussion about “best practices” in data center management. Still thumbing my mobile, I noticed that a more recent story among the News results seemed immediately relevant: “Microsoft Customers to Get Data-Center Saving Tips,” a Computerworld story posted just yesterday covering the CeBIT expo in Europe. In his keynote, Steve Ballmer promised to make a compendium of the company’s best practices on that very topic available for free download in the near future.
That compendium should be forthcoming, but in the meantime here is another excellent read: a white paper by Microsoft Research’s James Hamilton on “An Architecture for Modular Data Centers,” which is guiding some interesting thinking inside Microsoft. Interesting how? Well, in my old job we considered buying and deploying a “data center in a box” sold by Sun: just hook up power and water and it’s good to go. Sun even parked one of the big, black containers outside the DIA headquarters at Bolling Air Force Base for a day, and we had engineers touring through. Hamilton’s paper goes further, though, and “introduces a data center architecture based upon macro-modules of standard shipping containers that optimizes how server systems are acquired, administered, and later recycled.” Think of a Lego-block system and you’ll start to get some ideas…
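The Lego-block idea can be made concrete with a toy model. This sketch is my own illustration, not Hamilton’s actual design, and the per-container figures are made up for the example: each shipping container is a sealed, pre-populated macro-module, and the facility grows by snapping on more of them rather than by servicing individual servers.

```python
# Toy model of a container-based modular data center (illustrative only;
# names and per-container figures are assumptions, not from Hamilton's paper).
from dataclasses import dataclass, field


@dataclass
class ContainerModule:
    """One shipping-container macro-module, deployed and recycled as a unit."""
    servers: int       # servers pre-racked inside the container
    kw_power: float    # power draw of the container, in kilowatts


@dataclass
class ModularDataCenter:
    modules: list = field(default_factory=list)

    def add_module(self, module: ContainerModule) -> None:
        """Grow capacity by snapping on another container -- the Lego-block step."""
        self.modules.append(module)

    @property
    def total_servers(self) -> int:
        return sum(m.servers for m in self.modules)

    @property
    def total_kw(self) -> float:
        return sum(m.kw_power for m in self.modules)


# Start with two containers, then scale out by adding a third.
dc = ModularDataCenter([ContainerModule(1152, 250.0), ContainerModule(1152, 250.0)])
dc.add_module(ContainerModule(1152, 250.0))
print(dc.total_servers, dc.total_kw)  # 3456 750.0
```

The point of the model is the unit of administration: capacity planning, deployment, and retirement all happen at the container level, which is what lets a facility this size run with so few on-site staff.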
Meanwhile, back on the TechFest floor, I saw a live demo of the latest Microsoft Research work on tools for operationally optimal management of massive global data centers. While that work is still under wraps, here’s a suggestive paper the group published within the past year on their approach and its already-impressive returns.
Meanwhile, I’m hoping that vibration in my pocket’s just another email alert…