A cloud data center is the brains of the Internet. The engine of the Internet. It is a giant building with a lot of power, a lot of cooling and a lot of computing machines.
It’s row upon row upon row of machines, all working together to provide the services that make Google function. We love building and operating data centers.
We’re responsible for managing the teams globally that design, build and operate Google’s data centers. We’re also responsible for the environmental health and safety, sustainability and carbon offsets for our data centers.
This data center, here in South Carolina, is one node in a larger network of data centers all over the world. Of all the employees at Google, only a very small percentage are authorized to even enter a data center campus.
The men and women who run these data centers and keep them up 24 hours a day, seven days a week, are incredibly passionate about what they’re doing.
In layman’s terms, what do I do here? I typically refer to myself as the herder of cats. I’m an engineer. Hardware site operations manager.
We keep the lights on. And we enjoy doing it. And they work very hard, so we like to provide them with a fun environment where they can play hard as well. We just passed the three-million-man-hour mark for zero lost-time incidents.
Three million man-hours is a really long time, and with the number of people we have on site, that is an amazing accomplishment. I think that the Google data centers really can offer a level of security that almost no other company can match.
We have an information security team that is truly second to none. You have the expression,
“they wrote the book on that.” Well, there are many of our information security team members who really have written the books on best practices in information security.
Protecting the security and the privacy of our users’ information is our foremost design criterion.
We use layers of progressively higher security the closer you get to the center of the campus.
So, just to enter this campus, my badge had to be on a pre-authorized access list. Then, to come into the building, that was another level of security. To get into the secure corridor that leads to the data center, that’s a higher level of security.
And the data center and the networking rooms have the highest level of security. And the technologies that we use are different. Like, for instance, in our highest-level areas, we even use underfloor intrusion detection via laser beams. So, I’m going to demonstrate going into the secure corridor now.
One, my badge has to be on the authorized list. And then two, I use a biometric iris scanner to verify that it truly is me. OK, here we are on the data center floor. The first thing that I notice is that it’s a little warm in here. It’s about 80 degrees Fahrenheit.
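The tiered access model described above, where each zone requires everything the outer zones required plus its own additional checks, can be sketched as a simple policy lookup. All the zone and check names below are illustrative assumptions, not Google’s actual systems:

```python
# Illustrative sketch of layered access control (hypothetical names,
# not Google's real systems). Each inner zone requires every check of
# the zones outside it, plus its own.
ZONE_CHECKS = {
    "campus": ["badge_on_access_list"],
    "building": ["badge_on_access_list", "badge_scan"],
    "secure_corridor": ["badge_on_access_list", "badge_scan", "iris_scan"],
    "data_center_floor": ["badge_on_access_list", "badge_scan", "iris_scan",
                          "underfloor_intrusion_clear"],
}

def may_enter(zone: str, passed_checks: set[str]) -> bool:
    """Grant entry only if every check required for the zone has passed."""
    return all(check in passed_checks for check in ZONE_CHECKS[zone])

# A badge on the access list gets you onto the campus,
# but not into the secure corridor.
print(may_enter("campus", {"badge_on_access_list"}))           # True
print(may_enter("secure_corridor", {"badge_on_access_list"}))  # False
```

The point of the structure is that inner zones never relax an outer zone’s requirement; they only add to it.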
Google runs our data centers warmer than most because it helps with efficiency. You’ll notice that we have overhead power distribution. Coming from the yard outside, we bring in the high-voltage power and distribute it across the bus bars to the customized bus taps, which are basically plugs where we plug in all the extension cords.
Our racks don’t look like traditional server racks. These are custom designed and built for Google so that we can optimize the servers for hyper-efficiency and high-performance computing.
It’s true that sometimes drives fail, and we have to replace them or upgrade them, because maybe they’re no longer efficient to run.
We have a very thorough end-to-end chain-of-custody process for managing those drives from the time that they’re checked out from the server until they’re brought to an ultra-secure cage, where they’re erased, and crushed if necessary.
So any drive that can’t be verified as 100% clean, we crush it first and then we take it to an
industrial wood chipper, where it’s shredded into these little pieces like this.
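The disposal flow just described, check-out, chain-of-custody logging, erasure, and then physical destruction for any drive that can’t be verified clean, can be sketched as a simple decision sequence. The step names here are a hypothetical illustration, not Google’s actual process names:

```python
# Hypothetical sketch of the drive-disposal flow described above:
# every drive is logged and erased; any drive that cannot be verified
# 100% clean is crushed and then shredded.
def disposal_steps(verified_clean: bool) -> list[str]:
    steps = ["check_out_from_server", "log_chain_of_custody", "erase"]
    if verified_clean:
        steps.append("release_for_reuse")
    else:
        steps += ["crush", "shred"]
    return steps

print(disposal_steps(False))
# ['check_out_from_server', 'log_chain_of_custody', 'erase', 'crush', 'shred']
```

The key property is that physical destruction is the fallback whenever verification fails, never skipped on a best guess.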
In the time that I’ve been at Google – for almost six and a half years now – we have changed our cooling technologies at least five times.
Most data centers have air-conditioning units along the perimeter walls that force cold air under the floor.
It then rises up in front of the servers and cools the servers. With our solution, we take the server racks and we butt them right up against our air-conditioning unit.
We just use cool water flowing through those copper coils that you see there. So the hot air from the servers is contained in that hot aisle.
It rises up, passes across those coils, where the heat from the air transfers to the water in those coils, and then that warm water is brought outside the data center to our cooling plant, where it is cooled down through our cooling towers and returned to the data center.
And that process is just repeated over and over again. To me, the thing that amazes me about Google and its data centers is the pace of innovation, and how we’re always challenging the way we do things.
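The water loop described above can be roughly quantified with a standard energy balance: the heat a coil carries away is the water flow rate times water’s specific heat times its temperature rise. The flow and temperature figures below are illustrative assumptions, not measurements from this facility:

```python
# Rough energy-balance sketch of a water cooling loop
# (illustrative numbers, not actual figures for this facility).
C_P_WATER = 4186.0  # specific heat of water, J/(kg*K)

def heat_removed_kw(flow_kg_per_s: float, t_in_c: float, t_out_c: float) -> float:
    """Heat carried away by water warming from t_in to t_out, in kW."""
    return flow_kg_per_s * C_P_WATER * (t_out_c - t_in_c) / 1000.0

# e.g. 10 kg/s of water warming from 20 C to 30 C removes ~419 kW of heat,
# which the cooling towers must then reject outdoors.
print(round(heat_removed_kw(10.0, 20.0, 30.0)))  # 419
```

Running the racks warmer, as mentioned earlier, widens the usable temperature rise and lets the towers reject heat with less mechanical chilling.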