Thursday 14 July 2016

Let's Have A Look Inside The Facebook Data Center.

We are running late to the Facebook Data Center. I keep checking my watch, as if the seconds might start moving backwards if I stare hard enough. This is not a particularly safe way to drive. I am, apparently, more concerned with being late than I am with possibly totaling a car. Being late is a far more heinous act.

But when we arrive everything is weirdly fine, because we are in Iowa, and people in Iowa are understanding, solid, reliable. They’ve got work to do but they’re more than happy to make the time for us. The one-hour tour we were about 20 minutes late for ends up being nearly three hours. I remain unsure if this is Midwestern nice or Facebook nice.

Of all the large corporations I'd contacted trying to get a tour of their data centers for this road trip, Facebook was the only one that was immediately responsive. Of course, it would be pretty off-brand to shy away from sharing their work with the public (Facebook, we are told during the tour, is a Sharing Company).


Still, I was nervous entering their turf. I quit using Facebook two years ago—did they know? If they didn't know, would they find out? Would they care? Would they make me join Facebook again? (No one asked, probably because no one actually cared.)

The last time I'd been on any Facebook property was the summer of 2005. It was one of the earlier offices, the one across the street from the Aquarius Theater in Palo Alto. It was small, it had weird graffiti art, and, if I recall correctly, gross carpeting. My friend who worked there showed me a taped-together tapestry of printer paper listing every "interest" of every user of the site, ranked by popularity. This was the first time I actually understood what people who ran platforms could do with them, and the first time I considered Facebook a potentially powerful tool and not a dumb MySpace for people at Ivy League schools.

This time, Sam and I are asked to hand over our IDs to a security guard after arriving at Share Way, one of the roads in Facebook's 202-acre compound that is home to two, soon to be three, data centers. I try to imagine my friend's printed list today, a stack of printer paper reaching to the moon. In the decade since that August evening, going to a Facebook building had transformed from visiting the weird office of a weird startup to entering the belly of the beast.

The beast has a model of a wind turbine in its lobby and wants to know if it can get us anything—water, coffee. No one seems that upset that we’re late. They understand we are weary travelers, coming off three consecutive ten-hour drive days. We are told by Ken Patchett, the manager of all of Facebook’s “West” region data centers (“West” being everything west of the Mississippi, which, we are told, on a technicality also includes Singapore), that Facebook’s data centers are meant to be “shared with everybody.” I remain somewhat guarded as we begin our journey through the fiber-filled heart of the Sharing Company.

Before we are taken into the server room, we receive a brief crash course in the history of human communications. We are taken to a corridor on which one wall has been lined with stock photography of communications systems over time. At its beginning there is a human handprint painted on a cave wall. At its end, there is a Facebook “like” icon. The arc of history is long, but it apparently bends toward sharing.

In between the handprint and the "like" icon, there are stock images of the printing press, televisions, blank cassette tapes. Patchett tells us stories about how expanding communications capacity has helped society. Most of his present-day stories are about improving consumer choice—i.e., if I can share with friends about the quality of some shoes I bought, this will ultimately help my friends and the shoe company. He uses a lot of second-person descriptions with these user stories, at one point saying to me, "You are a branding engine." I am a branding engine, I write in big block letters in my notebook.


To be fair, it's not actually an exaggeration to say that Facebook has been part of a major transformation in human communications, or that it has had meaningful impact. I think of a friend who reports on and in war zones telling me that she keeps in contact with sources on the ground primarily through Facebook or Facebook-owned WhatsApp—and that getting those sources to use any other means of communication is next to impossible.

The Altoona data center is Facebook's fourth major data-center project, after its initial foray into building its own data centers in Prineville, Oregon; Forest City, North Carolina; and Lulea, Sweden. Prineville is where Facebook really began pushing its hardware and infrastructure designs to improve energy efficiency. It's where, infamously, a humidity-control problem led to literal rain in the cloud. "It was not rain," Patchett corrects me when I ask about this. "We just hit dew point, which caused condensation." It's also where Facebook learned the lessons that became the foundation of the Open Compute Project, an initiative that open-sources and shares components of data-center design. (Facebook is a sharing company.)

They're also an extremely business-savvy company. As Patchett and Brice Towns, the Iowa data center's operations manager, explain the Open Compute Project to me, it becomes rapidly apparent that the open source of OCP isn't the open source I'm more familiar with, the one of earnest broke geeks with over-stickered laptops. It's the open source of Enterprise Business Solutions, of suited-up people in conference rooms brokering deals. These are the folks who actually run Facebook, not the manic pixie dream hackers who make up Facebook's mythology.

Essentially, they explain, OCP is a solution to a consumer-choice problem, if you think of Facebook as "a consumer" and not a corporation. Server configurations, network switches, generally any of the stuff that goes into making a data center have historically been limited to a few options from a few companies. These companies haven't had much market incentive to increase efficiency or offer modular solutions. By open-sourcing its own designs, Facebook argues, it improves the market and the industry as a whole.


And they've managed to get other companies behind this idea. People from places like Rackspace, Intel, and Goldman Sachs (really big banks are increasingly inclined to build their own data centers rather than rely on enterprise services) are on OCP's board, and its list of participating organizations includes companies like Cisco, HP, Bank of America, Huawei, and Samsung. Microsoft and Apple incorporate OCP hardware in their data centers, and contribute back to the project. HP and Foxconn are building and selling OCP-specified hardware.

The entrance to the server room where all of this hardware lives is behind both an ID-card reader and a fingerprint scanner. The doors open dramatically, and they close dramatically. It is only one of several server rooms at Altoona, but this one room alone seems endless. It is exactly the glimmering-LED cyberpunk server-porn dreamscape that it is supposed to be.

Towns shows me the racks, which hold machines stripped down to their most raw parts—motherboards and cables exposed, no bothering to encase the equipment. He compares the creation of Facebook's hardware to using Legos—taking apart a server, throwing out the bits that aren't important, and using the ones that are important to "build just what we need to run the application."

When I later ask for clarification I am told that "the application" means whatever Facebook application a user might be engaged with at that moment, but its totalizing ring makes me imagine a rebranding of Facebook as The Application. "I invited you to my birthday party on The Application," "I can't believe what she posted to The Application." I wonder how many years out Facebook is from that rebranding.

The space between the racks is sealed off to capture heat from the servers and funnel it back into the data center’s ventilation system. These sealed spaces have doors, and we go inside one, where we’re immediately greeted by a blast of warm air. Between racks there are strips of cardboard, apparently a once ad-hoc but now standardized solution to directing airflow appropriately through the network.

As we walk through more of the Gibsonian hallucination of a server hall, Towns explains how servers are delicate creatures—highly sensitive to dust, temperature, humidity. Part of the design process for these servers entailed essentially pushing them to their limits, learning the outer thresholds of their hardware to maximize efficiency. Data-center-operations managers love to talk about efficiency—of computing power, but especially of energy. We ask how much of the data center's power comes from wind and are told "all of it." Facebook underwrote the cost of constructing a new wind farm in Iowa that, while technically part of the larger MidAmerican Energy power grid, essentially cancels out the data center's energy use.


As we head out to see more of the ventilation system, we pass a man either installing or making repairs to the 30,000-plus miles of fiber-optic cable in racks running along the server room ceiling. There is a weird comfort to seeing an actual human at work in this hyper-polished space whose architecture prioritizes the needs of machines over humans. The Altoona data center has about 100 full-time employees, of whom, we are told, about 80 percent were already local to Iowa, as well as a number of Iowa-based contractors.

The ventilation system is basically an entire floor, with each step of the air-filtering and cooling process separated into massive rooms. The first one features a wall-to-wall grid of fuzzy beige air filters that are one layer of Vaseline away from being a Matthew Barney installation. These corridors are where cool air from outside gets pulled in and where the hot air trapped between servers downstairs gets pushed out. It is massive and unbelievably simple and simultaneously really hard to explain. Later, Sam comments that the building felt basically like a really, really big science-museum exhibit on how HVAC systems work.


It also felt like a strangely perfect physical manifestation of Facebook as a technology company. These stripped-down, practical choices are the aesthetics of Facebook, or at least Facebook as it was 10 years ago. Simplicity and functionality were always its strengths, and in those early years that’s what made it distinct in the era of resplendently rococo MySpace pages.

After the ventilation systems and server rooms, we get a quick glimpse into the office side of the data center. There is a sunny cafeteria painted a cheerful sunflower yellow that gets good light in the afternoon. There is a break room with large monitors on which staff can play video games. They are as generically bespoke as every well-funded tech-company cafeteria and break room I’ve ever seen—meticulously considered, clearly expensive, but the kind of expensive that wants you to know it’s casual, effortless.

In the central open-plan office area, there's a mural painted on one of the walls. (You can see it at the top of this story.) This is a trademark Facebook office interior-decoration choice—seemingly every Facebook office in the world has to have some kind of giant mural. (I remember very little about the mural in the office from 2005 across from the Aquarius Theater, other than it featured a woman with waist and chest measurements that defied basic principles of physics.)

In the Iowa data center, the mural, created by local artist Tony Carter, features half a dozen people apparently connecting with each other in a blue-to-purple gradient field, lined at the bottom with fluffy pale-orange clouds. They are all staring at phones or laptops, their stylized elongated bodies semi-horizontal. It is unclear upon first glance whether the people in the mural are suspended in space or falling through it. The intention appears to be more about the beauty of liminality. It may not matter whether these figures are suspended in the liminal equipoise of The Cloud or hurtling through the eternal void. Or, it may not matter to the figures. They are busy. They are sharing.


Facebook is a sharing company, and as a technology company they are meticulous and incredibly capable. But Facebook as a brand has always struggled, and continues to struggle, with coming across as genuine, as authentic, as human, despite quite earnestly wanting to. Patchett and Towns are both completely genuine and clearly sincerely care about the work they’re doing. The same could probably be said for Mark Zuckerberg himself, although he has the misfortune of being so personally ingrained into Facebook’s origin story that it’s hard to differentiate him from the brand itself (and as Kate Losse documented in her memoir The Boy Kings, there actually may be no differentiation).

But Patchett and Towns aren’t Facebook and, despite the fact we’ve never seen the two of them in the same room, Mark Zuckerberg isn’t Facebook. Facebook is a corporation, and corporations are not so much people as a form of highly advanced artificial intelligence that people operate within. Maybe this is why some of the moments where conversation switched from the technical operations to Facebook-speak felt so awkward, but unintentionally so, like when Facebook’s algorithm decides to fill your Year in Review with pictures of an ex-boyfriend. It’s a brand that becomes harder and harder to empathize with the more it insists on trying to be empathetic, maybe because it’s not clear if there’s a distinction between an empathy engine and a branding engine or maybe because I am generationally disinclined to trust anything that’s too big to fail.

As we wrap up the tour, I finally ask a question I've been holding back all day, partly from embarrassment and partly because I expect they’ll say no. “So we have this quadcopter,” I start.

In quite possibly the weirdest near-future moment of my life, we are immediately given an answer, and it is so precise it suggests that it's an answer they've had to give before. We are told that it's totally OK to fly the drone over and around the data center we just toured, but if we go anywhere near the parts currently under construction, we will get in trouble. The construction company, they apologetically note, has a pretty strict drone policy. Later, I try to imagine the business meeting where Facebook and their contractor carefully compared drone policies. I hate the future.

From this distance the scale of the data-center campus, with the just-completed Phase 2 building and power substation in the background, suddenly becomes a lot more legible, and with it Facebook's larger position in the scheme of the network. The Internet is a massive technology built on top of and piggybacking on past massive technologies, both legacy and living. Looking at the Altoona data center from a distance, I consider the extent to which Facebook is less and less a website on the Internet and more and more a telecommunications paradigm unto itself, piggybacking off of the Internet. It is as much a part of the Internet as the Internet was part of the telephone system in the days of dialup.

[Video: Inside the Facebook Data Center]
And this is maybe why Facebook's forced empathy feels so uncomfortable—as a company, it continues to make decisions that suggest an assumption that Facebook isn't so much part of the Internet as it is the natural next evolution of it. Some of its efforts intended to be in the service of a free and open Internet have at times seemed more in the service of a free and open Facebook. Instant Articles is great for improving article load times, but it also means that users never have to leave the confines of Facebook.

Internet.org promises to connect the world, but its initial rollout in India was heavily criticized as undermining principles of net neutrality by favoring specific companies and platforms—still connecting the world, maybe, but mostly connecting the world to Facebook. To their credit, Facebook responded to this criticism by opening the Internet.org platform (awkwardly rebranded as Free Basics) to third-party developers in May of this year, although in editorials written since then Zuckerberg has defended the program as never really antithetical to net neutrality.

While Facebook taken at its best intentions doesn’t make a distinction between access to Facebook and access to the Internet, many users apparently already do. And this matters, because the kind of agency, creativity, and innovation a user has on the open Internet is very, very different from the kind that she has on Facebook.

This isn’t to say that the Internet is going to be wholly replaced by Facebook any more than the Internet “replaced” phones. We still use phones. But the Internet and the many sea changes it engendered (like smartphones) ultimately transformed a lot of the underlying fabric and infrastructure of phones to the point where it’s actually pretty hard to use a phone without also technically using the Internet, and damn near impossible to find a phone that’s actually attached to a landline.

Facebook has all the technical infrastructure and insight to become that underlying fabric of the network as we know it, harder and harder to work around. For me, for now, not being on Facebook is more inconvenience than life-altering hindrance—at its most annoying, it means I’ve sealed my spinster fate because I can’t make a Tinder account (luckily, I already have a heart of ice and refuse love at every turn). But my war-zone reporter friend pretty much can’t do her job without it, and it’s taken as a given that the News Feed algorithm dictates the kind of traffic that her stories, and my stories, receive. And if a patent acquired by Facebook in August 2015 is any indication, in the future not having a Facebook account could shape my credit score.

This isn't to say that Facebook is doing something terrible to Internet infrastructure—OCP has been incredibly good for Internet infrastructure. Wind farms are good for Internet infrastructure. Facebook is doing really amazing, innovative work from and through its data centers, where the people don't mind reporters who are 20 minutes late and will go two hours over time to show them the ins and outs.

Maybe my anxieties about a world in which Internet infrastructure becomes just Facebook infrastructure are the alarmist gripes of a curmudgeon who really likes things like Tilde Club. I don't really know what kind of future Facebook might shape for the network, and it would be facile to assume that future is either good or bad when all technology is inherently ambivalent, despite the best of intentions. And I'm sure that the people working at Facebook have the best of intentions. The road to Share Way, I think while gazing out onto the adjacent highway, is probably paved with them.
