On an early summer evening in May 2013, several hundred young(ish) men and women wearing sneakers, jeans and T-shirts assembled around a yellow crane in Hacker Square at Menlo Park, Facebook’s new campus 30 miles south of San Francisco. The sun was dipping below the horizon, leaving pink streams in the luminescent blue sky. A man on stilts bounded across the square. Another waved an old-fashioned boogie box throbbing with dance music. A third man, bearded, wearing a blue mask and indigo Zorro-style cape, jumped on to the base of the crane. The Facebook engineers had stumbled across this hunk of machinery at another office a few years earlier, entirely by chance, and had become attached to it. They used it for office meetings and pranks. So when the company moved to the new campus at Menlo Park, the engineers insisted on moving it too, as a ceremonial prop and symbol of their origins.
“Friends, Romans, hackers! Lend me your ears!” the man in the cape shouted. The crowd temporarily fell silent. Pedram Keyani, Facebook’s director of engineering and the nominal leader of the Hackathon group, stepped forward. “You know the form!” he shouted. “If you are still here at 5am there will be breakfast! And Chinese later!” Hackathons always started on the yellow crane, and then moved to another room where the engineers always ate exactly the same Chinese takeout several hours later, in the middle of the night. Mark Zuckerberg had first started buying food from this particular Chinese restaurant when he created Facebook a decade earlier. The restaurant was now a long way from Hacker Square. But nobody wanted to mess with that ritual by ordering from somewhere else.
“Is there anyone here who has not been to a hackathon before?” Keyani shouted. A few hands waved. “Well, that’s awesome! The only rule is — go enjoy yourself, go mingle, go have fun! And just remember, our hackathons are not like anyone else’s hackathons. Enjoy! Let’s write some code!”
The gang of hoodie-wearing engineers dispersed in small groups and walked to a large conference room studded with inspirational slogans: “Move fast and break things!” “Done is better than perfect!” “What would you do if you were not afraid?”
“Posters are one of the biggest culture carriers at Facebook,” observed chief operating officer Sheryl Sandberg. “We will be in a meeting and someone will quote something to somebody else, like ‘Make great decisions’. That’s the type of company we are. A lot of the initial cultural ideas came from Mark but we are very aligned as a group. You cannot impose this from the top. It has to grow from below too.”
Five years earlier, in the summer of 2008, Facebook quietly passed a little milestone. Its leaders realised that the company was expanding so fast that it employed more than 150 computer engineers. Outside Facebook, nobody knew or cared. In Silicon Valley, rapid growth is considered a badge of honour. But when the top Facebook managers realised they had crossed the 150 threshold they became uneasy. The reason lay with the concept known as “Dunbar’s number” — the theory developed by British evolutionary psychologist-cum-anthropologist Robin Dunbar. In the 1990s, Dunbar conducted research on primates and concluded that the size of a functioning social group was closely related to the size of a human, monkey or ape brain. If a brain was small, the size of a monkey’s or ape’s, say, the creature could only cope with a limited number of meaningful social relations (a few dozen). But if a brain was bigger, as for a human, a wider circle of relationships could be formed. Humans did this, Dunbar argued, via “social grooming”, conventions that enabled people to be closely bonded. Just as primates created ties by physically grooming each other’s fur by picking out nits, humans bonded with laughter, music, gossip, dance and all the other ritualistic day-to-day interactions that develop when people work or live together.
The optimal size for a social group among humans was about 150, Dunbar suggested, since the human brain had the capacity to maintain that many close ties via social grooming, but not more. When groups became larger than that, they could not be held together just by face-to-face bonds and social grooming, but only with coercion or bureaucracy. Thus bands of hunter-gatherers, Roman army units, Neolithic villages or Hutterite settlements all tended to be smaller than 150. When they grew larger than that, they typically split. In the modern world, groups that were smaller than 150 in size tended to be more effective than bigger units, he argued, and humans seemed instinctively to know this.
Most companies’ departments are below this threshold. And when Dunbar examined how British people exchanged Christmas cards in the early 1990s (which he considered to be a good definition of a friendship circle in British culture back then, before the advent of platforms such as Facebook), he discovered that the average number of people reached by the cards somebody dispatched to different households was 153. “This limit is a direct function of relative neocortex size, and this in turn limits group size,” Dunbar wrote. “The limit imposed by neocortical processing capacity is simply on the number of individuals with whom a stable interpersonal relationship can be maintained.”
Not everyone in the academic community agreed with Dunbar. But Zuckerberg and the other founders of Facebook were fascinated by Dunbar’s number, and they eventually asked him to provide some consulting advice. Initially, their interest was commercially focused. As they designed their site, the Facebook engineers wanted to know how many friends a Facebook user was likely to have so they could build their systems accordingly. However, as the Facebook engineers talked to Dunbar, they realised that his findings not only had implications for how Facebook should build the website for external users. The findings could also affect how employees interacted with each other inside the company. Back in the early days when Zuckerberg had first created his company, the employees had worked as a single group. Some of them lived together, they worked in close quarters, knew each other well and had joint rituals, such as ordering food from the local Chinese takeout. However, as the company swelled in size, it was harder to maintain this sense of group identity.
The history of Silicon Valley suggested that for many companies the problems posed by a rapid expansion in size were deadly. The example that particularly worried some Facebook engineers, however, was Microsoft. Although the Seattle-based group had started off as a dynamic and creative entity, by the turn of the century it was plagued with silos. This fragmentation was not as extreme as it had been at companies such as Sony, for example, but nevertheless it undermined Microsoft’s ability to compete.
So was there any way to avoid that fate? The Facebook engineers were determined to try. “We want to be the anti-Sony, the anti-Microsoft — we look at companies like that and see what we don’t want to become,” one senior manager later recalled. They started tossing ideas around about how to combat the problem. In the summer of 2008, one of Facebook’s early engineers, Andrew Bosworth, a burly, bald-headed, tattooed man, floated a novel idea. Bosworth was known as “Boz” to his colleagues (Facebook engineers loved to give each other nicknames; it was part of the process of social grooming).
In previous months, Boz had been trying to create a training programme for new recruits. His goal was to ensure that when computer engineers joined Facebook they knew the same set of coding conventions as all the other employees and were assigned to a team that used their skills most effectively. So he created an introductory course that showed them the company and taught them crucial coding knowledge. But Boz then realised that the course could do more than impart technical knowledge. It could also be a tool of social engineering. After all, if you put the new recruits through a common training experience in small groups, you could create a mechanism for some social grooming and bonding. And while these groups of trainees would not stay together as a unit, since they were destined to be scattered across the company, the joint experience could create lasting ties between them, and the type of social intimacy that fostered nicknames.
That summer, Facebook declared that all its new employees — no matter how junior or senior — would undergo a six-week induction process when they joined. Boz was named “Bootcamp Drill Sergeant”.
A crucial part of that training process, Boz added, would be a rotation programme to show new recruits the entire company. “Instead of assigning engineers to teams arbitrarily based on a small amount of interaction during interviews, Bootcampers choose the team they will join at the end of their six weeks.”
The new recruits were not just being asked to learn new technologies, however. “Bootcampers tend to form bonds with their classmates who joined near the same time and those bonds persist even after each has joined different teams,” Boz explained. To put it another way, the Facebook managers were trying to use Bootcamp to achieve two things.
First, they were organising the company into discrete project teams, dedicated groups to perform tasks. A company such as Facebook needs silos, in the sense of specialist departments and teams, simply to get its work done. Project groups were needed for focus and accountability. But the second aim of Bootcamp was to overlay those project teams with another set of informal social ties not defined by the formal department boundaries. This, it was hoped, would prevent the project teams from hardening into rigid, inward-looking groups and ensure that employees felt a sense of affiliation with the entire company, not just their tiny group. “Boot camp [can foster] cross-team communication and prevent the silos that so commonly spring up in growing engineering organisations,” Boz said. Facebook was both creating the preconditions for silos and instilling systems to break down those silos.
In December 2011, when Facebook moved from the office in Palo Alto into the brand new campus in nearby Menlo Park, its staff numbers had risen above 2,000, many times over the Dunbar threshold. However, as the ranks swelled, the social experiments grew apace. The Facebook managers were determined to use whatever social tools they could imagine both to create dedicated, specialist project groups and to prevent those little teams from ossifying into competitive silos. Architecture was one weapon in this fight. The new campus was on a site previously owned by Sun Microsystems, another tech giant that had flourished in Silicon Valley in earlier decades. Initially a freewheeling start-up, it later turned into a stodgy behemoth, plagued by silos, its employees tucked into dozens of different buildings, subdivided into small offices and cubicles. “It felt like a cattle pen!” laughed Mike “Schrep” Schroepfer, Facebook’s chief technical officer, who had worked there. “You didn’t have much contact with anyone at all.”
However, when Facebook bought the site, Zuckerberg decided to christen it “1 Hacker Way”. The address sign was painted over in blue and the image of a giant white thumb pointing upwards was added on top: the “Like” symbol on the Facebook platform. Builders ripped out most of the internal dividing walls in the old Sun structures and added whiteboards, bare pipework and graffiti walls. In so far as there were meeting rooms, these were surrounded by walls made of glass, so that anyone could peer into anyone else’s room. Even Zuckerberg worked in the open-plan space, visible to all. So did Sandberg.
Zuckerberg had a “private” office too. But this was lined with glass and placed in the centre of the campus, next to a walkway along which all the employees constantly strolled. A sign saying “Do not feed the animals!” was fastened to the window. “We call it Mark’s goldfish bowl!” quipped Schrep. “Everyone can see him.” Then Schrep went further. He asked the architects to connect the third floors of the separate buildings with walkways. Stretching high in the air, these were painted in the same bright orange-red as San Francisco’s Golden Gate Bridge; doors at the end of the walkways opened automatically when anyone passed. Schrep’s goal was to ensure that the engineers never needed to pause when they rambled around the building: “There is all this research out there which shows that if you can keep people moving and colliding with each other, you get much more interaction.”
The space between the buildings was turned into attractive “rambling” zones, to encourage employees to hang out together in the balmy California weather. A couple of times a year Zuckerberg held “all-hands” meetings in Hacker Square. He also held town halls (or “Q&A” sessions, as Facebook calls them) every Friday in a vast cafeteria. The symbolism was clear: Facebook managers were determined to present the company to the employees as a single, open mass, where everyone could — and should — collide with everyone else, in a freewheeling, irreverent way. Hacker Square was also used to stage another of the experiments in social engineering, the hackathons. Every six weeks or so, several hundred engineers would congregate in the square before later retiring to a large meeting room with bright orange walls plastered with inspirational posters. There they would spend all night together working in small teams on coding problems, testing out ideas, or “hacking”. Being pushed together in a small space and asked to work intensively overnight was one way to unleash the creative juices.
In the early days of Facebook, Zuckerberg had spent all night brainstorming ideas with the other founding members of the company in the house he shared with the engineers. But as time passed, the Facebook managers insisted that the groups who coalesced in a hackathon night had to cluster together with people from different teams from their normal projects and work on something outside their day job. Sometimes engineers congregated in short-term mini-teams because they had made contact in the days leading up to an event, or because they were all interested in a specific problem. On other occasions engineers decided to work with each other almost by chance. Either way, hackathons were intended to break down the normal departmental boundaries. They were another tool to ensure that while Facebook had dedicated project teams, these could not ossify into inward-looking silos.
By 2013, other companies across Silicon Valley were using variants of the same silo-busting tactics that Facebook had developed. At Google and Apple, the employees staged hackathons and rotated staff. The idea of conducting common induction and training programmes for new recruits was spreading. The concept of using architecture as a tool to promote employee collisions and collaborations was also widespread, both inside and outside the technology world. 3M, the manufacturing group, prided itself on running research laboratories that deliberately mixed up different specialists. Google had imaginatively designed facilities that enabled staff to collide.
Numerous other entities were using social media sites to promote better employee communication. But what made the managers at Facebook unusual was the degree to which they kept turning the lens back on themselves and trying new iterative experiments in social engineering. Having built a successful business by using computing ideas to analyse human friendships, they remained endlessly fascinated by how they all interacted with each other.
“I never used to think about this social stuff. It didn’t seem that important,” confessed Schrep. “But then, when I came to Facebook, I realised just how much it matters. That’s a real change! And now I cannot stop thinking about it.”
Gillian Tett is the FT’s US managing editor. Extracted from ‘The Silo Effect’, published by Little, Brown on August 27, £20 (and in the US by Simon & Schuster on September 1, $28). © Gillian Tett 2015
Illustrations by Shonagh Rae
Before I became a journalist in 1993, I did a PhD in social anthropology (or cultural anthropology as it is known in the US) at Cambridge University, writes Gillian Tett.
As part of my academic study, I conducted fieldwork, first in Tibet and subsequently on the southern rim of what was then the Soviet Union, in Soviet Tajikistan, where I lived in a small village between 1989 and 1991. My research was focused on marriage practices, which I studied as a tool to understand how the Tajik had retained their Islamic identity in a (supposedly atheist) communist state.
When I became a financial journalist, I was often wary of revealing my peculiar past. The type of academic qualifications that usually command respect on Wall Street or in the City of London are MBAs or advanced degrees in economics, finance, astrophysics or other quantitative sciences. Knowing about the wedding customs of the Tajiks does not seem an obvious training to write about the global economy or the banking system. But if there is one thing the financial crisis of 2008 showed, it is that finance and economics are not just about numbers. Culture matters too. The way that people organise institutions, define social networks and classify the world has a crucial impact on how governments, businesses and economies function (or sometimes do not function). Studying these cultural aspects is thus important. And this is where anthropology can help.
What anthropologists have to say is not just relevant for far-flung non-western cultures but can shed light on western society too. The methods I used to analyse Tajik weddings, in other words, can be helpful in making sense of Wall Street bankers or government bureaucrats.
The lens of anthropology is also useful if you want to make sense of silos. After all, silos are cultural phenomena that arise out of the systems we use to classify and organise the world.
Studying the silo effect as an anthropologist-cum-journalist may even offer some answers about how to deal with silos, not just for the bankers but for government bureaucrats, business leaders, politicians, philanthropists, academics and those in the media too. That, at least, is my hope.