The social media effect
Microsoft researcher and Data & Society president Danah Boyd talks to the FT's Hannah Kuchler about the effect of everyday technology, such as Facebook, on society and culture.
Presented by Hannah Kuchler
You can enable subtitles (captions) in the video player
Hello, and welcome to Tech Tonic, a podcast that looks at the way technology is changing our lives. I'm Richard Waters, West Coast editor of the Financial Times in San Francisco. Last week, we heard from Francesca Bria, a digital innovation expert. She talked about her work in helping European citizens reclaim digital sovereignty. This week, we hear from a researcher focused on the effects of data-dependent technologies, like Facebook and Twitter, on society and culture.
If you are interested in destabilising a population's confidence in institutions and intermediaries, you gaslight them. You get them so they can't tell what is reality.
That's the voice of Danah Boyd. She's a principal researcher at Microsoft, visiting professor at New York University, and founder and president of the Data & Society Research Institute. She spoke to the FT's Hannah Kuchler in New York.
So thank you so much for joining me, Danah. First of all, I thought this was fascinating. Data & Society held a workshop called "Who controls the public sphere in the era of algorithms?" in February 2016, and one of the questions in the blurb appeared pretty prescient. You said, "Can Facebook determine the outcome of the US election?" So what made you worry about what we're worrying about now all the way back then?
The funny thing about studying people's use of technology is you get to watch how people try to manipulate it for all sorts of purposes. And for me, a lot of these threads go way back. They go back to the earliest days in which I was studying social media. And I spent the better part of the 2000s looking at how people were trying to alter algorithms for economic gain, political ideals, or just plain entertainment. So fast forward to 2016. We were very interested not just in what the companies could possibly do, where we felt there were a lot more boundaries set for them, but in what people could do to the companies' architectures.
In some ways, I don't think this should be surprising, although I fully admit that it has become surprising. Search engine optimisation goes back to the beginning of search engines. Political campaigns built social media efforts early on, and some of those were extraordinarily manipulative. I think what's different now is that people are waking up and recognising that decentralised networks, who are not part of the current political establishment, can coordinate through network technologies to play a role that, historically, only really powerful actors were able to play.
Yeah, and I think as people wake up to this, they're not really sure what we can do about it. Do you have any ideas?
Looking at what's going on, I think it's important to go back to the roots and ask, why? Why is it happening, who's participating under what conditions, and what are those root factors? The reason I say this is that most of the discussions of what to do now are Band-Aid fixes, right. Moments where we're like, oh, somebody has made money over here running misinformation campaigns. Let's stop the money-making efforts. And while that's totally reasonable, I think the roots are really critical. Part of it is what is happening culturally in an American context-- I'll put it in an American context first, but there are different dynamics around the world. In the United States, we've seen strategic efforts over a very long time to fragment the American polis.
And that has occurred at a very specific and localised level: the US military, which used to bring people together across geography, no longer plays nearly the role that it once did, so that moment of getting to know people who were different has collapsed. And ever since the late 70s, when we went to 24/7 news media and started to see monetisation, and in particular financialisation, of our media ecosystem, we've increased the level of opinion-making in our media, and it has become polarised in its opinion-making.
That's only been further magnified in the US by the lack of local news, which of course is connected to financialisation and hedge fund takeovers. So you have all of these conditions where people are very ripe for a fragmented worldview. Now layer onto that what happens in the US in 2008, where due to an economic collapse, a lot of people throughout this country feel as though their identities are completely destabilised.
They don't know what their future is. They're scared about economic conditions. And regardless of whether or not we actually saw a massive increase in inequality, we saw a massive increase in perceived inequality. And the result is that people felt more and more tribal. So we already had the conditions to be polarised, and then you create a tribal dynamic on top of it.
And then you go forward, and you have the next generation. You have young people who are looking out to their future and seeing limited access to economic possibilities, huge debt to go to universities, et cetera. And the result is that they're frustrated, insecure, and they're not sure what's going on. And so that moment where they might be just playing with an effort to game Oprah Winfrey on her TV show suddenly turns into a moment where they can shape the political election.
And for a lot of folks, there's something very fun about being part of these campaigns. And I think that's one of the things that people really miss in all this. It's not that there's any one type of actor participating here. We're dealing with many different categories of people coordinating in a decentralised way, willing to share tactics even if they don't share a goal, and that is so strange for anybody used to running a political campaign or an activist campaign, where you understand what the goal is and you try to align people on tactics. That's not what's happening here.
So the way that we're talking about it at the moment-- we're talking about Russian actors as if they are a separate, discrete entity that is sitting in some kind of FSB building somewhere, interfering with elections. You think that we should see it as much more fluid, connected to other issues, and maybe not addressed solely by changes in technology?
I believe that there are a lot of different actors, including foreign state and foreign non-state actors. Even when we're talking about who the Russians are, we quickly go to the Internet Research Agency, which is a well-known troll farm that is notorious for leaving tracks everywhere. They're not even trying to hide. That's why they are the easiest to identify. And they definitely have relationships with the Russian state, but are they the Russian state? That's not so clear.
So what happens is, when you have alignment there, that may not actually even be the state; my guess is we're also seeing state actors, and other non-Russian state actors. But we're also seeing white supremacists and white nationalists, and we're seeing Hindu nationalists, and conspiracy theorists, and teenagers. And we're seeing all sorts of different groups throwing spaghetti at the wall, trying to see what sticks.
What's notable about their efforts is that they're trying to find vulnerabilities in our media landscape. And that means both our social media landscape and our news media landscape. So they're trying to find the moment where journalists will actually propel their views, scale them, and amplify them, by either repeating a message that they're trying to get out there, or by negating it, knowing full well that the boomerang effect is in play: if you don't trust a news media outlet, and that outlet tells you something, you might think there's actually something true in what it's trying to negate.
They're also looking for vulnerabilities in social media, and their relationship to social media is different depending on the platform. With Twitter, it's actually mostly about trying to get to journalists, not the mainstream public. And so some of the most common tactics I watch involve sock puppet accounts: fake accounts that tweet at journalists, not telling them to follow a link or do something, but asking questions.
Asking a journalist a question where you know the journalist will think, oh, is there something there, start googling it, and then find the YouTube video and the blog post that were set up for them to find. They are targeting an amplification agenda. Now with Facebook, we're dealing with a different environment. That's all about trying to get information to spread, but what is the end goal, right?
So one of the things I think will come out as we watch the Internet Research Agency, the Russian-affiliated corporation, and their campaigns on Facebook is that there's nothing consistent about the kinds of ads they put out there. They have ads that are pro and against pretty much every position you can imagine. What they have in common is a general sentiment that you should not trust the media, and you should not trust the government of your own country. So the narrative they're putting out there is not one to propel a specific belief, but to ask people to question everything.
That's also a form of critical thinking. That's also a narrative that we encourage in our education system, and so we're seeing this coupling of frames happen at scale. And there's no doubt that internet technologies mirror and magnify the good, bad, and ugly of every part of the system, and that social media platforms are part of it, as are news media platforms. The question is, how do we deal with it as a systemic issue, rather than thinking that we can solve it by any one actor doing better or going away?
So do you think that the media needs to get better at this-- rather than just throwing its hands up and saying, oh my god, there's so much misinformation out there, actually trying to track misinformation and making sure that it doesn't amplify it?
I think it's imperative for the news media in particular to step back and recognise that they may be part of the ecosystem, that they may be being gamed, and to figure out what stories they should not be amplifying. Consider this: it used to be common amongst journalists not to cover deaths by suicide. Why? Why did journalists not cover it? They didn't cover it because they were concerned that they would actually create replication. We'd seen a ton of studies showing that the more journalists talked about suicide, the more people then went and tried it, so the worst thing you could do when covering the death of a celebrity, which was itself a newsworthy story, was to highlight that they had died by suicide.
We've actually undone that strategic silence. We have seen that manipulation occur in a variety of domains. Take Islamist content: what do we cover in terms of beheadings, or any ISIL- or ISIS-related content? It may be newsworthy, but how do we not use our platforms to radicalise? And that's really imperative for news media actors. For social media actors, I think a lot of it is trying to identify this new form of gaming. And in some ways I think of it as a security vulnerability.
So we have historically understood security as an access issue, something like the Equifax breach, where the issue is getting access to people's data or information. But there's a new form that has emerged because of these systems, which is data-injection adversarial attacks. And so I think that, just like companies had to start thinking about search engine optimisation and who was trying to mess with their systems at scale, we also have to look at it in this light.
And I think we have to get innovative, but I think the most important thing to realise is that we will never fix it. One of the reasons for taking on a security mindset is to realise that it's an iterative and evolving process. As long as we have people that are interested in bringing down institutions and information intermediaries, we're going to have these vulnerabilities. And so we need to get smart about how we evolve our understandings with it, rather than thinking that we can find the silver bullet.
So if we're trying to learn from the security industry, do you think we should be doing things like having red teams, which is what the security industry does-- teams whose whole purpose is to penetration-test, to see where they can get into the network-- except here it might not be about finding the loophole, but about seeing how you can use the network for impact?
I'm a big believer in red team and blue team efforts. The tech industry used to believe in the culture of tests, which is that you would do quality assurance assessments of a system before it went out there. One of the things that social media changed, back in the early days of Friendster, was to let the public actually do the testing for us: to roll out a new feature and find the bugs by seeing what broke when it was live on the platform. I think that is now costing us tremendously.
My colleague Matt Goerzen talks about the need for white trolls: what does it mean to actually ask people to mess with the system, in that red team sense, to really see how to build it out? Certainly you need that kind of adversarial thinking whenever you're trying to build a system, because one of the challenges and the beauties of building these technologies is that people design for the ideal end goal. They design for what could be, not for all the ways in which it could be abused and misused. And I think that's one of the reasons why a lot of tech actors are completely flabbergasted by what's going on right now, because they so desperately want the technology to be doing the good work that they designed it for. And the technologies are being used for that good, but along with that good, we're also seeing some deeply destructive practices.
How much do you think is structural? I mean, there's also been a lot of talk in the last year or so about attention, and how things will spread through Facebook faster if people are liking them and looking at them. And so, therefore, that really plays into the hands of anyone trying to disrupt the system. Facebook isn't going to change its News Feed to make it like a curated newspaper, where the ordering has nothing to do with how much we pay attention to stuff. It seems to have a very good motive to keep people on the site for as much time as possible.
I would argue that all of the media right now has an incentive to try to get people to click and consume as much as possible-- news media as well as social media. That is the cost of financialisation. That is the cost of needing to return on investment every quarter and make more money and more profit each time. I think it's been an existential crisis for news media, because they've historically tried to hold on to two imperatives: a faith-based belief in producing content for the public to create an informed citizenry, and an economic imperative that has affected and shaped their industry writ large.
Social media undoubtedly grew up in an ecosystem where profit was already part of the story. So even though most tech founders started with an ideal of community, an idea of connecting people around the world, they had to come to that question of financial motive really fast. The challenge with an advertising-based model, whether we're talking news media or social media, is that it is all about eyeballs. It is all about attention, and that attention can be very, very costly, because what is most motivating for people is often what is least psychologically beneficial for them individually and least socially beneficial. So if it bleeds, it leads, and the same thing operates within social media.
We see this content that makes you emotionally responsive. And indeed, one of the biggest challenges with polarised content is that people hate-post. They throw content onto Facebook to express their anger and frustration about what some other person might believe, and, as a result, they help propel content that is extraordinarily costly. And I think that's where the amplification mechanisms on social media are not just the platforms themselves, but the way in which the public as a whole becomes part and parcel of that amplification channel. That is one of the reasons why we have reached an emotional exhaustion with consuming a lot of news content at this moment, because it's just overwhelming.
And that matters for these broader geopolitical dynamics, because, again, you can manipulate that. If you are interested in destabilising a population's confidence in institutions and information intermediaries, you gaslight them. You get them so that they can't tell what is reality. And one of the best ways to do that is to overwhelm them. And in an ecosystem that is driven by attention, and driven by eyeballs on site, of course that's very easy to do. So I think it's exactly accurate that a lot of this is about attention, and attention is connected to financialisation.
And do you sense any kind of backlash brewing against the amount of attention that we give to social media?
So there are certainly individuals who are sick of it and opting out, or trying to go to environments that are filled with kittens and babies. There's also, simultaneously, a lot of very public critique of social media in particular: calls for regulation, calls for companies to be pressured into making changes to their products. And that is all coming from this very heady moment where people can't tell which way is up. But the broader challenge, I'd say, is that when people are in a state of true emotional exhaustion and they're calling for change, they're not going to find solutions to the structural problems.
They're going to be looking for the Band-Aids for the moment, and I think we're going to build a lot of Band-Aids right now. I don't think that there's any way not to, because it's politically necessary for government, for industry actors, for a whole variety of civil society groups. But I'm just hoping that we can use this point of exhaustion and frustration, this moment of people wanting to be out of the situation, to propel a broader, more systemic change around how our media infrastructure operates and how we band together as a society. That's a longer-term project, and one that I don't know that we've even begun.
Yeah, people briefly quitting Facebook after the election isn't going to change that. The other thing that there seems to be a lot of attention on: we've known for a long time that we're moving in the direction of huge amounts of data being sucked up about us, but I think recent advances in machine learning have made people more concerned that the amount of data being collected about them might be used for things that they don't really understand at all. Do you think this is an exciting moment or a worrying moment in terms of data?
I think it's both. Let's talk about the positive first. We are on the precipice of a whole set of advances in medicine that are only made possible by sharing a tremendous amount of data, so that we can understand things like the growth of cancer. The possibilities and potential of precision medicine are phenomenal. We want to remedy the potential harms, especially in a country that connects medicine so deeply to an insurance profit motive, but we also want to make certain that we can develop innovative remedies for people who are suffering from terrible diseases. And so I'm extraordinarily excited about the amount of data we can ingest, and the advances in machine learning and neural network technologies, that will get us there. That's phenomenal.
On the flip side, we're using the same types of technologies, and the same obsession with data, to amplify practices that have long and historic problems. Think about what we're doing in the criminal justice ecosystem. Criminal justice has been a disturbing and fraught sector from the beginnings of this country, not to mention the degree to which it is extraordinarily racist and destructive.
We have more people incarcerated in the United States now than we had enslaved during the height of slavery. It's a terrible situation. So when we introduce technology and amplification mechanisms into that space, and we try to train systems based on historic patterns, all we're doing is amplifying historic prejudice and historic racism. And so that's an environment where I don't think we know what we're doing, and I don't think we're being responsible in how we're doing it. And there is indeed a profit motive driving it, and an obsession with efficiency and lowering costs, without actually questioning whether or not this is healthy.
Then, of course, we go back to the news information and social media context. On one hand, people want content that connects them to their world. And if you ask anybody what they love about social media, it's about the ability to connect. It's about having information at your fingertips, and the ability to connect with people you love across distance. And that is made possible by the sharing of data, because you're sharing it with people to connect. But that same data can be used to engage in different kinds of surveillance practices for economic incentives, advertising, and of course for other political incentives. And so that's one of the things that's so tricky, because it's not really about individual data in any of those three cases.
It's about how your data relates to other data within the system. It's about the ability to understand prediction, and when we're engaged in any act of prediction, we are basing our information on the connective tissue of the rest of society. And the question is whether or not we're doing it for end goals that we believe are productive or ones that we believe are destructive. And one of the biggest challenges is that not everybody agrees on what those end goals should look like.
And do you think we'll see-- We talk a lot about personalisation. Do you think that we'll see filter bubbles coming out of the realm of Facebook and into all sorts of other realms? I mean, for example, there have been some interesting stories written about personalised pricing on Amazon. Do you think we end up in a situation where we actually have constructed different realities for everyone?
Humans self-segregate because it's comfortable. We find people who are like ourselves. And lots of sociological and psychological literature has shown, time and time again, that given any form of choice, we build a world in which our surroundings are as homogeneous as possible. The challenge is that Facebook has been designed to amplify your desires. And if your desire is to live in a filter bubble, it will do that work for you. It will amplify it.
And that's one of the reasons why these systems are so costly, these personalisation systems that are indeed going from Facebook to every other environment: because you want to be in your own comfortable environment. You want to be in your gated community online, offline, and everywhere else. The question is, what does it take to rebuild a society in which we are able to work across difference?
So I referred to the military, and the military is a really important part of American social infrastructure, because the US military, the standing army, has always been based on the idea that people from around this country, who are fundamentally different, had to be socialised into a unit across difference, to stand by each other enough to die for each other and to die for this country.
That is an important commitment. There is nothing else at that scale. We're not going to ask everybody to do that. And, indeed, as we lose structures like the military, we lose those processes of being so utterly uncomfortable. How do we commit voluntarily to being uncomfortable again in order to meaningfully connect? That's going to be the counter to personalisation. But as long as there is an economic imperative to giving people what they want, we will see personalisation produce filter bubble after filter bubble.
I'm going to slightly change tack now. One of the other issues that the tech industry has been dealing with a lot in the last year or so, or at least it has become public in the last year or so, has been conversations about people speaking out about sexual harassment from VCs and other tech leaders and senior men in tech. And you wrote quite movingly about your own experience with inappropriate propositions from tech entrepreneur Marc Canter, and I just wanted to look broadly at where we are in the industry on this. I mean, do you think that after all this publicity, harassment is going to decline?
No, I actually think it's going to get worse, and that's what I struggle with most. So sexual harassment, and more insidious forms of sexual abuse, inevitably come from a differential in power, an abuse of power, and that abuse of power is further magnified when people don't believe themselves to have power and yet they do.
One of the biggest challenges amongst the tech industry is that many of the people who are part of it see themselves as geeks, as outsiders, as social misfits. They don't see themselves as being powerful. That's part of why we're having such a confused conversation about the role of social media. Facebook doesn't see itself as being one of the most powerful companies in the world. And so what happens is that, whenever you have power and you don't know it, the likelihood of abusing that power only increases.
So let's look at how tech has historically dealt with that power. Geek culture often allowed a form of shared identity for people who felt socially ostracised in other parts of society. Nowhere was that more pronounced than in geek forms of masculinity. The idea was that you weren't necessarily one of those alpha men making it big in the financial sector, but you had pride in what it meant to be a guy in tech, and that normalised a certain form of masculinity that we saw grow at scale.
And then tech all of a sudden became the most powerful sector in American society, but the cultural ethos didn't necessarily change with it. So we may make movies about geeks and celebrate them in ways that look very different from Real Genius back in the day, but for most tech folks this is their safe haven, full of people like them. And now we're asking hard questions about who gets to be a part of tech, because tech has so much power. We're asking why there are not more women in tech. We're asking why there are not more people of colour in tech.
And those are very important questions to be asking. But for a lot of people whose identity is wrapped up in what it means to be in tech, they basically feel as though they're under attack for not being acceptable enough-- through the same attack vectors that actually made the culture emerge the way that it did.
And, unfortunately, what that has meant is that, not just at the highest levels of tech but the whole way through it, we are seeing increased misogyny and increased sexism. And all it takes is a little moment: a mid-level engineering manager has been told, and incentivised, to be more diverse in their hiring practices; they have an interview candidate they don't believe is as qualified as a male candidate; and they're pressured by their boss to hire that woman. That is a radicalisation moment. That is a moment in which anger and resentment and misogyny start to breed.
And so one of the challenges for me is how we get through this, because our current incentive structures aren't working. Our shame structures may push out individuals, but they don't address the broader systemic issues. I think it's going to take a lot of hard work to do that, but the most important work is going to be getting buy-in, writ large, from people to realise that we have to collectively rethink what the culture of this industry is. And that's going to be hard for a lot of people who are deeply wed to the current culture.
There's so much hard work to do. Thank you so much for taking the time to speak to me.
Thank you for having me.
We'll be back next week with another episode of Tech Tonic. In the meantime, if you'd like to comment on today's show or suggest any topics you'd like us to cover in future episodes, please email us at firstname.lastname@example.org. Don't forget to subscribe to our show on your favourite podcast app, and if you write a review, that will help other people find us, too. Thanks for listening.