Media Distortions is about the power behind producing deviant media categories. It shows the politics behind categories we take for granted, such as spam and noise, and what they mean for our broader understanding of, and engagement with, media. The book synthesizes media theory, sound studies, STS, feminist technoscience, and software studies into a new composition to explore media power. Media Distortions argues that sound is a more useful conceptual framework than visual ones, due to its ability to cross boundaries and strategically move between multiple spaces, which is essential for multi-layered mediated spaces. The book introduces two main concepts – Processed Listening and Rhythmedia – to analyze multiplicities of mediated spaces, people and objects.
Drawing on repositories of legal, technical and archival sources, the book amplifies three stories about the construction and negotiation of the ‘deviant’ in media. The book starts in the early 20th century with Bell Telephone’s production of noise in the training of their telephone operators and their involvement with the Noise Abatement Commission in New York City. The next story jumps several decades ahead to the early 2000s, focusing on web-metric standardization in the European Union, and shows how the digital advertising industry constructed what counts as legitimate communication while rendering spam illegitimate. The final story focuses on the most recent decade and the way Facebook constructs unwanted behaviors to engineer a sociality that produces more value. These stories show how deviant categories re-draw boundaries between human and non-human, public and private spaces, and importantly – social and antisocial.
Elinor Carmi is a researcher, journalist and ex-radio broadcaster who has a passion for technology, digital rights, and feminism. For the past eight years she has been examining internet standards, specifically the development of the digital advertising ecosystem, such as advertising networks, real-time bidding, and web cookies/pixels. Currently Dr. Carmi is a Research Associate at the University of Liverpool, UK, working on several projects: 1) “Me and My Big Data – Developing Citizens’ Data Literacies”, a Nuffield Foundation funded project; 2) “Being Alone Together: Developing Fake News Immunity”, a UKRI funded project; 3) Digital inclusion with the UK’s Department for Digital, Culture, Media and Sport (DCMS). In February 2020, Carmi was invited to give evidence on digital literacy for the House of Lords’ Committee on Democracy and Digital Technologies, at the British Parliament in London, UK. In addition, she has been invited by the World Health Organisation (WHO) as a scientific expert to be part of the closed discussions to establish the foundations of Infodemiology. Before academia, Elinor worked in the electronic dance music industry for various labels, and was a radio broadcaster and a music television editor for almost a decade. In 2013, she published a book about the Israeli psytrance culture titled “TranceMission: The Psytrance Culture in Israel 1989-1999” (Resling Publishing). She also tweets @Elinor_Carmi.
The following is a transcript generated by Otter.ai, with human corrections during and after. For any errors the human missed, please reach out to cms@mit.edu.
Scot Osterweil 00:33
But Elinor is a postdoc at the University of Liverpool in media studies. And just by way of introduction: one of the things that I was interested to observe as I came to MIT Comparative Media Studies, not as a media scholar but as a practitioner, was all the ways in which the field, certainly at MIT, was rooted in the work of communications engineers; many of the roots of media studies were in the work, actually, of engineers. I particularly think about the way the concept of signal and noise, which is so relevant to our work, had its origins in that field. And what’s so intriguing about Elinor’s book, to me at least, is this:
Media scholars have tended to focus on the signal, and in her work she’s taken an interesting look at the noise. The other thing that has struck me about your work, Elinor, is the focus on sound as a metaphor. We frequently use metaphors around seeing, and you’re suggesting that we use more metaphors around sound. So I’m really intrigued to hear what you have to say, and I’m sure everyone else here is too. With that, I will simply hand the floor over to you.
Elinor Carmi 02:12
Great. So thank you very much, Scot. I’m going to do the screen-sharing part before we start. Thank you very much for this introduction. As you said, the timing of this book was perfect for the apocalypse, and I had my share of ugly crying about not being able to actually come to Boston. But thank you so much for giving me this opportunity to present to you, and I’m really looking forward to the discussion. I have about an hour and a bit more, but I will try to shorten the presentation so that you can have more questions. So, hi everybody, I’m going to talk about my new book, Media Distortions. You can download the book, it’s open access, there’s even a playlist that goes with it, and you can see things that didn’t make it into the book; so please go to that website. A bit about me: I like to go to Comic Con, as you can see here; I thought it would be suitable to show you how I always think about different kinds of bots and different kinds of deviant things. I’m a researcher, activist, feminist, and currently I’m working on several projects around data literacies, which sort of continue what I’m going to talk about today; I’ll say a bit more about them towards the end of the talk. But I also want to talk about my background. I used to be a radio broadcaster of psychedelic trance, I used to be a music television editor, and I used to be a journalist writing about electronic dance music culture. And my previous book was about the Israeli psychedelic trance culture. So for me, sound has always been a way to think and examine things through, and I think this is quite important as an introduction to this book, the origin story, if you may. So, why focus on the deviant? This is something I was always very attracted to. Even the Israeli psychedelic trance culture was a culture that was sort of deviant within Israeli culture, and I was always intrigued by how different kinds of things get categorized as deviant. For me, part of the power of examining things that are a bit on the outskirts, a bit ‘deviant’, is to really understand the politics of drawing these kinds of boundaries of what is deviant; and by understanding what is deviant, we can understand what is the norm much better. So in my book, and in my research, I always try to understand who created these categories of what is legitimate and what is illegitimate, why and with what justification, who these categories serve, and how these categories affect the way that we engage with media and technology. I also questioned what a media phenomenon is. When I started my PhD, which this book is based on, a lot of people said, oh, but this is marketing, or this is advertising, or this is computer science. And people thought, oh, we know what spam is, it’s emails about Viagra and Nigerian princes. For me, one of the main points to take from my book is to question these kinds of things that we take for granted, whether it’s different kinds of terms or different kinds of definitions. As media scholars, I think it’s really important for us not to take for granted what computer scientists tell us things are, but also what marketers or advertisers tell us things are. So this was my standpoint.
And also, of course, to examine these boring things, these things that are common sense or taken for granted.
Elinor Carmi 06:17
So in the book I examine three case studies; I’m going to focus on one today, but I encourage you to read the book and you’ll see the others. What is really important for me is that these deviant media categories keep on changing and evolving. So I focus on noise in the early 20th century with Bell Telephone, one of the biggest media companies of the time, and how they structured different kinds of territories and people’s behavior. Then I focus on spam, which is at the end of the 90s and the early 2000s. And then I focus on Facebook, and how they categorize antisocial behavior. What is important for me to say is that one of the problems for us as media scholars is that our research objects keep changing very fast. And I think one of the things to take from this is the bigger questions, right? Some of us are doing research on Facebook, which hopefully won’t exist in a decade, but what are the questions that we actually want to ask through Facebook? This is something that is very important for me to emphasize: media are going to come and go, but the larger questions of how different kinds of categories shape our behavior are what is really important for me to examine. So, in a nutshell, my book is about media power. However, most of the time when we think about media power, and most of the theories that we have around it, especially when we talk about the internet, we just take Michel Foucault’s panopticon, which uses very visual concepts about what we can see and what we cannot. What’s also really important about the panopticon is that architecture has a huge element in this kind of media power theory. We also have a lot of visual terms when we talk about these things: we talk about vision and visibility when we talk about algorithms, or AI, and different kinds of things like that. And of course, let’s not forget Frank Pasquale’s book about the black box. What I realized as I was examining my research (and you see, just now I almost said ‘I saw’) is how ingrained these visual concepts are in our terminology, in how we think, and in how we explain different kinds of things. When I was writing the book, I had to change different kinds of words into sound concepts, and that made me realize how ingrained it is in how we think and engage. So these are the kinds of things that I felt are missing within the visual frameworks: when we’re talking about these kinds of power relations with media companies, we’re talking about multiplicities. We’re talking about multiple actors, multiple spaces, multiple times, multiple purposes of conducting listening and rhythms, which I’m going to talk about shortly, and different kinds of architectures. What vision doesn’t really allow us to do is to go between the boundaries of spaces. Usually when I give this talk I do a kind of experiment where I show that if I shout, the sound is going to pass through the walls, but my vision is constrained within time and space.
So the theoretical approach that I developed is influenced by multiple other approaches: media theory, which I’m going to talk about a bit later, science and technology studies, software studies, feminist technoscience, critical legal studies, and of course sound studies. It’s really important for me to emphasize that I used grounded theory, which means that I didn’t assume that I knew all these things in advance; these things came up as I was examining the material that I was doing research on. So these are the kinds of sonic epistemologies, and I’m going to come back to that a bit later. For the people who are reading my book and maybe noticed, each chapter, each of the case studies, is divided: the first half is dedicated to the structuring of territory, which is rhythmedia, and the second part to processed listening. What I’m trying to say with this is that I am taking a different kind of approach and showing how power relations are constructed with these two concepts, which I’m going to talk about shortly.
Elinor Carmi 11:09
And I’m going to start with the first one, processed listening. When we’re talking about science and technology studies, and about knowledge production, we usually hear about theories that use vision concepts. However, there are sound theories too, which use sound as a way to understand and to produce knowledge. One of the main theories here was produced by Alexandra Supper and Karin Bijsterveld from the Netherlands, and they talk about different kinds of practitioners who produce knowledge: for example, doctors, when they listen to you with a stethoscope, or car mechanics who listen to the car, and how they make different kinds of diagnoses in order to understand what’s happening with different kinds of bodies. They make a classification of different modes of listening, and of how these practitioners make their assumptions and then make different kinds of claims: is the body healthy, or is it malfunctioning, and things like that. What I noticed is that when we’re talking about the online environment, or a mediated environment, these modes of listening are not enough, because we engage with a different kind of environment. Therefore, I developed a new mode of listening, which is called processed listening. It is a mode of listening whereby practitioners, who can come from different kinds of professions and interests, listen to different kinds of sources, with different kinds of tools, at different times, in order to produce different kinds of knowledges. And by knowledges, I mean different kinds of profiles. So for example, when Facebook listens to our behavior through different kinds of cookies and pixels, it creates a different kind of profile of us, and then they can rearrange the platform in different ways to make interventions in our temporal and spatial experience. And this is what I mean by rhythmedia. Now, when I was trying to examine how different kinds of media companies shape the way different kinds of information flow or not, I realized that people use different kinds of concepts, like flow, data streams, data traffic, and channeling. And what I realized is that these concepts don’t really explain the politics behind how different kinds of information, or different kinds of connection, are made possible or impossible to us. So I was very influenced, as I said before, by different kinds of media theorists, especially Raymond Williams and his concept of planned flow. I was also influenced by feminist technoscience and their notion of process, and of course by Henri Lefebvre’s rhythmanalysis. Basically, what all of the combined concepts are saying is that when all of these companies, whether it’s Facebook or, as I’m going to shortly talk about, the online advertising industry, listen to you through different kinds of instruments in order to create the profile, they then create different kinds of architectures that change according to that profile. So the way that we engage with platforms has a different kind of ordering rhythm, which is influenced by different kinds of political decisions, usually shaped by advertising logic and, obviously, money.
For example, if I’m a doctor and I listen to your body in one session, that event has a beginning and an end. When different kinds of platforms, or web and online advertisers, listen to me, there isn’t a beginning or an end. If Facebook listens to me, it’s not like, oh, now I know everything there is to know about Elinor by the 22nd of October 2020, and I don’t need to listen to her behavior anymore. There is an ongoing process of listening to my behavior in order to have a richer profile. And this profile then helps these companies to create different kinds of architectures that are arranged in a specific way: to make me engage more, or to make me click on a specific ad, or maybe to make me comment on different kinds of inflammatory posts, and things like that.
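To make that loop concrete, here is a minimal, hypothetical sketch of processed listening feeding rhythmedia: behavioral events accumulate into a profile that is never "finished", and the profile then reorders what the person sees next. Every name in it is illustrative, not any platform's actual code.

```python
from collections import Counter

profile = Counter()  # the profile has no end point; it only accumulates

def listen(event: dict) -> None:
    """Record one behavioral event (a click, a view, a comment)."""
    profile[(event["type"], event["topic"])] += 1

def reorder(feed: list[dict]) -> list[dict]:
    """Rearrange the feed so items matching the richest parts of the
    profile come first: an engagement-driven ordering rhythm."""
    return sorted(feed, key=lambda item: -profile[("click", item["topic"])])

listen({"type": "click", "topic": "fitness"})
listen({"type": "click", "topic": "fitness"})
listen({"type": "view", "topic": "news"})

print(reorder([{"topic": "news"}, {"topic": "fitness"}]))
# fitness comes first: the architecture rearranges itself around the profile
```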
Elinor Carmi 15:42
So with rhythmedia, what I’m trying to say is that media companies reorder different kinds of components in a way that orchestrates a desired rhythm. With this kind of ordering, they decide what sociality is, while filtering out problematic rhythms, which they define as either noise, or spam, or antisocial behavior. These practitioners conduct the way that mediated architectures change according to the knowledge that they gain from processed listening to people’s bodies across multiple spaces. Now, if this sounds a bit confusing, don’t worry, I’m going to go into one of my case studies now. So we’re at the end of the 1990s. And I just want to say, it’s really hard when you write a book and you want to talk about everything; you don’t really know what to focus on. I really wanted to focus on the standardization of web metrics, at the end of the 1990s and the beginning of the 2000s, because I think that everything we experience today, around profiling, around fake news, around disinformation and misinformation, and basically the problematic and broken online ecosystem where we are the product, started in those times. A lot of the time people talk only about the last decade, but we’re talking about processes that happened over probably 20 years, and probably even before. I think it’s really important to identify the key moments where these things, which are basically the surveillance of our online behavior, became normalized. So in this case study, I’m basically looking at the standardization of web metrics, which means how to measure different kinds of behaviors in order to trade them in an efficient way. As you can see here, the IAB, which is the Interactive Advertising Bureau, wanted to standardize different kinds of metrics. Because in those times, if you remember, the web wasn’t really a technology that people knew was going to succeed. A lot of the time we take for granted that the internet succeeded, but in those times the subscription model didn’t quite work, and with the dot-com bubble crash, people didn’t really know what to do. Specific companies managed to survive the dot-com bubble crash, including Amazon, and then different kinds of platforms started to emerge which gave a free service, whereas, as we all know, it wasn’t quite free: we were the product. So let’s see what happened in those days. The IAB wanted to understand: how can we standardize different kinds of measuring units? How can we standardize how we measure people’s behavior in order to make this product, which is our behavior, as efficient as possible? Because if we are the product, then the currency needs to be agreed upon by all of the actors involved. As you can see here, these were the most common metrics.
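As a rough illustration of why the definitions themselves were the battleground, here is a small sketch showing how the same invented log yields different "currency" numbers depending on what the actors agree counts as an impression, a click, or a visit. The 30-minute visit window is a common analytics convention, assumed here rather than taken from the IAB documents.

```python
events = [  # (user, seconds, kind): invented sample data
    ("u1", 100, "ad_request"), ("u1", 101, "ad_served"),
    ("u1", 130, "click"), ("u1", 5000, "ad_served"),
    ("u2", 200, "ad_served"),
]

# One definitional choice: count an impression when the ad is served,
# not merely requested; a different choice yields a different currency.
impressions = sum(1 for _, _, kind in events if kind == "ad_served")
clicks = sum(1 for _, _, kind in events if kind == "click")

def total_visits(user: str, window: int = 1800) -> int:
    """A 'total visit' here: consecutive activity with gaps under `window`."""
    times = sorted(t for u, t, _ in events if u == user)
    return 1 + sum(1 for a, b in zip(times, times[1:]) if b - a > window)

print(impressions, clicks, round(clicks / impressions, 2))  # 3 1 0.33
print(total_visits("u1"))  # 2: the hit at 5000s starts a second visit
```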
And what they basically wanted to understand is how they were going to measure it, and through which angles. Will it be through the ad serving? Will it be through people’s computers? And what will actually count as a click, what will count as a total visit, and what will count as an ad impression? Now, for me, one of the most interesting parts of this research was to analyze different kinds of internet standards through the IETF, which is the Internet Engineering Task Force. I also analyzed different kinds of legal documents in order to understand how they actually decide what happens there. When I started to read the cookie standard, which is what you’re looking at here, I started to realize that what’s happening in the back end is that when you visit a website, through different kinds of default settings you are being sent a lot of cookies, which could be dozens of cookies, could be hundreds of cookies, in order to send different kinds of information about your behavior. If it’s from the website whose address you typed, those would be first-party cookies. That means that if, for example, I look at The Guardian, it is a first-party cookie with which The Guardian collects information about me. But if they’re third-party cookies, it could be ad exchanges, or different kinds of data brokers, who are also going to listen to my behavior across different kinds of platforms and throughout time. So what I basically saw here is that there is a standard, and we’re actually not really aware of what’s happening. What was really interesting with the standard is that the people who suggested it said at the beginning, well, maybe part of the standard will be to show people what’s happening in the back end, because people don’t really understand that all of these things are happening. And the advertising industry at that time said, no, no, it’s going to confuse people; let’s just show them the front end, as computer scientists call it, so that it wouldn’t confuse them to see what’s happening in the back end. But when we think about cookies, one of the things that struck me is that all of the people who talk about cookies, whether it’s computer scientists or even media scholars, say it’s just a text file. But this ‘just a text file’ is actually a form of communication. What cookies actually do is print different kinds of text files, but every time that I read an article or click on things, these things are communicated to different kinds of entities. And those can be, as I said before, The Guardian, if they’re first-party cookies, or other data brokers, or different kinds of agencies which we are not aware of. So this is what I was talking about: knowledge not going into the common sense, the common way of defining things. This is one of those moments where I realized that we’ve been sold the idea that cookies are just a text file (this is also part of the legislation going on behind it), but actually it’s a form of communication, because they’re communicating different kinds of topics, which could be my gender, the kind of device that I’m using, the kind of broadband that I’m using, and different kinds of things throughout time.
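To show what "just a text file" communicates in practice, here is a minimal sketch using Python's standard cookie parser. The header values are invented for illustration; the point is that a first-party cookie stays with the site you typed, while a third-party cookie set on an ad server's domain travels with you to every site that embeds that server.

```python
from http.cookies import SimpleCookie

# What the first-party site (the one in your address bar) might set:
first_party = SimpleCookie()
first_party.load("session_id=abc123; Path=/; Max-Age=31536000")

# What an embedded third-party ad server might set on its own domain.
# The same identifier then accompanies requests to that domain from
# every page that embeds it, which is what enables cross-site listening:
third_party = SimpleCookie()
third_party.load("uid=u-98765; Domain=.ads.example; Path=/; Max-Age=63072000")

for name, morsel in {**first_party, **third_party}.items():
    print(name, "=", morsel.value, dict(morsel))
```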
Elinor Carmi 22:22
One of the other things the advertising industry wanted to do, because there was all of this robotic behavior, was to understand who is human and who is not human. Because if we’re the product, it’s important to have exact measurements. So what they did was develop different kinds of measurements, and what they call filtration: to filter who is human and who is not human, in order to make accurate measurements of what is happening so that they can trade us. As you can see here, they developed different guides. For example, the basic method was a text file listing known robots; another was identification of specific suspicious or non-human activity, which was a different kind of list that you were supposed to send to the IAB; and another was to analyze the rhythms of users’ activity, which I found really interesting, because what they were basically defining there is which kinds of behaviors can be considered robotic, and hence non-human, and which kinds of behaviors can be defined as human. As you can see here, these are the kinds of definitions they had of robotic behavior, what they analyzed in order to establish that: users performing multiple sequential activities, users with the highest levels of activity, users with consistent interaction attributes, and other suspicious activities. And what they would do with that is basically not only try to understand and measure people, but also define what human and non-human behavior is. For me, one of the first times that I realized this was with my first book. When I published it, I invited all of the people that I had interviewed to come to a launch event. I sent them all the same message on Facebook, because I was, frankly, a bit lazy, and I just changed the name. And after a while, I hadn’t heard from a lot of them, and I was really offended; I was like, I interviewed you, at least tell me yes or no or something. And then one of them, who works in the music industry, told me, oh, did you send the same format of message? And I was like, yes. And he was like, oh, well, Facebook will then think that you are a bot or a spammer of some kind, and it will send your message into the ‘other’ folder, which could be, you know, the spam folder or the junk folder. Because they analyze your behavior, and this is a sequential activity, the same activity with maybe just the name changed. That was something that triggered my thinking about how these companies make these decisions about which kind of behavior is legitimate and which kind of behavior is illegitimate, and then shape our behavior accordingly. Because what that label manager told me is that when he sends the same kind of messages, he of course changes them in a way that tries to avoid being flagged as a spammer. So this was the part about analyzing the internet standards.
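Here is a hedged sketch of the kind of rhythm-based filtration those guides describe: classifying activity as robotic from its regularity rather than its content. The thresholds and field names are invented for illustration, not taken from the IAB's actual methods.

```python
from statistics import pstdev

def looks_robotic(timestamps: list[float], user_agents: list[str],
                  max_hits_per_min: float = 60.0) -> bool:
    """Echoes the three heuristics above: (1) highly sequential, metronomic
    timing, (2) abnormally high activity levels, (3) perfectly consistent
    interaction attributes across a large number of hits."""
    if len(timestamps) < 3:
        return False
    gaps = [b - a for a, b in zip(timestamps, timestamps[1:])]
    too_regular = pstdev(gaps) < 0.05
    rate = len(timestamps) / max(timestamps[-1] - timestamps[0], 1e-9)
    too_fast = rate > max_hits_per_min / 60.0
    too_consistent = len(set(user_agents)) == 1 and len(timestamps) > 100
    return too_regular or too_fast or too_consistent

# A script firing exactly once per second is flagged as robotic;
# a human's irregular rhythm is not.
print(looks_robotic([0.0, 1.0, 2.0, 3.0, 4.0], ["agent/1.0"] * 5))    # True
print(looks_robotic([0.0, 3.1, 9.4, 11.0, 30.2], ["agent/1.0"] * 5))  # False
```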
Here I’m showing how, in the European Union, when they were trying to make legislation around electronic communication, which was starting to become more popular at the time, they actually defined this, and how the advertising industry lobbied the European Union and managed to bypass it. What you’re looking at now is an article from the ePrivacy Directive from 2002. The way they define it is the use of automated calling and communication systems without human intervention, for the purposes of direct marketing, which may only be allowed in respect of subscribers having given their prior consent. If you’re European, you probably know about prior consent; I’m going to talk about it shortly. But we know that there are a lot of different kinds of activities which we would consider as part of that. For example, remember the horrible spam attack where U2 was pushed into our iTunes and onto our computers, without us being asked, without our prior consent. When Apple talked about that, they said, well, actually, under that law, once you have some kind of connection with a company, that makes it implied consent. Implied or implicit consent means that if you have any kind of interaction with this company, then you are in a kind of weird relationship where they are allowed to send you these kinds of marketing things.
Elinor Carmi 27:24
And during the end of the 90s and the beginning of the 2000s, there were a lot of debates about whether cookies should be legitimized, and what was happening there. This is an article that I really like from 2001, on the left-hand side, where the chairman of the IAB in the UK says cookies “have been branded as spyware tools, or some kind of subversive software. But it’s what we use every day.” I think it’s quite telling that this is 20 years ago, when even then people understood that cookies, and pixels, and all these things are basically spyware, doing this kind of surveillance over our everyday activities. But the normalization that happened over the years is something we need to remember came with the lobbying of the advertising industry, which managed to make it all seem nice and friendly, even with the name, when you think about cookies. So actually, according to the law, cookies and spam are the exact same thing, because both of them are sent through automated systems, they are meant for direct marketing, and they are sent without our prior consent. But it was really important for the advertising industry to make a difference between legitimate advertising practices and illegitimate advertising practices, in order to standardize what was starting to become the online data-broker ecosystem that we know today. Part of what we know today is real-time bidding, which happens in the back end of our screens, and we don’t know about it because, as I said before, the advertising industry lobbied for us not to understand what’s happening. Facebook and a lot of these companies, including Google, basically took this standard of real-time bidding and developed it into their own systems. When I was talking before about why we should use sound concepts, it is exactly because of these multiplicities of actors who listen to us throughout time. We have so many companies listening to our behavior in the back end: it could be Facebook, it could be Google, it could be Amazon, it could be your government, it could be different kinds of data brokers that we’re not aware of. These companies are then trading this data without our understanding or consent. As I said before, part of the politics around this is to create these kinds of architectures where we don’t really understand what’s happening. One of the things the people who developed the standard said was, oh, you know, we’re not going to make it visible; people can still change the cookie settings somewhere in the settings of the browsers. And this is part of the politics that I’m talking about: arranging the architecture in a way that makes it difficult for us to actually engage with it in a meaningful way.
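To give a feel for the market running behind the screen, here is a simplified, hypothetical sketch of a real-time-bidding auction: when a page loads, an exchange offers the visitor's profile to bidders, who answer within milliseconds, and the highest bidder wins the slot. The names, prices, and the second-price rule are assumptions for illustration, not any exchange's actual specification.

```python
import random
import time

def bid(bidder: str, profile: dict) -> float:
    """Each demand-side platform prices this particular profile."""
    base = 0.10  # invented base price per impression
    multiplier = 2.0 if "fitness" in profile["interests"] else 1.0
    return base * multiplier * random.uniform(0.5, 1.5)

def run_auction(profile: dict, bidders: list[str]) -> tuple[str, float]:
    bids = sorted(((b, bid(b, profile)) for b in bidders),
                  key=lambda pair: -pair[1])
    winner = bids[0][0]
    price = bids[1][1]  # second-price rule: winner pays the runner-up's bid
    return winner, price

profile = {"uid": "u-98765", "interests": ["fitness", "news"]}
start = time.perf_counter()
winner, price = run_auction(profile, ["dsp_a", "dsp_b", "dsp_c"])
elapsed_ms = (time.perf_counter() - start) * 1000
print(f"{winner} wins at {price:.3f}, auctioned in {elapsed_ms:.3f} ms")
```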
And I thought it was quite funny, because today, when I wanted to post, to sort of advertise this talk on my Facebook, I used a different browser, and then Facebook asked me if I want to accept all cookies. If you look at the left-hand side, we have a lot of ‘choice’ here: we can either accept all, which Facebook obviously wants us to choose because it signals it in blue, or, if we want to manage data settings, we go to the picture on the right side. As you can see, the only way that I can engage with Facebook is by accepting cookies. There is literally no way for me to decline, to negotiate, to say maybe I want these cookies and maybe I don’t want those cookies. For me, this is part of what I’m trying to say with my book: how these companies structure a specific architecture where we basically have one way to experience them. Facebook, of course, is not the only one; we also have Google and Amazon and all these companies. But I think it’s really important for us to understand that we are presented with one way of experiencing them, and then we don’t see what’s actually happening in the back end. In what you’re seeing here, some of these tools do not exist anymore: Lightbeam, which was connected to Firefox, in the middle; on the left you see Privacy Badger; and on the right is uBlock, the ad blocker. What I want to show here, I don’t know if you managed to see, is that there are hundreds of cookies plugged into my device. For me, this is one of the most amazing exercises I do with my students, advertising students who don’t even realize what is happening in the back end. Once I show them that, most of them are quite shocked, and a lot of them ask, why didn’t we know this before, and how can we do something different? For me, this is again part of the bigger politics, where our computer screen or our phone screen creates this divide in power relations. In the front end we get a very pristine architecture and interface where we can engage with these platforms; if you’re living in the EU, you might get all of the ‘choice’ of pressing ‘I accept’, ‘I agree’, or ‘OK’. But in the back end, we have a whole other online market that is trading us in milliseconds. So this boundary of our screen is not only a division of what is human and what is non-human; what’s happening in the back end is something we can’t even comprehend. Once we load a certain web page, or Facebook, the trading and the bidding on our profiles happens within milliseconds. Our screen creates the boundary of the asymmetric power that these big technology companies and the data brokers have over us. And what happens is that these ads keep on chasing us. I really recommend you watch South Park; they had a storyline in season 19 criticizing online ads and how they kind of chase you.
And from the research that I’m doing these days, where we try to understand people’s data literacies, one of the things that we keep getting from people is that not only do people not really understand what’s happening with their data; a lot of the time, when they’re expressing these kinds of concerns and fears about advertisements, they say that they keep on seeing the same ads, and after a certain amount of time they say, OK, after I’ve seen it so many times, I’m going to press on that ad just so it will leave me alone. So I think what we’re seeing here is this kind of ordering of the architecture, and of what we can engage with at a specific time and space, and how that basically shapes our behavior. This can also take the shape of how different kinds of actors push us different kinds of disinformation and misinformation. I think only now are we starting to get critical debates about how Facebook keeps on pushing different kinds of problematic material; but pushing problematic material is part of Facebook’s business model, because the more emotional the material on the platform, the more you will engage. So it doesn’t matter to them if it’s a picture of your family in an awkward position, or if it’s disinformation or conspiracy theories, because it’s all part of fueling more engagement and more comments between people.
Elinor Carmi 35:39
So I’m going to push more towards what’s happening these days, and show you that today activists are pushing, and I think there is a kind of reckoning with these power relations and what these companies are doing. We’re seeing the FTC wanting to start investigating real-time bidding these days. Actually, I think last week, or a week and a half ago, activists showed that the whole consent framework of the ad-tracking industry is actually flawed. So we are starting to see a lot of changes. But the main problem is that the way we experience the internet today is that these companies want us to feel that this is the only way we can experience these platforms: it can only be through my own single profile (not many people, for example, can share one profile), or it can only be with cookies, or only with this kind of ongoing surveillance. So for me, one of the main points of peeling off these layers of politics, of how these companies have lobbied and tried to create these architectures, is to show that all of these things were ongoing strategies by these companies. And that also means that we can build different kinds of platforms, and different kinds of things, in a different way. But in order to do that, and this relates to what I’m working on now: people are actually not aware of what is happening in the back end, and as I said, this was a planned strategy. People do not understand the political economy of the internet. Most people do not know that Google’s and Facebook’s main income comes from advertising. Most people don’t understand how algorithms, profiling, or web cookies work. I can even go further, from the research I’m doing this month: when we ask people even the most basic things, like what is data, what might these companies know about you, most people have no idea. So I think that in order to create change, in order to understand what is happening, and even more so in order to demand a different kind of future, we actually need to go back to basics and explain to people what is happening. Unfortunately, a lot of people have seen The Social Dilemma and keep texting me saying, oh, you should see it; even my mom sent me, ‘only now I realize what’s really happening’, which is good and bad, because obviously that documentary could have been much better, and I have a whole Twitter rant about that, if you want to see it. But what it actually shows is that in the way we communicate, whether as scholars or journalists or activists, we kind of assume that people know about these things, and a lot of the time we talk from a very top-down kind of approach. I feel that we actually need to go back to basics and explain to people what is happening, in order for them to demand different kinds of platforms and to understand the consequences. So I actually wanted to show you part of that. This is from the research that I’m doing right now, where we asked people, on the left-hand side, how much they accept that companies would use their personal information to personalize their experience. And you can see, across different kinds of groups, that more or less they don’t really agree, but some of them do.
But when we actually asked them whether they accept that companies would use their personal information to track their behavior over time, you can see here, almost unanimously, that most people don’t agree with that. And the fact that people don’t make the connection that personalization and tracking over time are the same kind of thing is clear evidence that people actually have no idea what’s happening; they don’t make these connections. When you don’t know what’s happening, you can’t really demand a different kind of future. And while I believe that, as academics, a lot of us are doing really important work in advancing the debate, we actually need to go back to basics and make people understand. So I’m going to switch gears and, I think, go straight to the end, and say that deviant media categories are about the struggles to determine what is human, what is normal, and what is social; they are about what makes us individuals in society; they are about the default settings of our lives. And if we want to change the default settings of our lives, or at least to have several options of default settings, then we need to peel off these strategies that were constructed over time to make these divisions of what is deviant and what is not. So, I think I even managed to finish before time, because I was talking really fast. So there you go.
Elinor Carmi 41:09
I’m going to stop sharing.
Scot Osterweil 41:14
Great, thank you very much. That was fascinating. Let’s see whether we have any questions right off. I don’t see a hand up, so I guess I’ll start with a question, which is: I know that in some of your work you’ve looked at practices outside of norms. Are you seeing communities attempting to circumvent these systems in their media practices, in interesting ways?
Elinor Carmi 41:54
I think that, um, a while ago, I saw that several teenagers were using the same Instagram account in order to bypass different kinds of profiling. And I know that a lot of activists are also doing stuff around that. There are a lot of different kinds of people who are trying to object in other ways. So we have Max Schrems, for example, an activist who’s trying to change things from the legal aspect, which I think is quite important, crucial, because you need to have several kinds of battles going on at the same time. I think people are also trying to create different kinds of alternatives to Facebook and Twitter; I don’t know how effective that is. But I also, obviously, don’t think that there is only one way. If we want to change the way that things are, I think we need to go in multiple directions. So on the one hand, having more education and data literacy programs is one way; another way is to try and change the legal frameworks that we have today, which are completely not equipped to combat these kinds of companies. I don’t know if you saw, a couple of days ago, that the US government filed a huge antitrust case against Google, which is great, but what is it actually going to do? Is it actually tackling the main issue, which is the business model? No, it’s more like, oh, maybe we’ll dissect Google into even smaller pieces. And also, as you can see, all of these measures are coming after a really long time, in which these companies have already managed to cement themselves as a huge, inseparable part of our lives. So for me, one of the main things is that we actually need more public spaces, which could potentially come from government funding, or from maybe having an internet tax, or different kinds of things like that, where we will have
other options than the big companies.
It’s not a perfect solution, like any solution, but it could be a start, rather than counting on Google or Facebook to do everything that we need, basically, today.
Elinor Carmi 44:29
I hope that answered the question.
Scot Osterweil 44:32
Emily, did you have a question?
Emily Grandjean 44:36
Yeah, so I have to admit, I’ve only watched, I think, about half of The Social Dilemma. But I’m just curious to know, if you had the platform of moviemaking to spread your message, how would you have changed the way the message reaches people about these things? So if you could recreate The Social Dilemma on your own terms, what would you have wanted to depict?
Elinor Carmi 45:00
I’m happy that you’re asking that, Emily, because I’m just planning to actually pitch to Netflix. So if Netflix is watching this, I’m open to all of the funding that you want to give me. But no, seriously, I thought about that. For me, there are several main issues with that documentary. First of all, asking the tech bros to answer the problems that the tech bros created is, for me, a huge issue. There are a lot of smart people, both in academia and on the ground, you know, activists, who have been dealing with these things for years, who weren’t asked. Another thing is that, although I like the US, and I think a lot of amazing things are happening in the US, these companies are global. So if we keep on focusing only on the US, it sort of recreates the problem, right? Because a lot of the standards of these companies, when we think about content moderation, and a lot of the laws that govern it, come from a US-focused and US-centered mindset. And I think it’s quite important to have different kinds of perspectives on how these companies shape things. I would definitely also focus on Chinese companies, and Weibo, and different things like that, and also show more historical perspective. And I would definitely not do it as a one-film documentary, because if you’re trying to show so many things in one program, you’re losing a lot; so I am thinking of doing a series. And I would consult with people who have been dealing with it in different parts of the world, to get a richer understanding of how these technologies affect different kinds of communities and different kinds of regions, and not necessarily assume that the people who actually created the problem have more insight than the people who have been dealing with it in different ways and aspects. That would be my approach. And again, Netflix, if you’re watching, I’m available on email.
Emily Grandjean 47:17
Thanks. If you need anyone to petition for the series, let me know. No, I’m serious. I was actually DMing a lot of people this week saying, I can’t stand all of these documentaries and having my mom telling me, oh, did you see The Social Dilemma? It’s so important, I learned so much. And I was like, yes, Mom.
Elinor Carmi 47:44
So, you know, I think this is part of something that we as academics need to do: we need to communicate more. Because I think there’s a lot of really important and great work happening in academia, but a lot of the time it’s really difficult for us to communicate it. So I think we need better routes to communicate this stuff. A lot of the things that are happening with Facebook, we’ve been talking about them for a decade, if not more, and everybody keeps on being surprised every time. So that basically means we need a better communication channel between journalism and academia, and a way to communicate with everyday people, in order for them to relate to what we’re talking about.
Scot Osterweil 48:26
Other questions? Vivek?
Vivek Bald 48:29
Thank you, Elinor, for a really great talk, one that taps into all of my existing paranoia. And, you know, I was involved in web 1.0, really, in the 1990s, as an HTML coder. So I know that all these conversations, all these concerns, have been there since the beginning, right? But, as you’re saying about The Social Dilemma, these voices have not been listened to, and the people who are in that film knew all of this too, because it was all there. But my question is actually about some of the work that you’ve been doing, the interview-based research. You showed us one slide about whether people are okay with their information being tracked in these different ways. And part of what I’m curious about is whether you’re also approaching those questions in terms of trade-offs, and what people are willing to trade off. I’m thinking about this in terms of the really draconian surveillance that followed 9/11. For those who were in the communities being surveilled, it was a horrific moment, and it continues to be so, because it shifted the way the state is allowed to surveil different communities. But for a lot of people who were not part of the surveilled communities, it was a question of trade-off: well, if I’m safer, I don’t mind being tracked and surveilled, because I know that I’m not the person being targeted; so actually, this is for my own benefit. And in the case of the logic that has been sold about cookies and other kinds of tracking devices, it’s that, you know, this is for your own good, so that we can serve you the kinds of things you want to see, the kinds of products you want to buy, etc. It’s just a matter of trading off a certain amount of privacy for this benefit. So I’m just wondering how that plays into, or has played into, the kinds of conversations or interviews that you’ve done with people.
Elinor Carmi 51:22
I think it’s really interesting what you’re saying here. I had a focus group, actually, today, and a lot of people said that they were concerned about different kinds of privacy-related issues, but then, with the pandemic happening, they didn’t really feel that they had a choice. And I think that the non-choice factor here is quite huge, because a lot of the services that we have today are digital; there are actually very few services today that you can use non-digitally. So there’s not really a choice here; you’re not really being asked whether you can or cannot do things. And what a lot of these companies did was basically change the nature of this kind of contract with people, right? You have these different kinds of terms of use, different kinds of contracts, basically, where they say what they are going to do with your data. I’m probably one of the few people who actually reads these terms of use, when I can, not always, because I try to have a life. And that again creates these kinds of asymmetric power relations, because what are you actually trading? Do you actually know what you’re trading? A lot of the time we’re being told that we should feel safe, or that this is okay. But it’s really difficult to trust these kinds of companies, because as time goes on you realize that, you know, I thought that I was giving my data just to get to The Guardian, but actually I give it to The Guardian and to a lot of other data brokers, who maybe are then going to sell me a problematic life insurance, or harm different kinds of job opportunities, or things that we can’t really predict right now. Because one of the problems with this data, as I said with processed listening, and one of the main things that I wanted to emphasize with that concept, is that it’s an ongoing process. I actually have no idea who has my data, how long they are going to keep it, how they are going to use it and for how long, and different kinds of things like that. So I agree with you that in a fair world, or in the previous world, we knew that we were making this kind of transaction: okay, I’m just going to give you a bit of my data, and then you’re going to do this with it. But one of the things that I’m trying to argue in the book is that there’s actually no negotiation here, right? You have one way, as you saw with Facebook: I can only press ‘accept all’ or ‘I accept’. Why actually can’t we negotiate? Why can’t I negotiate individually with each of the platforms about what we want to do? Why can’t they have a bar on the side where I can see which kind of data is going out? And believe me, I’m not going to get confused, because I’m sure that all of you, at this moment, have at least 20 tabs open with a bazillion other things happening. So I think that we need to think about these things differently. And I agree with you that some people think, you know, it’s great, I’m going to get personalized ads, maybe for the gym membership that I always wanted to buy, and you get really good ads. But I can’t even know what the trade-off is here, because it’s so opaque to me. I have no idea what’s happening in the back end.
I have no idea who is involved. And that is what’s troubling to me, and what’s troubling to a lot of activists and scholars: this kind of screen basically separates me from all of the other companies that can listen to my behavior. So while I do agree with you that sometimes the trade-off is okay, the possibility that it could be misused or abused is very high. We need to think both about how we’re going to regulate these companies, and about how we create a fairer kind of trade-off here. Actually, maybe I do want to negotiate with you, just like I wouldn’t have an open-ended contract with my landlord, who might decide tomorrow that they want to crash in my living room. So these are the kinds of things that I’m thinking about. I hope that answered it. Yeah, thanks.
Vivek Bald 55:47
And I agree with you. My question was more about, you know, how do we change that kind of broad acceptance, as part of the activist work? How do we change, or create, a broad consciousness that the trade-off is no trade-off?
Elinor Carmi 56:14
So again, it’s about data literacies, and it’s about these very basic things. Unfortunately, the way that media and computers are being taught, we’re not really taught about basic things like the online economy, what cookies are, and all of these kinds of things. Part of the project that I’m working on now, which you saw before, which was the survey that we conducted, and now we’re doing focus groups, is called ‘Me and My Big Data’; it’s Nuffield Foundation funded. Part of what we’re trying to do is to understand what different kinds of groups actually understand, and then how we can design education materials that are tailored to different kinds of people. What we saw with the survey, at least, is that people’s data literacies are very much influenced by education and socio-economic condition. So the more educated you are, and the richer you are, the more you know about what’s happening with your data, and the more you know, for example, about privacy settings and what to do there. So again, we’re seeing how these platforms, this architecture, actually harm the people who are the most marginalized. So the answer is data literacies, in a nutshell.
Scot Osterweil 57:33
Um, we have a couple of questions and comments in the chat. I can read them, but I’d rather people ask them themselves. The chat is from the panelists, and the Q&A is just from the attendees. I want to get to all of them, but I wanted to first give people the option, rather than read them out loud. Abby, did you want to make your statement yourself before I read it? Or what’s your preference?
Abby Sun 58:02
I’m sorry, I have just done an entire day full of Zooms, so I apologize for my video being off. I’m happy to read my statement, and also to turn it into a question, actually, because I sort of wrote it, Elinor, before you said that you would love to make a documentary for Netflix yourself. So I’m always curious. I mean,
Elinor Carmi 58:23
I think big.
Abby Sun 58:25
Yes, it’s all the rage these days. My own background is in documentary film, and I think a lot about the politics of exhibition and all of that. So what I wrote was, and I’ll rephrase it: I’m pretty concerned, honestly, as someone who comes from a film distribution and exhibition background, about the streaming platforms, such as Netflix, that The Social Dilemma is on. For me, the way these platforms are designed is similar to the ways that social media platforms keep viewers engaged. So the fact is that a film like The Social Dilemma, which is criticizing the decisions these social media platforms make to keep viewers uncritically engaged, is itself carried by the same strategies that Netflix uses. I have not seen The Social Dilemma, because I have major problems with the filmmaker’s other work; he uses a lot of, I think, advertising and marketing strategies in his work, etc. Anyway, I don’t need to get into that. So my question is about anything that’s critical of the results of these types of technological apparatuses, of how they, as you describe it, filter and sort people’s behaviors. I guess I’m curious for you to talk about whether anything critical of these things can be truly critical if it’s also being used as a piece of content by these systems.
Elinor Carmi 1:00:05
Thanks for your question. I think
Elinor Carmi 1:00:08
it’s a really important one. And I think it is really difficult to create content that would resonate, I can tell you from my experience, um, I remember one time, I was teaching my students cookies and everything. And one day, I saw all of my students come and cover the camera, you know, the laptop camera. Yes, literally everything that I’m teaching them comes through. And now they’ve said, and then I asked them, so why did you cover that the camera and all of them said, Oh, we saw that episode on Black Mirror, where, you know, he’s, you know, being photographed. And I was like, Okay, okay, maybe it’s like, halfway. But I think things like Black Mirror things, like stories, basically, stories that we can relate to, I think are and I’m sure as the documentary is, you know, the value and the power of stories, I think that could be our way to, to communicate these kind of things, clear, and hopefully to to make critique in an engaging and meaningful way for people’s everyday life, because I think, sometimes critique, um, when we talk about it, and when we tried to communicate it, it’s sometimes you know, everyday people maybe wouldn’t be interested, or they don’t really understand how it relates to them. So I think if we can translate a lot of our ideas to do these kind of stories that people can relate to, that can help sort of a communicating the critique in in a meaningful way. I hope that that answered your question.
Scot Osterweil 1:01:44
Thanks, Abby. Tarleton, would you like to, yeah, come in and ask your question? Thank you.
Tarleton Gillespie 1:01:50
Yeah, I’d be happy to. Hi. Hey, Elinor. I’m so glad that you made it to Boston, even if it was only a little No. Right. So but
Tarleton Gillespie 1:02:01
yeah, of course, of course. So this is fascinating. Here, I was hoping that you would close the circle on the talk by coming back to your use of sound metaphors. Because I find the argument about having to stabilize what counted as a cookie, stabilize what counted as a human user versus a non-human, that's all very convincing to me, and so is the turn to digital literacy, to, you know, what do people not know that they don't know? So, in that argument, where did your turn to sound help you notice something, or make a case, that you couldn't have otherwise?
Elinor Carmi 1:02:38
Um, so thanks for the question. I think that because the first case study is with Bell Telephone, it's with actual noise and sound. And when I was starting the research looking at spam, I was trying to think, okay, spam is this kind of disturbance, so what's actually happening there? And I was trying to see who actually created the first sort of communication model, and that goes back to Bell. And I think that sort of helped me to think about the way that the more we can listen to different kinds of things, and also the more we can listen to multiple spaces at the same time, that creates this kind of asymmetric power. So when I'm talking about cookies, or when I'm talking about Facebook: Facebook has the most power by being able to listen to my behavior both on the platform and outside the platform. At the same time, I can't listen to my own body, because which body do I consider to be mine in my computer? It could be my laptop, it could be my phone. Because of this, as I said before, there is this kind of screen which doesn't really allow me to understand who is listening to me. So to me, listening enables us to cross boundaries of time and space. And that means that I can listen to your behavior when you're on Facebook, when you're on Google, and it's an ongoing process. For me, that can't really happen when we're thinking about it through vision, because vision, for me, is very much constrained within time and space. I can only be in a specific time and space, and that creates, for me, a singular layer, if you like. And what I realized with Facebook and all of these kinds of companies that are listening to us is these kinds of multiplicities of layers. As you obviously know, I work also on content moderation; we had a great article that came out today in Internet Policy Review, if you want to see it, from our AoIR panel in 2019. There I examine content moderators and how they listen. At the time that I was examining content moderation, it was kind of new, it was 2014-2015, but as I was starting to engage with it, I was actually seeing that there are so many different kinds of entities, whether human, like content moderators, or non-human, like cookies, all listening to our behavior. And therefore, there are these kinds of multiple spaces, which I think vision doesn't really allow us to understand, these kinds of multiplicities. And for me, the multiplicities are important because they show us how many organizations are involved in creating these kinds of profiles. And again, this doesn't only happen in a digital environment; it also happens, as I show with Bell Telephone, in an analog environment. So when I was comparing content moderators to telephone operators, who also listened on the line and were, you know, part of the communication channel, this is part of what I was trying to engage with. I hope that answered it.
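[Editor's note: below is a minimal sketch, in Python, of the cross-site "listening" described above, showing how a third-party tracking pixel lets one company observe the same visitor across many unrelated websites. It is an illustration only, not Facebook's actual system; the cookie name, port, and identifiers are invented.]

```python
# A minimal sketch (not Facebook's actual system) of a third-party
# "tracking pixel": a 1x1 image served from the tracker's domain and
# embedded on many unrelated sites. Endpoint and cookie name are invented.
from http.server import BaseHTTPRequestHandler, HTTPServer

# 1x1 transparent GIF, the classic pixel payload
PIXEL = (b"GIF89a\x01\x00\x01\x00\x80\x00\x00\x00\x00\x00\xff\xff\xff"
         b"!\xf9\x04\x01\x00\x00\x00\x00,\x00\x00\x00\x00\x01\x00\x01"
         b"\x00\x00\x02\x02D\x01\x00;")

class PixelHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # The browser attaches the tracker's own cookie to every pixel
        # request, whichever site embedded the image...
        visitor = self.headers.get("Cookie", "uid=new-visitor")
        # ...and the Referer header says which page the visitor was on,
        # so one log links the same person across every embedding site.
        page = self.headers.get("Referer", "unknown page")
        print(f"{visitor} was just on {page}")

        self.send_response(200)
        self.send_header("Content-Type", "image/gif")
        # Set (or refresh) the identifier on the tracker's domain.
        # Real cross-site cookies would also need HTTPS and SameSite=None.
        self.send_header("Set-Cookie", "uid=abc123")
        self.end_headers()
        self.wfile.write(PIXEL)

if __name__ == "__main__":
    # Any page embedding <img src="http://tracker.example:8080/px.gif">
    # would report its visitors here.
    HTTPServer(("", 8080), PixelHandler).serve_forever()
```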
Tarleton Gillespie 1:06:02
Yeah, yeah. I like that. Sometimes it's about looking at actual listening practices, and sometimes it's the kind of metaphor that you use to train yourself to notice things. I think both of those are really interesting. Thank you. Thank you.
Elinor Carmi 1:06:17
Yeah, I think, for me, again, it's kind of an exercise. I don't mean that we need to abandon everything that is visual and optic; I just think that there are more and more scholars who are pushing towards thinking about media through different kinds of senses. There is David Parisi's amazing book about the archaeology of touch. And I think that, as scholars, and what I really like about this department specifically is that I know there are a lot of artists, a lot of people who are thinking critically, I think that it just opens more spaces for us to think about what's actually happening. So it doesn't mean that we need to completely abandon vision or vision-related concepts; it's just another way to examine different kinds of media phenomena.
Scot Osterweil 1:07:11
Tomás, would you like to ask yours?
Tomás Guarna 1:07:15
I’m just gonna read what I said. Because I know it’s so I’m falling on the idea of distortions? Have you considered this scenario of algorithmic listening to bots? So the algorithm listening to bots, who will algorithm algorithmic behavior. So I wonder what happens to that feedback loop and how the global glory algorithm becomes is sorted by the input of algorithms. And I wonder what the consequences are spillovers for for human use for human users.
Elinor Carmi 1:07:43
So, what is the disruption when algorithms... Can you repeat that?
Tomás Guarna 1:07:49
I’m wondering you the so you know, we have algorithmic listening, right? Well, we also have already called content production. We talked about bots, we talked about how this algorithms produce content. So I’m interested in maybe if you consider that feedback loop, right, how the algorithm listens to algorithm co production, what happens with that scenario?
Elinor Carmi 1:08:11
Um, I think that what’s really important for me, in my research to emphasize that it’s never only machines involved or bots involved, there’s always humans in every part of the process. So you know, even with platforms only, when we think about it, only the recent years, Facebook has actually confessed that there are content moderators at the beginning, they said, Oh, we don’t know what you’re talking about. And then when, you know, people started pushing, more and more people started to talk about that, you realize that there are people at all of the all of the points of automation, it’s never a full automation, there’s always humans there. So I think that when you’re talking about that, this kind of algorithmic content and also our Gareth, Nick ordering, they’re always people at different points of the communication channel that are going to decide what is deviant. So sort of what is relevant for our business model, and what is not relevant for our business model, and hence, will be filtered out. So to me, that is, it’s really important. And it’s really important for me not to talk about algorithms in an abstract way, because then we’re sort of taking away the responsibility of the humans that are always involved in that process, and the politics of the different kind of humans, whether they’re programmers or the CEOs, or if they are the more inferior humans in this process, which are the content moderators, or you know, even in Google, they have, you know, the rankers who are deciding how things are going to get ranked on the Google search results. So to me, what I think what you’re asking, involves, again, with this kind of different kinds of decisions made by humans, and it’s never going to be a perfect kind of all entanglement but I think that the way they’re going to respond to each other is very much related to these kind of decisions. I hope that answer. Thank you.
Scot Osterweil 1:10:11
We’ve got two questions from the q&a, and I’ll read them out loud just so that they get into the recording. The first is from Roku. And the idea of surveillance and data capitalism that is being put forward by the tech giants nowadays, drives me to always think of Gramsci and his theory of hedge emoni following Gramsci strategies of changing the systems, he advocated of people going into institutions, such as schools, government offices, to change them from within, would that be possible in the tech industry? Would those change makers be immediately perceived as deviant and thrown away? And then ads? I’m sorry, a long question, but I’d love this talking about very enthusiastic. So
Elinor Carmi 1:10:51
no, it’s fantastic. Please, please ask, and it’s great. I really liked the questions here. It’s great that people engage with the book. So you’re basically asking if it’s the matrix, and if Neo needs to be part of the system or not. Um, I think that again, in order to change the way things are, I think that we need to think about it not as like the one solution, but as multiple solutions. And of course, different kind of solutions are going to be more relevant in different kind of regions. So what will be more beneficial in Europe is not going to be more beneficial in the US, or Russia or Asia, China, Israel, different kind of places. Um, I do think that a lot of people are trying to change, both from within and outside. I think that we need to have these kind of forces in all of the directions. I think that at the moment, the kind of instruments that we have, as I said, before, you know, everybody had so many hopes with the GDPR if you aren’t following and I really recommend it to follow my friends who’s an activist has been trying to change a lot of things. And he has changed a lot of things. What he showed is that even though we have the GDPR, the kind of the DPA, the the Data Protection Authority in different kind of countries in Europe, actually provide very little money to actually take care of all of the People’s sort of lawsuits against different kind of companies that are dealing with their data. So, um, I do think that people need to integrate, both in government, hopefully also in technology companies, but also that we need to educate people. So to me, it’s it’s multi layered both to change laws, you have data literacies that change the way that technology companies are operating, and also to change the way that we are talking about these things, whether it’s with, you know, more entertaining content, but also journalists. So, yeah, it’s a multi, multi year multi step strategy.
Scot Osterweil 1:13:08
To some degree, with that answer you may have answered the following question, but this is from Hamidreza Nasiri: "About what you said about The Social Dilemma: isn't that actually one of the main issues, that these are sources that give the illusion of informing people but systematically ignore the systemic problems, and that make sure the discourse remains shaped by the powerful institutions? The same thing that Professor Balls mentioned, regarding convincing people that it's for their own security, or recently convincing people that censorship by big tech is actually in the interest of the people. How can one fight the kind of informing that persuades people to act against their own interests while making them think that they're actually engaging in resistance?"
Elinor Carmi 1:13:51
Oh, intense questions. Um, again, I don't have all the solutions; I only think that the way to make people engaged is actually to inform them. I have a friend who's been an activist for many years, and he told me that the thing that motivates people to go to the streets is when things hit them the most. So in Israel at the moment, people have been demonstrating for the past few months against Benjamin Netanyahu and his regime. And it's not that they haven't been demonstrating before, but I think that with COVID and the pandemic, they realized what they could lose. When it touches your life, when you actually know what you can lose, then people are more engaged. I don't know if people saw it, but students here had a whole demonstration against the algorithm. The story was that students were supposed to graduate from high school here; they were supposed to have an exam, but because of COVID, of course, they couldn't have the exam. And then the government designed this weird kind of algorithm. It made a lot of mistakes, which meant that people who came from deprived areas received lower grades than they should have. And then students started to protest, with all these different kinds of signs, you know, like "Fuck the algorithm," going to the streets, doing a lot of petitions, and attacking the government. And the government actually caved and changed the results. So why didn't these students protest, for example, when it was Brexit? Which is a question I was asking quite a lot. And I think one of the things is that with the algorithm, you can actually see what's happening to you; it was very visible, right? You can actually see how it's going to harm you. With Brexit, we don't even know what the current agreements are. So I think that if we actually want to engage people, if we want to motivate people to make these different kinds of changes, we need to help them understand how it can actually harm their everyday life. That, I think, engages people into action.
Elinor Carmi 1:16:06
And we have different kinds of examples like these.
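[Editor's note: below is a toy sketch, in Python, of the dynamic described above. It is not the actual 2020 grading model; the capping rule and all numbers are invented, only to show how anchoring individual grades to a school's historical results pulls down strong students at historically lower-performing schools.]

```python
# A toy sketch (not the actual 2020 grading model) of why anchoring grades
# to a school's historical results harms strong students at schools with
# historically low results. All numbers are invented for illustration.

def moderate(teacher_grades, school_history_max):
    # Cap every student's grade at the best grade the school has
    # historically produced, regardless of individual performance.
    return [min(g, school_history_max) for g in teacher_grades]

# Two equally able cohorts, each with a top student predicted the highest
# grade (here: 6 on a 0-6 scale)...
affluent_school = moderate([6, 5, 4], school_history_max=6)
deprived_school = moderate([6, 5, 4], school_history_max=4)

print(affluent_school)  # [6, 5, 4] - top student keeps the top grade
print(deprived_school)  # [4, 4, 4] - top student is pulled down
```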
Scot Osterweil 1:16:10
You know, it’s funny, I follow that from a distance, and I kept thinking, it’s great that they’re protesting, but shouldn’t they be protesting the whole system to begin with the one that orders and sorts them? You know, in general, rather than just when it gets it wrong, so But
Elinor Carmi 1:16:26
No, totally. And also, not only when it comes to their grades, right? But again, if you don't really understand how something affects your life, I think it's harder to ask for these kinds of imaginative leaps from people. And also, you know, we show how these kinds of algorithms impact different communities, communities of color, things like that, but not all people encounter all of these things, and not everybody hears about them or really understands how they can actually influence you in the future. So, um, yeah. I think I saw another...
Scot Osterweil 1:17:04
Yes, Srushti had a question.
Srushti Kamat 1:17:07
Hi, Elinor, thank you for that. It was great.
Srushti Kamat 1:17:11
I actually have a question about the assumption that most people have agency over things like public demonstration, right? I'm talking about spaces and countries where protesting is not an option, where you cannot go out on the streets, and where the digital is the only way for you to actually act in resistance. It kind of goes back to the question in the Q&A: if the digital is the only space where you can exist, express your freedom, express yourself, then you don't really have much of a choice. So how do you change the narrative of data literacy away from cultures like the US, and maybe the UK as well, where you can go out and protest, to societies where you just can't do that?
Elinor Carmi 1:17:50
I think it’s a really good question. And I think that, yeah, we have a lot of assumptions in the way that in the kind of solutions that I was talking about, as well, and this is why I said that there is no one solution, different kind of regions will have to respond in their own way. I do think that, you know, there are a lot of demonstrations in other regions, which are not a sort of West based and they are doing it with different kind of means such as means in different kind of ways. I think it’s going to be difficult. And I think it’s it’s I don’t really know, the exact way, but I think that different kind of communities have always invented creative ways to objects, and to protest the way that things are happening.
And they will have to use the different kinds of tools that are available. I hope that answered it. But yeah, I think we definitely can't assume that people always have agency. And obviously, providing people with data literacy doesn't necessarily mean that all of them are going to go to the streets, because if you're poor, and if you don't have the time or money to go to the streets, then you're not going to do that. So of course, I'm not saying this is, you know, the one silver bullet that's going to free everybody, and we're going to burn the streets. But I think that it is a gradual evolution of these things. And just like, you know, feminist and anti-racist groups fought, it's an ongoing process, right? We need to keep on fighting, and it's not like everybody has the time or resources to do that. But I think that the more knowledge we have of what we can actually do, and of what kind of power we have when we come together, the more we can change it. But of course, not everybody will be able to participate in these demonstrations.
Scot Osterweil 1:20:09
Great. Um, unless you have another question, it looks like we're wrapping up right on time. So I want to thank you again, and thank you for waiting, for hanging in there and showing up when you finally could. It was a really, really fascinating talk, and I'm glad that we were able to have it. I want to thank everyone else who came, and I look forward to...
Elinor Carmi 1:20:35
Thank you for having me. And again, I just want to remind you that the book is open access, so feel free to check it out, and also the playlist. And I'm on Twitter, so if you feel like continuing the discussion, feel free to either DM me or email me. It was a real pleasure doing this event, because you had amazing questions that really made me think as well, which is quite rare. So thank you very much. Thank you.
Elinor Carmi 1:21:07
Thanks.