(Unedited) Podcast Transcript 369: Treating Social Media Like a City
This week we’re joined by Sahar Massachi of the Integrity Institute to talk about his piece in MIT Technology Review titled “How to Save Our Social Media by Treating It Like a City.” We chat about the similarities between managing social media’s bad actors and urban problems like black box highway modeling, speed management, and city building.
Below is an unedited transcript of the discussion.
Jeff Wood (43s):
Sahar Massachi, welcome to the Talking Headways podcast.
Sahar Massachi (1m 13s):
Thanks Jeff. I am so delighted to be here. Thanks for having me.
Jeff Wood (1m 16s):
Well, I’m glad you’re delighted to be here. Before we get started, can you tell us a little bit about yourself?
Sahar Massachi (1m 20s):
Yeah. So at the moment, I’m the co-founder and executive director of the Integrity Institute, which is a think tank on, sort of, social media integrity and how to protect people on social media. It’s powered by a community of tech workers who do this as their full-time job. But I came to this through, I don’t know, I think a pretty fun and winding path that involves, you know, a hackerspace tour of North America, and helping out a political campaign for a doomed candidate, and all kinds of stuff. That’s me in a nutshell.
Jeff Wood (1m 55s):
So is this your main job, or do you have another job and this is kind of what you do because you’re really passionate about the subject?
Sahar Massachi (2m 1s):
I’ve called it my full-time obsession since about January of last year. I don’t think I’ve been paid yet; I just got a fellowship that sent me a check in the mail a couple of days ago, which will cover my rent for the next three months. And it started off as a very grassroots volunteer thing. And it’s still a volunteer thing, though I think soon we’ll be able to get some funders or fellowships to make me stop mooching off my girlfriend or my life savings quite so much.
Jeff Wood (2m 27s):
Understandable. And then, you know, we’re going to talk about cities in a bit. What got you interested in cities? Or have you always been interested in the urban form, or how we get around, or anything related to kind of urbanism?
Sahar Massachi (2m 39s):
What a fun question. So there’s this particular feeling that I had growing up, which is a little hard to talk about or describe, and that feeling is sort of being plopped onto the world without a sense of rootedness. And the reason for that is that my parents were refugees from Iran, and they fled as children for their lives to the only country that would take them, which was Israel. And I was born in Israel in 1989, and we left in 1991 for a combination of reasons: war and violence, not least, as well as just, you know, U.S. economic opportunity and so on.
Sahar Massachi (3m 20s):
And I grew up in the suburb of Brighton in Rochester, New York. So because of all that, I didn’t really have family in Rochester, and I didn’t really have, like... even my family back in Israel, you know, we’re setting down roots, but we’re also, you know, in recent living memory, refugees. And so for a lot of different kinds of things, it really felt like being introduced to the world from scratch and not having sort of a family tradition to draw on. And when it comes to urbanism, all I knew growing up was the suburb of Brighton, which is a nice place. I like it a lot, but it definitely wasn’t a city. And when I went to college outside of Boston, actually, at Brandeis, and took the T for the first time, that was a big deal for me.
Sahar Massachi (4m 5s):
And just sort of walking around and like being able to experience this new kind of way of living when I was, I don’t know, 18 or 19 felt remarkable. And it was part of the kind of feeling that you get when you start off in college, where everything is new and different, and you can feel your brain being stretched and growing in different directions. And you could say that was my start.
Jeff Wood (4m 24s):
I see you have a map of Rochester behind you on the wall. It still holds a good place in your heart, I imagine.
Sahar Massachi (4m 29s):
Absolutely. So this is my apartment in Somerville, Mass. And sometimes, if we had changed timing a little bit, I would be coming at you from my childhood bedroom in Rochester, New York, or the suburbs of Rochester, New York, and you would have seen a different poster behind me, which is a mock-up of what the Rochester subway should have been or might one day be. And I’ve had that since I think I was 16 or 17. I like Rochester a lot, but also I’ve moved a lot and lived in Springfield, Missouri, and New Haven, Oakland, San Francisco, you know, Boston, New York. I think Springfield, Missouri was the strangest one for me. And it just feels... I don’t know, each city has its own vibe. It’s hard to talk about it without sounding self-consciously poetic, but it’s like a friend, you know? Like, I have casual friends and I have best friends when it comes to cities.
Jeff Wood (5m 18s):
We’ll come back to that, because I think that’s an interesting point, but I want to talk about your piece to start out, so people kind of know what we’re referencing. So you wrote a piece in MIT Technology Review that I was really interested to read, as someone who specifically thinks about cities but doesn’t really think about the nuts and bolts of social media, even though I think that it’s kind of a mess. Before getting into a discussion about the piece, “How to Save Our Social Media by Treating It Like a City,” could you lay out the basics of why social media is a specific target of your thinking? I know you’ve mentioned the Integrity Institute, but I’m curious why social media is such kind of a flashpoint for you.
Sahar Massachi (5m 51s):
Yeah. This sounds like a great way to talk about my life story. So, you know, I grew up in Rochester and I went to school at Brandeis, and I really thought that I had life figured out: I was going to work for MoveOn.org or something like that, which was the highest pinnacle of what a job could be. And I ran into this problem where I really wanted to do good for the world, drawing on my family’s history and just my own inclinations and, like, a Jewish upbringing in America. But also people kept wanting to pay me to work on computers, because I like computers. And you know, if you try and do good, you get paid $40,000 a year (I don’t know, right now, I mean, I get paid $0 a year), and if you work on computers directly, you get paid hundreds of thousands of dollars a year.
Sahar Massachi (6m 32s):
So I tried to found an app for, like, civic tech, and that didn’t work very well. And I tried a bunch of other stuff, trying to figure out how you do computers and social good. And eventually I ended up at Facebook. By the way, the other shiny star on my resume was Wikipedia, which was great, and everyone please donate to Wikipedia. That was my job: to try to get you those banners on the top of the page saying, you know, if everyone donated right now, we’d stop asking for money. It’s true. Once they raised their budget, they stopped asking; I’ve never seen a nonprofit do that before. Anyway. So I was at Facebook, and my friends said, this seems against your values. This is strange for you. Are you going to Facebook to change it from the inside?
Sahar Massachi (7m 15s):
And I said, no, no, I’m not. If someone tells you they’re joining a large institution to change it from the inside and they haven’t even started yet, well, I think they’re lying to themselves and they’re probably lying to you. I’m in it for the resume, you know, the money, the network, the skills, and also, you know, it was a good place to work, highly rated. So I started off on the growth team, and I was really happy. So, I dunno, I lived my life on Facebook. That’s where my friends are. I moved around a lot, so it’s hard to remember who lives where; Facebook helps a lot with that. I have a little Facebook group for “Sahar visits X city,” and whenever I visited the city, I’d post in the group. And that’s how I keep track of my friends. And it just felt right, you know? Like, it’s people and it’s computers at the same time.
Sahar Massachi (7m 57s):
And I like both. And then, the story goes, after about a year I heard there was a civic engagement team inside the company, and the civic engagement team cracked the problem that I and other people hadn’t been able to crack before, which is the civic tech problem: if you make an app or website for civic technology, the people who download the app or use the website are going to be the very people who don’t need to use it. And that’s just been the problem. Their answer was, let’s just shove civic tech features inside of an app that people were already using. And I thought that was great. I joined the team. To join the team, you had to swear the civic oath: to do what’s right for people, not what’s right for the company; and be fair, be selfless, be representative, be responsible, know your impact, do research to know that you’re actually doing the right thing instead of just hoping. It was great.
Sahar Massachi (8m 43s):
Long story short, the scandals happened. We turned into the integrity team. I helped build a lot of the first stuff on the civic integrity team. Eventually I moved to Boston and left the team in order to be with my romantic partner, and got on sort of the journey of discovery that led me here today.
Jeff Wood (8m 60s):
Dive a little deeper: why is social media a target of reform? I guess that’s maybe the best word to use. Why is social media a target of reform in your mind? You were on the integrity team and it does great things, apparently, but why is it that you leave and then you have to do more?
Sahar Massachi (9m 18s):
I mean, I think there’s a world in which I could be, like, a, I don’t know, net neutrality activist, or a full-time person who’s mad at the NSA and really focuses on that. You know, I am mad at the NSA, and there’s a lot of different, like, tech-in-society things I could have ended up doing. And there’s a life that I could have had in which I end up making, like, I don’t know, tools for activists or social movements or something like that. I think a lot of our lives just sort of happen by chance. I got a chance to work at Facebook and I took it, because I really liked the Facebook product and it made sense for my career. I found out that there was, you know, the best team inside of Facebook that I could think of, and I joined it because it was there. We moved from civic tech to, like, protecting people online, because that’s where the world was moving and what people needed.
Sahar Massachi (10m 3s):
And I became an expert, and I am still an expert. And I look around and see how people talk about protecting people online, and talk about content moderation and censorship, and I get kind of angry, because the things they’re talking about are just so much more primitive than the things that we were talking about internally. And, you know, it’s sort of anger on, like, a surface level of, why don’t you get it? But also, at a deeper level, of, this shouldn’t be the case. You know, we have this sophisticated knowledge, and we have these frameworks that we developed internally at one big company; I know there are other frameworks that people developed at other big companies, and the public should have access to that level of discourse. The reason they’re arguing in dumb ways, in some ways, is because they don’t have access to the better ways you could talk about it.
Sahar Massachi (10m 48s):
If you had people working full time, with access to data and experimentation, who are allowed to, you know, share their knowledge of the world. And so, to scratch that itch, I built an organization that shares that knowledge with the world. But rather than being the one guy who does it, right, and being the integrity guy, it felt important to me to, like, build an organization that brings in integrity professionals to all have a voice, to all be on the stage together.
Jeff Wood (11m 16s):
What is integrity design? Like what is the focus of it? What is it targeting? How does it work specifically?
Sahar Massachi (11m 22s):
We’re still in the early days, so I think this is one of those things where you’re not going to have a tidy definition. If you think of, like, software design, right, and how to write code: there are certain things that you do when you write code to make it more secure from hacking, and just more secure from errors. And there are just design patterns that people have come up with, laboriously, over the last three or four decades, of how you write secure code: languages, frameworks, whatever it is. I hope that we get to a place where, at a different level of abstraction, when you construct a social network, there are just design patterns that you can put into place. So for example, reshare buttons or retweet buttons or message forwarding are really dangerous, and you should be careful when using them: like, worry about spam, and figure out ways that people can’t spam, and make it harder for people who are trying to spam to be able to pull it off.
Sahar Massachi (12m 12s):
So I’d say integrity design is the emerging discipline, and theory of that discipline, for how you build social features or a social platform in ways that have integrity. That feels circular, but having integrity means having features that make it hard for people on the platform to harm each other.
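To make the reshare-friction pattern mentioned above concrete, here is a minimal sketch, not anything from the episode: a forward request is capped per send, and a message that has already been forwarded widely triggers a confirmation step. The function name and both thresholds are invented for illustration, not any real platform’s policy.

```python
# Illustrative "reshare friction" integrity pattern: cap per-send
# forwards and add a confirmation step for heavily forwarded
# messages. All thresholds are invented for illustration.

MAX_FORWARDS_PER_SEND = 5      # hypothetical cap on recipients per forward
VIRAL_FORWARD_THRESHOLD = 100  # hypothetical "widely forwarded" mark

def forward_decision(total_forwards: int, requested_recipients: int) -> dict:
    """Decide how to handle one forward request for a message that has
    already been forwarded `total_forwards` times."""
    if requested_recipients > MAX_FORWARDS_PER_SEND:
        # Hard cap: spam campaigns rely on fanning out to many chats at once.
        return {"allowed": False, "reason": "too_many_recipients"}
    if total_forwards >= VIRAL_FORWARD_THRESHOLD:
        # Soft friction: allow it, but slow the user down with a confirmation.
        return {"allowed": True, "require_confirmation": True}
    return {"allowed": True, "require_confirmation": False}
```

The point is not the specific numbers but that the friction lives in the platform’s design, rather than in after-the-fact review of each piece of content.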
Jeff Wood (12m 32s):
I’m thinking of, like, a lot of the negative things that we think about with social media, whether it’s countries who are using it as a weapon, or people who interact with each other and then cause pain to each other, whether that’s through words or through actions, stalking, et cetera. There are all these negative things that come along with social media, but there are positives as well, as you know from, you know, your groups that you’ve set up to see people when you’re in town. There are a lot of positives to it as well. So that’s why I was interested in that, because it just seems like there should be a better way, and people talk about it a certain way. They talk about moderating content; they talk about other ways to solve things. But in reading a lot of the pieces that you and your co-founder wrote, it seems like there’s a better way to talk about it.
Jeff Wood (13m 12s):
Or at least we should be more intelligent about how we think about this. It’s not just content moderation; there’s a deeper way to get into how it works. And you kind of go in and try to make that analogy that it’s kind of like a city, right? It’s kind of like a place where people can go and live their life and interact with other people, like a city. But it’s not really like a city either. It’s very online. People can replicate themselves in millions of ways. It’s like, and this is probably a bad analogy for you, but it’s like The Matrix, right? You can be wherever you want to be. You can be a sock puppet. You can do all these things. But in reality, a city is a place where you’re limited in how much you can do these things, to a certain extent.
Sahar Massachi (13m 53s):
Yeah. I think, putting it simply: like, you know, what’s integrity design? It’s just, like, building social networks with care, for the right reasons, and, like, the practice of asking how you do that and trying to make it happen. And I imagine that, like, you know, just as you’re not the expert on social media, I’m not the expert on cities. I imagine that there’s something similar that goes on in urban planning, or even... I don’t even know what the term is. I imagine urban planning is maybe more macro, and there’s a different, more micro version that maybe has a similar or different name. Is there an integrity design for cities? Like, you know, meatspace cities today?
Jeff Wood (14m 23s):
Interesting that you asked that question. I don’t know if there is. In the piece, there’s a quote, and I want to quote you on this: “The city needs to be designed correctly from the beginning. It needs neighborhoods that are built so that people, societies and democracies can thrive.” And I think that that’s a truism, but it’s interesting how cities evolved and how they came about: they weren’t really designed. They’re organic. As Shannon Mattern mentioned on the A City Is Not a Computer episode, they’re semi-lattices. They grow on top of each other. They’re organic. It’s not tree thinking, as she put it, which really hit me to a certain extent when I chatted with her. But I also was thinking back on, you know, how our favorite cities were grown and built and created. Basically somebody, you know, moved there because there was a natural resource of some sort, whether it’s a river, or there’s a good place to be, or a farm, or anything like that.
Jeff Wood (15m 7s):
And then, after a certain amount of people moved there and, you know, designed a small little village, somebody came in and put down a grid of streets, and then it grew from there. And even now, today, we have this situation where people put down streets, or the code allows you to build as many houses as possible here, there, and everywhere. But I don’t know if there’s an integrity design. I mean, we know about good design. We know about a good network of streets. We know that the grid is really important. But I don’t know if there’s the same kind of idea as integrity. I mean, maybe there is and I just don’t know about it, but I’m not sure. I hope that answers your question.
Sahar Massachi (15m 42s):
Well, I think you’re actually making a really important point. Like, maybe I had a little bit too much of a rhetorical flourish there. Do online cities need to be designed with integrity from the start? I mean, as an integrity professional, that’d be great. But, like, I don’t know; it’s an open question. Maybe the cool thing about digital platforms is that you can really change the bones of it more easily than you can change the bones of a city. To use a Facebook analogy, which is what I know well: like, I think that the share button is really bad. I think it leads to a lot of problems. Tomorrow, Facebook could get rid of the share button, in a way that, I imagine, I don’t know, Boston couldn’t undo the Big Dig. So that is a difference. But yeah, you’re completely right. Because if you try and bake in city planning from the start, you end up with Brasilia, right?
Sahar Massachi (16m 26s):
Like we have all these examples of failures.
Jeff Wood (16m 28s):
Well, it’s interesting you mentioned Brasilia, because there’s this affinity for tech moguls or countries to build cities from scratch. You have Neom in Saudi Arabia that folks there want to build; you have the new capital of Egypt; you have, you know, Songdo City in South Korea. This is not a Northern, Southern, Eastern, Western difference; there are a lot of people that want to build new cities. But in the end, they don’t end up being cities. They end up being these really sterile environments, Brasilia being one of the main culprits, because they’re not organic. They’re not grown from people moving there, because they have a process where they have an idea of what they want to do. And so it’s interesting that you mentioned, you know, building social media sites in that way too. Is it the energy? Is it the openness that allows people to be themselves, to grow these things from the beginning?
Jeff Wood (17m 11s):
And then in the end, you know, after they grow initially, the car comes in and then it messes everything up, right? That’s the best kind of analogy I could make, because the introduction of the automobile was not a benefit for cities. It was actually the beginning of a negative reaction, because of the way that people were able to travel. And it’s not that the car is bad as a thing in itself, but the way it was used was a poor example of growth over time, and the way we built our cities. And now what we have to do to, you know, address that in terms of climate change is another big issue. So it’s interesting to think about the crossovers between cities and social media. You know, my biggest thing, maybe, is that next, people are starting to think about what this online world will look like, the quote-unquote metaverse, and I’m not there yet, but I see where people want to go.
Jeff Wood (17m 56s):
And it’s interesting to think about how you build a city, how you construct a city, to be egalitarian, to be focused on people’s, you know, goals and dreams. And then, you know, how do you build a place online that can be similar, but not harmful like social media has been, to a certain extent? Yeah.
Sahar Massachi (18m 13s):
So I’m founding a thing that is a nonprofit startup, you could say, and, you know, I’ve founded book clubs and various other sort of small associations in my life. And one thing that you do when you start a new group, I think, is you don’t demand that people use Robert’s Rules of Order when it’s, like, you know, four of you splitting a pizza; that would be absurd. At the same time, if you’re, you know, a group of a hundred people all meeting at the same time, you need some sort of structure in order to have people be heard. Otherwise you’re just, like, a bunch of people yelling in a high school auditorium, and nothing productive happens. Or a Zoom meeting.
Jeff Wood (18m 46s):
Sahar Massachi (18m 48s):
Oh man, don’t get me started. SpatialChat is the best, and Gather Town, by the way. I don’t know if you’ve tried those.
Jeff Wood (18m 54s):
I’ve tried something similar, where you’re in a room, but then you can hear the people that are closest to you, kind of that type of thing. Yeah, I’ve tried one of those before.
Sahar Massachi (19m 2s):
So you could say that, like, integrity is sort of almost akin to organizational behavior, or, like, whatever magic an MBA is supposed to give you in terms of understanding how to structure organizations. Where, you know, you start off with the village of people by the bend in the river who are just, like, having a good time, and you say, this is great; like, you know, we’re not going to impose all this onerous stuff on them. If we did, we’d lose that organic vitality. But as more people come in, those bonds of trust become weaker, and you’d want to have some sort of method for them to solve problems, and you want to have some sort of scale. You know, first you have a little bit of content moderation, like on a forum, right?
Sahar Massachi (19m 42s):
That’s, like, the first thing you try. That lasts you for a while, but when it gets big enough, individual content moderation doesn’t work, and you have to find things that are more akin to, you know, the structural problems and, you know, how it all works. And maybe, to use the Facebook analogy again: you know, not everyone on Facebook can post on Mark Zuckerberg’s wall. You can’t post on, you know, anyone’s wall but your friends’. And that’s a design choice that was meant to add some sanity to the system. And there will always be those design choices. And then the question is, to what end are these design choices being made? Are they simply for order, for the sake of more engagement and growth, or are they also for the collective flourishing of the citizens? To round out this point: if you look at something like Clubhouse, they’re an example,
Sahar Massachi (20m 29s):
I think, of that little village having the spark of something fun that people really found, and that then grew organically. But, you know, because we’re on the internet, and because things happen at internet scale, as opposed to human scale, suddenly millions of people were using this thing that only had the processes and the tooling ready for, you know, a hundred or a thousand people. And then they ran into a big problem. And so I’d say that, because things can grow really quickly, baking in integrity concerns, or at least the bones of integrity concerns, in the beginning will really help.
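The wall-posting restriction described above can be read as a simple permission rule. A minimal sketch, with invented names and a deliberately simplified friendship model (this is not Facebook’s actual implementation):

```python
# Sketch of the "only friends can post on your wall" design choice.
# Friendships are modeled as unordered pairs (frozensets) of user IDs.

def can_post_on_wall(poster: str, wall_owner: str, friendships: set) -> bool:
    """Allow a wall post only from the owner themself or one of their friends."""
    if poster == wall_owner:
        return True
    return frozenset({poster, wall_owner}) in friendships

# Example friendship graph: alice and bob are friends.
friends = {frozenset({"alice", "bob"})}
```

The interesting part is the question the transcript raises, not the check itself: the same few lines of policy can exist for engagement reasons or for the flourishing of the people using the platform.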
Jeff Wood (21m 3s):
That’s interesting. I mean, that goes to another point that I felt like I wanted to discuss with you when I was reading the piece, and reading other pieces as well. There’s a really important 1938 paper by a guy named Louis Wirth, and we actually read it in full, with the permission of the University of Chicago, on our 200th episode. His article was called “Urbanism as a Way of Life,” and basically it discussed why people acted differently when they moved to cities, as urbanization was increasing. You know, the construct created by a city, according to Wirth, is that people can be anonymous and create more impersonal relationships. They can also be more specialized with their work and interests. It’s not like being in that small village anymore; it’s moving to a place that is larger, and you have to kind of deal with more people as they come at you.
Jeff Wood (21m 45s):
Imagine, you know, 1910s, 1920s New York, and how many people were walking in the streets before automobiles, during the time of railway travel and streetcars; you just had to deal with this kind of throng of humanity. What’s interesting about this is the connection to your piece, about how social media has kind of supercharged this characteristic of cities. But the important factor that you mentioned is the actual physicality of cities, and kind of building that physicality into social spaces, because of what you mentioned with that Clubhouse example, which is that stuff can grow so fast and so unwieldy that it is kind of out of control. And even in the city, like Louis Wirth said, I mean, it gets unwieldy. It gets a little bit overwhelming for people, and they decide to not have as many strong connections. But it’s interesting that you bring that part into it as well: if you have a physical limit, which doesn’t exist online necessarily, but you build that into the system, it makes it actually more palatable to be a member of that society.
Sahar Massachi (22m 40s):
Yeah, absolutely. I mean, like, the growth of a city is in substance bounded by the number of homes you can build in a period of time, right? Like, you’re not going to see a club of 15 artists turn into a metropolis of 2 million people in the span of two weeks. It’s just physically impossible to do it. And that gives people some human-scale time to sort of figure out the emerging problems and, you know, have some time to experiment with solutions as the city grows. And that’s a sort of growth; that’s a story about the growth of a small platform into a big one. But it’s also the same kind of thing with just how lies are spread, how hate speech is spread, just any sort of behavior, in human civilizational time.
Sahar Massachi (23m 19s):
We’ve sort of discovered a way for people to teleport and have bot armies in a digital city, and we just don’t know how to deal with it yet. And it’s going to take us, I don’t know, maybe decades to figure it out. We might have to have some new laws; we’ll have new norms of politeness, you could call it. Like, we need to have a whole generation of people understand media literacy and then grow to be the old people, right? Like, cohort replacement, so that the entire population has media literacy, if you want to call it that. It’s going to take a lot of time. And until then, I’d say that we want to work with people’s existing intuitions. Like, we have to give them something to hook into. We have to make the reality of how online works close enough to people’s intuitions that they’re not so easily bamboozled.
Sahar Massachi (24m 3s):
And even just as designers of these products... it doesn’t have to be about the, like, us, the virtuous, enlightened people, versus the, like, bamboozled masses. The people who make these products also have intuitions that are incorrect, you know. The power users, the so-called sophisticates, the people who take to city life: they also suffer from the same sorts of mismatch between what they expect life to be and how it actually is. And in that gap is where a lot of dangerous behavior can exist.
Jeff Wood (24m 32s):
Who are the power users? Like, what is a power user? I’ve read that a couple of times, and I’m curious what your definition of a quote-unquote power user is. Is it just somebody with a lot of followers, or is it somebody that uses technology to run bot armies and things like that?
Sahar Massachi (24m 45s):
No. It’s a fuzzy term; let me sub-categorize it, because I do mean a few different things. At its core, it’s someone who uses the app a lot, right? Could be someone who posts a lot; could be someone who really is sophisticated in their use of the ad targeting tools. I submit to you that if you looked at the top people who use the most sophisticated ad targeting tools on Google or Facebook or whatever, I think they’re overwhelmingly going to be spammers and scammers. Like, the people who really understand the system, better than even the people who are maintaining the system. Those people understand the system so well because they’re looking for loopholes and ways to gain a sort of advantage.
Sahar Massachi (25m 27s):
And you can take that a step further: the people who post a lot, who, like, own a fleet of 500 different pages, understand the dynamics of how it works, and what goes viral, and, like, what are the tricks and loopholes that they can go through, better than even the builders and the caretakers of the system. So it’s a little weird, because they’re the most active customers of whatever feature you’re talking about, or internal product, where the product is ads or groups or, I don’t know, subreddits. They’re either the people who are sort of explicitly trying to abuse the system, or they’re people who don’t mean to but just, empirically, are often the ones who create the bulk of the harm, or are more likely to create the harm.
Sahar Massachi (26m 12s):
And isn’t that weird? It’s just a strange dynamic. I mean, you could say, like, the power drivers of automobiles cause more accidents, but it’s sort of linear, right? It’s just because they use automobiles more. But this would be as if, like, I don’t know, the very people who use cars the most were also the ones who knew how to, like, attach, I don’t know, turrets to them and blast pedestrians or something.
Jeff Wood (26m 34s):
That kind of goes to the idea of, say, integrity design over content moderation, right? Because if you’re just monitoring the end content, you’re not addressing the systemic issue of these people who are taking advantage of the system and pushing out so much information all the time. That feels like kind of the main point.
Sahar Massachi (26m 51s):
Yeah. I mean, we had a large argument, or discussion, inside the Institute about what spam is and how you define it, and I don’t think we came to a conclusion, really. One idea that came around that I really liked was: spam is abusing a messaging system in ways that it was not meant to be used. So, like, spam in a phone call scenario is just, like, I don’t know, having 30 simultaneous phone lines going to the same house, and using them up the whole time to just call everyone in the phone book. Spam in an email scenario is just, like, emailing people who never consented to email from you, who you don’t have a relationship with; like, the system wasn’t meant to allow that. I’d say spam in a Facebook scenario is just doing things that you were not supposed to do,
Sahar Massachi (27m 35s):
even if you can do them: like, posting 500 times an hour, like, finding ways to push crap to people who are not your friends on Facebook and not even, like, your friends in real life. And that’s all a windup to say that a lot of these problems are just spam problems, right? It’s spam from the producer side, of posting a lot, and spam from the consumer side, of seeing things that you don’t really want, that you didn’t sign up for, that are there through someone, like, exploiting a loophole in the system. And there are definitely integrity harms that don’t come from spam, right? Like, child sex abuse material is the canonical example. But you’d be surprised how much of the problems on social media are variations on people pushing fake
Sahar Massachi (28m 21s):
Ray-Bans. Only instead of pushing fake Ray-Bans, they’re pushing hate speech, manipulation, you know, lies, propaganda, whatever it is.
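The “posting 500 times an hour” framing treats spam as, at bottom, a rate problem, and a common countermeasure is a rate limit. Here is a minimal sketch, not any platform’s actual mechanism: a sliding-window limiter over post timestamps, with the 50-posts-per-hour cap an invented threshold.

```python
import time
from collections import deque

class PostRateLimiter:
    """Sliding-window limiter: cap how many posts an account
    can make in any rolling one-hour window."""

    def __init__(self, max_posts_per_hour=50):
        self.max_posts = max_posts_per_hour
        self.timestamps = deque()  # accepted post times, oldest first

    def allow(self, now=None):
        """Record and allow a post, or refuse it if the cap is hit."""
        now = time.time() if now is None else now
        # Evict posts that fell out of the one-hour window.
        while self.timestamps and now - self.timestamps[0] >= 3600:
            self.timestamps.popleft()
        if len(self.timestamps) >= self.max_posts:
            return False
        self.timestamps.append(now)
        return True
```

A refused post is simply not recorded, so the window only tracks traffic that actually went through.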
Jeff Wood (28m 28s):
So you’re saying that when somebody signs up for my newsletter and then 10 days later, when they sign off, they say it’s spam, it’s not actually true? I feel like I get that from time to time. I’m like, I can see exactly when you signed up. As someone running a newsletter, that’s really funny. And the other interesting thing for me specifically, because I do run a newsletter and I do send out mail every day to people, is thinking about what is legal, what is illegal, what is a norm, what is frowned upon but you can really do if you wanted to, like scraping email addresses off the internet and things like that. You can grow your email list to hundreds of thousands of people, but whether it’s ethical to do so is another question, and it’s interesting to think about how you limit those types of moves going forward.
Jeff Wood (29m 16s):
Some people take advantage of the system and try to grow as fast as possible because they want to be a quote-unquote unicorn, but others take the lesser path, or the slower path, the righteous path I would say, to try to grow and make something real happen.
Sahar Massachi (29m 31s):
I can go a little off book here. First of all, you know, I’m here as myself. I happen to run the Integrity Institute, but we have a strong norm inside the Institute that everyone speaks for themselves, and the Institute doesn’t take many positions. I’m about to say something that’s my position, and also everything I say is my position, but especially this. It’s a problem of market competition, isn’t it? And if I may: we sort of solved the email spam problem, and we solved it in a really bad way that doesn’t quite work, but it works well enough. And the way society solved the email spam problem was basically through taxing email. Like, if I were to open up Gmail and email someone right now, I don’t have to pay a fraction of a penny. But if you wanted to send email to your newsletter through MailChimp or Mailgun, you’d have to pay for it.
Jeff Wood (30m 16s):
Sahar Massachi (30m 18s):
And that’s because basically all the email, sort of, inner nodes formed a cartel, and they said anything that doesn’t pay is just so likely to be spam that we’re going to mark it as spam. And the end user, the sender of the email, might not be a person who pays, but in a sense they’re being subsidized, right? Like if I use Gmail, Gmail is sort of paying that fraction of a cent per email for me, and in exchange they’re, you know, monitoring my use and making sure I am not a spammer. Email people also did other things; they did really terrible hacks. Like, your domain actually really matters. If you try and send email from a .xyz domain, it’d be very hard for it not to be marked as spam, or I think a .info as well, just because of the sort of sociology of the sorts of people who use the domain and then the empirics of how often they send spam from it.
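As an illustration of the “terrible hacks” in play, a toy sender-reputation score might combine a TLD penalty with whether the mail flows through a paying bulk provider. The weights and penalties here are invented for illustration; real filters use far richer signals than this.

```python
# Hypothetical TLD penalties reflecting the "sociology of the domain";
# the numbers are made up for illustration, not taken from any filter.
TLD_PENALTY = {"xyz": 0.4, "info": 0.3}

def spam_score(sender_domain, pays_bulk_provider):
    """Score a sender from 0.0 (clean) to 1.0 (spam) using two crude
    signals: the sending domain's TLD and whether the mail is paid for."""
    tld = sender_domain.rsplit(".", 1)[-1]
    score = TLD_PENALTY.get(tld, 0.0)
    if not pays_bulk_provider:
        # Unpaid bulk mail is presumed spammy, per the "cartel" logic.
        score += 0.5
    return min(score, 1.0)
```

A sender on a .xyz domain who isn’t paying anyone lands near the top of the scale; a paying sender on an ordinary domain scores zero.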
Sahar Massachi (31m 7s):
And one reason we’ve solved it is because there was an overwhelming, you know, societal imperative to solve email spam. One reason it was solved so badly is that email’s a protocol. So even if Google and Microsoft today decided they were going to add a new feature to email, there are still enough independent players around that they couldn’t necessarily force it through. It’s really hard to change a protocol. So, you know, this might be a tangent, but that’s why I’m really scared about these new social media ideas that are saying, well, decentralize it, blockchain, whatever. Because if you don’t bake the integrity design into a protocol from the beginning, it’s very hard to change it. And if they’re creating an entire social network based on that protocol, you better get it right
Sahar Massachi (31m 49s):
The first time
Jeff Wood (31m 50s):
It doesn’t have exact correlations because, like you said, the physical world is different than the digital world to a certain extent. But I think about it as traffic, right? I mean, congestion pricing and those things that we want to try. It seems like that’s been tried, but you still get the spam, so I’m not sure how those two correlate, but that popped into my head. There were two other driving analogies that you mentioned in the piece that I thought were really interesting ways to solve some of these issues. One of them is the DMV and driver education, those types of things. The other is speed bumps. I would say road diets, but speed bumps work as well. I wonder if you could explain that analogy and thinking about it as a way to promote integrity.
Sahar Massachi (32m 30s):
Jeff, I’m so excited for you to tell me where I’m off base here, or what the replacement for speed bumps is, because everyone hates speed bumps. It’s actually a pretty bad analogy because of that, whereas I’m proposing a thing that would not hit the majority of people. It’d be a sort of magical speed bump that would only pop up if you’ve driven more than a hundred miles a day, I don’t know. But to the idea: the basic idea here is we have this online city, and as you know, it does have a lot of characteristics of a city. And in this city, people have superpowers. I call it the city of atomic Supermen. They can disguise themselves perfectly. They can clone themselves. They can make robot armies. They can, I don’t know, teleport or fly.
Sahar Massachi (33m 10s):
So then the question is, how do we look at these superpowers that people have online and then curb them a little bit, so that people still have the superpowers, but they can’t abuse them as much, or they can only use them a certain amount of time per day or something like that? And I grasped toward metaphors in real life to help think it through, because again, I just think that there isn’t too much new under the sun, and analogies really help. So for the DMV: one superpower that people have online is the ability to create new accounts. It is very easy to make a new account on, let’s say, Twitter or Reddit. And this creates a problem, because if a platform arrests someone for breaking, I don’t know, TikTok law.
Sahar Massachi (33m 53s):
They can just pop out of jail by creating a new account and go on about their day. One way platforms try to deal with this is by detecting fake accounts, and some do a better job than others. But another way you can do it is just make it so that new accounts don’t have the same powers that a legacy or older account has, in the same way that children can’t drive cars. You have to go through some tests, be old enough, be sort of vouched for by a parent or guardian before you’re operating heavy machinery. The heavy machinery on social media could be something like a Facebook group or a retweet button. Maybe you need some amount of being accepted by society as an adult before you’re allowed to use these things that are powerful, that could be used for good, but are also dangerous. That’s the first idea.
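A minimal sketch of that graduated-license idea, with entirely hypothetical gates (the feature names, age thresholds, and vouch counts are invented): every account can post, but the heavier machinery unlocks only with account age and vouches from established users.

```python
from datetime import datetime

# Hypothetical gates: feature -> (minimum account age in days,
# vouches from established users required). Invented values.
FEATURE_GATES = {
    "post": (0, 0),
    "reshare": (7, 0),
    "create_group": (30, 2),
}

def can_use(feature, account_created, vouches, now=None):
    """True if the account is old enough and vouched-for enough
    to use the given feature."""
    now = now or datetime.utcnow()
    min_days, min_vouches = FEATURE_GATES[feature]
    age_days = (now - account_created).days
    return age_days >= min_days and vouches >= min_vouches
```

A day-old account can post but can’t reshare; a year-old account still needs two vouches before it can create a group.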
Sahar Massachi (34m 40s):
The other idea, speed bumps, comes from this abusability problem. And the concept here is twofold. The first is, again, that the harm is coming from people who are taking everyday actions and behaviors and just spamming them, you could say posting a lot. And the second comes from this sense that we don’t need to have perfect certainty that a particular person is doing something bad before we make it harder for them to do the thing that is probably bad. So you can imagine, if we’re 50% certain that a person is performing hate speech or harassing someone, maybe we make it 50% harder for them to do the next thing that looks like harassment today.
Sahar Massachi (35m 21s):
Maybe, I don’t know, their post takes a little while longer to go from the submit button to the success modal or something like that. Or you have an interstitial that says, are you really sure you want to say this thing? You know, you’re not stopping people; it’s just some annoyance in their way. And the analogy here is speed bumps, which, again, please help me find a better analogy.
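That graduated-friction idea can be sketched as a pair of helpers: one scales the submit delay by a classifier’s confidence that the action is abusive, the other decides when to show the “are you sure?” interstitial. All thresholds and defaults here are invented for illustration.

```python
def friction_delay(abuse_score, base_delay=0.5, max_extra=10.0):
    """Seconds between the submit button and the success modal,
    growing with the classifier's confidence (0.0-1.0) that this
    action is abusive. Nobody is blocked; likely abusers just wait."""
    if not 0.0 <= abuse_score <= 1.0:
        raise ValueError("abuse_score must be in [0, 1]")
    return base_delay + abuse_score * max_extra

def needs_interstitial(abuse_score, threshold=0.5):
    """Show an 'are you really sure?' prompt past a confidence bar."""
    return abuse_score >= threshold
```

At 50% confidence the delay is half of the maximum extra annoyance, mirroring the “50% certain, 50% harder” framing above.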
Jeff Wood (35m 42s):
Well, the one that just popped into my head was 20 is Plenty. There’s a speed limit, 20 miles per hour, because if you go above 20 and get toward 25, 30, 35, 40, there is an increasing curve of whether people will die if they’re hit by a vehicle. Right? If you’re going 20 miles an hour, it’s unlikely that somebody hit by a vehicle is going to pass away. And so a lot of cities have actually lowered their speed limits to 20 miles per hour because they are trying to limit the number of people who are killed or maimed by a motor vehicle. So that might be better than speed bumps as a specific tool, and maybe 20 is Plenty as a specific tool as well. But that ramp-up toward damage might actually be a good analogy for you as well.
Sahar Massachi (36m 25s):
Yeah, yeah, yeah. Perfect. Like, let’s say a city decides to implement 20 is Plenty, and because we’re in the magical world of code, they’re able to sort of make it happen through magic, right? They could change all the cars tomorrow so that it’s enforced. Or maybe they enforce it where you could go 21 or 23 miles an hour if you wanted to, but, I don’t know, it’d be harder. You can’t do it very often per day. Maybe, I don’t know, your car would turn an ugly color or something. Some kind of small thing where you’re not going to do it very often, but you can if you really need to. They’re not censoring drivers, they’re not singling out drivers, you know, for hitting 22 miles an hour and having your car turn vomit color.
Sahar Massachi (37m 10s):
And, like, I don’t know, a little beeping noise starts in your car until you go back down to 20. It’s not that you’re being punished for doing something wrong necessarily; it’s just a safety measure for everyone, even if on your particular stretch of road there are no children to hit. That’s the kind of thing that we’re getting at here. It’s design changes, looking at information flows and how, at the margin and also empirically, if we make this design change, better things will happen. Without getting into the details of, you know, this person did a bad thing and they must be punished. Like, ideally there’s just no need to look at people’s content at all.
Jeff Wood (37m 48s):
What kind of feedback have you gotten from this piece specifically? Have people reached out to you and been like, I don’t like the city analogy, or I love the city analogy? What has been kind of the pushback or the plaudits?
Sahar Massachi (37m 58s):
Yeah, you know, I was on Twitter and I was checking my messages-from-people-you-don’t-know tab, and there was this guy asking me to be on his podcast, and that really made my day.
Jeff Wood (38m 10s):
Other people besides me?
Sahar Massachi (38m 13s):
My editor at MIT Tech Review says that it was a really good piece and they’ve gotten a ton of great feedback, and he was really pleased with it. I don’t know if he tells that to all his talent, or all his freelancers, but that made me feel good. Professors have reached out to me and told me what they’re going to do with it in their courses, which is a real honor. I don’t think that anyone has really argued and said it was bad and dumb in a way that was neat enough to make me remember it. The most positive thing I can think of is just yesterday, we hired our first director of operations, and she told me that a friend of hers, who I don’t know, sent it to her. And this friend is an architect who’s all about city planning.
Sahar Massachi (38m 53s):
And he said, have you seen this? I think it’d be right up your alley. And some other nice things that I feel a little embarrassed to share, but they were very nice. And that was cool, like independent verification that the world actually seemed to like it.

Jeff Wood:
Do you think you’ll keep going with the urban analogies moving forward?

Sahar Massachi:
Well, you know, maybe we can just keep talking and you can give me some ideas, and then I can steal your ideas and turn them into another piece, I don’t know. Can I try one on you right now?

Jeff Wood:
Sure.

Sahar Massachi:
So this is a physical analogy, but it’s not an urban one. You can think of the Twitter feed, the Facebook News Feed, the Reddit algorithm, the TikTok algorithm, whatever it is, as sort of ranking what stories people see, making a decision about what people see on their feeds.
Sahar Massachi (39m 35s):
It’s a black box, and people sort of think about, how do we understand this immensely complicated black box? I’d argue that what it is, is more like a gravity well. This is a system that rewards bad behavior. The bad behavior it’s rewarding is inflammatory content, hate speech, misinformation, whatever it is; this is sort of a thing that you can test empirically. And the reason this is happening is that all these companies have teams of engineers that basically make tweaks to these ranking systems. And every time they make a tweak to the ranking system, they do a randomized control trial, and they look at metrics to see how version A changed versus version B, right? The experiment versus the control.
Sahar Massachi (40m 15s):
And if the tweaks make engagement and growth go up, which means more people posting, commenting, sharing, and also more people on the platform overall, they’ll ship that change. And you can imagine that over the years, we have this highly optimized machine for getting people to post more and getting people to stay on the app. And it turns out that things that cause outrage, et cetera, perform better than wholesome things. I don’t think this is very revelatory; people have been saying this for a while. So here’s the idea: the Stephen Hawking, Einstein, lead-weight-on-a-rubber-sheet picture. That’s what the newsfeed algorithm is.
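The ship-if-engagement-goes-up loop described here can be caricatured in a few lines. This toy trial (all names invented, uniform 50/50 assignment) ships whichever ranker scores higher on mean engagement, with no term for anything else; that single-metric objective is exactly how the gravity well gets dug.

```python
import random

def ab_test_and_ship(engagement_fn, ranker_a, ranker_b,
                     n_users=1000, seed=42):
    """Randomly assign users to one of two rankers, then 'ship' the
    arm with higher mean engagement. Nothing else is measured, so a
    ranker that stokes outrage wins if outrage drives engagement."""
    rng = random.Random(seed)
    scores = {"a": [], "b": []}
    for user in range(n_users):
        arm = "a" if rng.random() < 0.5 else "b"
        ranker = ranker_a if arm == "a" else ranker_b
        scores[arm].append(engagement_fn(user, ranker))
    mean = lambda xs: sum(xs) / len(xs)
    return "a" if mean(scores["a"]) >= mean(scores["b"]) else "b"

# Usage sketch: stand-in rankers where one provokes more engagement.
calm = lambda user: 1.0
outrage = lambda user: 3.0
shipped = ab_test_and_ship(lambda u, r: r(u), calm, outrage)
```

Run the loop over and over with fresh tweaks and the shipped ranker ratchets toward whatever maximizes the one metric being watched.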
Sahar Massachi (40m 55s):
It’s this gravity well, pulling people toward and setting up incentives for bad behavior. And all these different ways the platforms try and solve their problems are like putting little barricades in front of some section of this, you know, gravity well and saying, the marble can’t get past this barricade. And that might even be true, but there’s immense potential energy there, and, you know, people will find their way around that barricade to go down, because that’s where the gravity is. And so the real challenge for an integrity team is not building more and better barricades to completely surround it, to encircle it; it’s to invert it, and to have it be a hill rather than a gravity well.
Sahar Massachi (41m 37s):
We have a bunch of ideas of how to do it, but that’s sort of the abstraction.
Jeff Wood (41m 42s):
Yeah, I understand completely what you’re talking about. It’s harder to walk uphill than it is to have gravity pull you down. The thing that came to my mind, and I imagine it came to the minds of listeners, is four-step transportation models. We’ve incentivized driving for so long, and engineers are focused on building these models for regional planning. And what they do is they continuously incentivize driving instead of reducing it. In the time of climate change, reduced VMT, vehicle miles traveled, is a very important metric. And so we increasingly promote driving, promote driving, and then the model tells you, you need to build a freeway here, and the freeway expands. And so induced demand, induced demand, and it keeps going and going. And so then you have the United States, where a majority of people drive instead of walk, bike, and take transit, active transportation modes, those types of things.
Jeff Wood (42m 25s):
So it’s very similar in my mind to the ideas that you’re talking about, because we don’t want to kill driving altogether; we want to make it harder in places where you should be doing other things. You should be using active transportation, the zoning code should allow for more dense cities, so that we can save the planet, basically, from this madness. And we have this discussion on the show all the time, but even with electric cars, it’s not going to change that a lot of people are driving and creating sprawl and all these nature impacts. So that’s the connection that I made. There’s a four-step transportation model that most DOTs use, and it creates these mechanisms for forecasting demand.
Jeff Wood (43m 9s):
And then the forecast always says the driving and the need for infrastructure are going to go up, right, at a 40, 45 degree angle. And there are these hilarious graphs that people have made where you have the actual demand for driving on a freeway, and it’s kind of going down slowly over time, but then the model keeps on going up. There are these shunts out the side showing the model saying driving is going to increase this much, at a 45 degree angle, whereas the actual curve is kind of drifting downward. And so you have this perverse incentive from that mechanism to continue to build roads, to invest in infrastructure that’s new, instead of investing in bridges and roads that need repair, et cetera. And so I think that’s actually a pretty similar analogy to what you’re talking about.
Sahar Massachi (43m 52s):
Yeah. Can I ask you a follow-up question? What’s the role of metrics in all this, and, like, organizational incentives?
Jeff Wood (43m 58s):
Well, yeah. I mean, you know, for a long time, the DOTs were actually given money from the feds based on how much VMT they had. So there was actually an incentive to increase VMT, because they would get more money that way. And so you had more driving, you had more money for building more roads, and it was a perpetual cycle. So that type of metric was always going to increase driving over time. So,
Sahar Massachi (44m 21s):
Oh my gosh, this is really good. Can I take it a step further?
Jeff Wood (44m 24s):
Sure. I will say, though, that four-step modeling and transportation modeling overall are not my expertise. I know about them tangentially from being an urban planner, but I’m not the expert on them. So if you want a specific person to tell you the details, I’m probably not it, but I can tell you the basics, as it were.
Sahar Massachi (44m 42s):
So I feel like what you’re telling me is a story of bad metrics causing bad design, which causes undesirable behavior to happen, which gives you some short-term success. But in the end, you end up with an environment and defaults and so on that, you know, make people miserable, and maybe even, like, global warming comes to doom it all. But even aside from that, by following these short-term incentives you end up in a place where everyone’s unhappy.
Jeff Wood (45m 11s):
Well, think about freeways, right? Think about the 405. They expanded the 405 freeway, they spent a billion dollars, and now it’s more congested than ever. There was maybe a little bump of traffic improvement where people felt a little better about it, and then it got worse. The most perverse outcome from this is that it made Elon Musk mad, right? It made him think of these ideas of putting cars in tunnels, which is, you know, a horrible idea compared to a subway, but he continues to do it anyway because he wants to personally travel in a tunnel to escape the 405 traffic, which was created by the system of Los Angeles traffic that perpetuated it. So if anything, the four-step model has been perpetuating bad decisions for a long time.
Jeff Wood (45m 55s):
And it even empowers people who shouldn’t be making transportation decisions, but who have a lot of followers and bot armies and those types of things, to make them. So maybe that’s actually a connection to what you’re talking about.
Sahar Massachi (46m 8s):
I think the induced demand part is so important. It’s a little bit of, like, a marketplace thing. If you create a system where posting plagiarized content from, like, you know, a Patriot Eagle Forum page to, like, old ladies who don’t know how to use Facebook, in groups that, you know, they joined, works, the Macedonian teenagers are going to do it, and if it makes them money, teenagers are going to do it. And then, like, some Americans are going to do it too. All of a sudden you get into a place where, when the company belatedly realizes they created a monster, or some parts of the company do, there are people running these pages, running these groups, who say, well, you can’t get rid of it now, I base my livelihood on this, or, you’re censoring me, if you clean up your act. In the same way, I imagine we’d have, I don’t know, trucking companies, or people who built their homes in the suburbs, or the exurbs, say, well, you can’t get rid of this road.
Sahar Massachi (46m 59s):
I needed it to do my job. Like, path dependency is really real, and it’s difficult.
Jeff Wood (47m 3s):
Can I give you an example of that?

Sahar Massachi:
Yeah.

Jeff Wood:
Well, so there’s a road connector in Dallas, and there’s been a concerted effort to tear it down, because it actually separated downtown Dallas from Deep Ellum, which is kind of an older, historical district. It’s also the entertainment district. And urban designers and people in Dallas are working to think about how to redesign this area and tear down the freeway. But there are a lot of folks who say, I need that road to get to my job, I need that road to do this, that, and the other thing. The solution is elegant in that it’s a surface street that’s actually slower. It’s going to promote the reduction in speed, it’s going to promote more urban development that connects Deep Ellum with downtown Dallas. But there are those people pushing back on it, saying, you know, we need this road, or what are we going to do as an alternative, or you’re keeping low-income people away from their jobs, and this, that, and the other thing, when ultimately it’s actually a solution for the city to grow in a better way.
Jeff Wood (47m 58s):
And so you have that, like you said, path dependence that keeps freeways from being torn down. You have really strong inertia. In Houston, they’re actually trying to expand the freeway, but they’re going to have to take thousands of businesses and houses from people who have lived in the neighborhood forever and ever to expand this road. And, you know, we should have learned from this many, many years ago, because of what happened during the freeway expansion era, but we’re not. And so the DOT has actually kind of said, you need to prove to us that there’s an environmental benefit to this. But with another administration, it might not be so positive a result. So far, my imagination says that they’re going to eventually push it through, but there’s hope that it actually might get stopped.
Jeff Wood (48m 41s):
So there are those path dependencies, those inertias, those things that happen all the time in the transportation and urban planning world. And it’s quite analogous. I’m pretty impressed with how connected these two things actually are.
Sahar Massachi (48m 52s):
Jeff, are you telling me this isn’t the Robert Moses club podcast?
Jeff Wood (48m 56s):
No, no, no. And I’m not even going to say, oh, well, he did some good things. There are people out there that say that, but no, I’m not a fan of Robert Moses, for sure.
Sahar Massachi (49m 8s):
I’ve made a huge mistake.
Jeff Wood (49m 10s):
I’m sure most of the listeners are not either.
Sahar Massachi (49m 12s):
I remember my final analogy here. You have path dependence; you have this problem where you create this infrastructure, this mode of being, that doesn’t work, and then people are locked into it. You kind of see the same things on social platforms, we think. Like, it’s not in the long-term interest of social platforms to have a culture of bullying and harassment and just sort of toxic behavior, but it looks good on their short-term metrics, so they’ll do it. But long-term, you know, they have a problem where the only people left on the apps are the bullies and the trolls, and they don’t want that. I don’t know, I feel like there’s a climate change sort of metaphor there or something, where, you know, you built your thing irresponsibly for decades, or for years and years, and now the reckoning’s due, and your user base collapses, or you turn around and your entire network is just, you know, shitty copy-pasted memes.
Sahar Massachi (50m 3s):
And that’s it, because all the quality has fled. Maybe that’s the analogy too: all of a sudden you have exurbs that you don’t want.
Jeff Wood (50m 11s):
Yeah. I think there’s something there. It’s probably not as clear-cut as some of the other ones, but I think it works. You know, you have a lot of places that you can’t serve long-term with their property taxes. I mean, somebody who’s 70 in the exurbs might have low property taxes now, and they might’ve moved out there because the housing was cheaper, but eventually the bill comes due, and you’re going to have to pay for all those fixes, whether it’s pipes, whether it’s roads, whether it’s water infrastructure. Those types of things all cost money, and cities can provide them better and cheaper because of the way that people are clustered together. And so that’s something that we haven’t really reconciled in the United States: the cost of sprawl. People talk about it, people know about it, people understand the math problem that is inherent in it, but it’s not something that we’ve actually dealt with realistically.
Jeff Wood (50m 54s):
And then the bill will come due eventually. You’re going to have a situation where you have to, you know, like the infrastructure bill getting rid of all the lead pipes in places around the country. They’re still not gone, because it is hard, and they want to spend money on the shiny new stuff, not, you know, the old faithful stuff that they need to maintain, you know, a positive infrastructure. So that’s the tough part there.
Sahar Massachi (51m 14s):
Kind of like wanting to set up a whole new virtual reality megalopolis without actually preventing, you know, Pakistani intelligence networks from taking over Black Lives Matter and Christian communities around the U.S.
Jeff Wood (51m 27s):
Something like that. Yes.
Sahar Massachi (51m 30s):
It’s very strange to just see, like, intelligence agencies doing this stuff. But, as a tangent, one thing you said really struck me. Can I rant about path dependency?

Jeff Wood:
Sure.

Sahar Massachi:
Okay. How do you think about corruption? Like, how do you think about sort of ethics and choices? I don’t know if you cover that a lot, or is there, like, a hip Streetsblog way to think about it?
Jeff Wood (51m 49s):
Streetsblog covers that extensively; I do not necessarily. It’s relevant, but it doesn’t come up in a way that I’m going to talk about specifically. Corruption and those types of things exist. We actually had a guest on a number of weeks ago talking about asphalt and the history of asphalt, which is really fascinating. He talked about how, basically, a lot of the asphalt and road companies were basically cabals early on, and maybe they still are, but early on. And, you know, a lot of the political machine from cities, and the mob and all that stuff, would make money off these asphalt companies, kind of a money laundering situation. And so it’s interesting how a lot of the infrastructure around the country was constructed by these cartels, per se.
Jeff Wood (52m 34s):
It’s also interesting to think about how asphalt transferred from a natural substance that people mined out of places in, say, the north of South America, to, you know, basically the best material being from oil refineries; the sludge that was left over from the refining was actually used to pave roads. In a way, from his discussion, I was really interested to think about it as actually capturing carbon, because you’re not burning it; you’re actually just laying it down. But those cycles are really fascinating to talk about. And the corruption of it all was really interesting too, just those machine politics, the mob, those types of connections to the asphalt industry and the road-building industry. That’s so
Sahar Massachi (53m 12s):
Interesting. I’ll have to look that episode up. If I may: so we’re talking about integrity, which I talked about as a form of structural integrity, but integrity also implies ethics. And there are, like, three different kinds of harms to integrity that come from ethical challenges or ethical failures that I want to talk about. One of them is related to this path dependency problem that we were just discussing. The first is this sort of path dependency, right? Like, you build a feature, people use it. If you want to make any change to your platform, there will be winners and losers. And if the losers, you know, understand that this is happening, they’re going to complain. And if you let their complaints sway you, then all of a sudden you’ve backed yourself into a corner of never being able to make significant changes.
Sahar Massachi (53m 58s):
And that seems to be similar to the dynamic that you’re talking about with, you know, tearing down or putting up roads, and I think there’s a parallel there. But there are two other kinds of ethical failures that are related to integrity work that I think are just important to call out, because they are indefensible, and it’s important that we talk about them. The first is: if you are a large advertiser for a social media platform, you’re going to have a salesperson who’s sort of detailed to you. And this salesperson, their ultimate loyalty is to the company, but they also have loyalty to you, because, you know, you’re the money that’s coming into the company. If you post a thing that goes against terms of service and it gets, you know, justifiably content moderated, you can complain to your salesperson.
Sahar Massachi (54m 43s):
And that salesperson has a whole channel, at least at some companies, to sort of be an internal lawyer on your behalf and advocate that that decision be overturned, that, you know, the laws of the social platform shouldn’t apply to you because you’re such a nice advertiser. And this kind of makes sense from the perspective of a business, but it’s indefensible from the perspective of, you know, a city, or a citizenry, or just an information ecosystem. It happens a lot, and it’s indefensible. The thing that’s even more indefensible, I think, is: you can imagine a setup where the people who decide what stays up and what goes down and make the tough decisions are also the people who decide what integrity changes should be made.
Sahar Massachi (55m 25s):
That’s already, like, a red flag, a really bad idea. But if those are the same people who are lobbyists for the company, or who care about, you know, what they call PR risk, and their job is to deal with people being mad at the company, you set up a dynamic where a politician or a demagogic media figure, or sometimes people who are both, will say, this social media platform is censoring me, if you ever try to enforce the terms of service on them, or even just roll out an integrity change that covers everyone. Right? Like, imagine you rolled out an integrity change that cracked down on fake accounts, or manipulation via fake accounts, and some people benefit a lot more from fake accounts than others.
Sahar Massachi (56m 6s):
And so for them, they’ll say, oh, my reach went down by 20% this week, they’re after me. You can imagine these demagogic media figures telling the company, I will denounce you on my cable news show and sic my armies of followers against you unless you back down. Or even: I will kick this company out of my country unless you allow my political party to have the sort of subsidy of being able to post inflammatory content, things that are hate speech and violence under the terms of service. And companies back down, at least some of them do. And it’s really, really bad. And remember that a small number of people do a majority of the damage. It’s a huge power law. So even though these exceptions are being made for a comparatively small number of people, that does a huge amount of damage to the overall system, putting aside the knock-on effects. I don’t know what the city metaphor is for that, but I think it’s important to just point out this isn’t just a technical problem.
Sahar Massachi (57m 1s):
It’s also one of ethics.
Jeff Wood (57m 4s):
The previous one to this one, I think, kind of sounds like, you know, building code corruption or something along those lines. I mean, people were getting paid to help you kind of fix the system, right? So if you want to, you know, build something that’s against code and you slip somebody some money under the table in some form or fashion, you know, then it goes on. There’s a whole thing here in San Francisco about that guy who was getting money and actually redoing the checks to have, you know, his name instead of the Department of Building Inspection, DBI. What was his name? Allegedly Rodrigo Santos, I think it was his name. People would write checks to DBI, the Department of Building Inspection, and then he’d alter them to read something like “RODBI,” so that, you know, when you wrote the check... It was very, very shady.
Jeff Wood (57m 49s):
He got caught, there’s a big thing going on about it, but it sounds very similar in that respect.
Sahar Massachi (57m 54s):
Yeah, with, I think, the big exception that the money is not under the table, right? It’s completely out in the open: we’re just a paid advertiser and we demand better service. Which, I don’t know, is that worse or is that better?
Jeff Wood (58m 5s):
I mean, political contributions for candidates, and whether, you know, you support this thing or that thing. I mean, there’s a name, Joe Manchin, that comes to mind right now. So those types of things, I can imagine, some people are completely open about, and they seem to not have any type of pushback on them. So that’s an interesting connection as well. Well, so what’s next for you, and how can folks find you online if they want to check out your work?
Sahar Massachi (58m 28s):
So I personally, you can find me on Twitter: S-A-Y-H-A-R. Or on my own personal website, hosted on my own server on WordPress: sahar.io. I have a little blog, I make mixtapes for my girlfriend, it’s all very wholesome. I am also, you know, in some sense repping the Integrity Institute, which I co-founded. The Integrity Institute can be [email protected]. We also have a Twitter with a stupid name, integrity underscore inst, because Twitter has character limits. And in terms of what’s next for us: we’re two things, right? We’re a think tank that produces think-tanky materials, and we’re also a community of people who do this, or have done this, as a full-time job, typically working for companies, for platforms.
Sahar Massachi (59m 13s):
So one thing that we’re doing is adding new members, cultivating the community, coming up with new committees and working groups. And that is intense, it’s hard. But the other thing we do is come out with these materials, and these working groups help us come out with them. So right now on our website, you’ll see some materials around transparency for social platforms: the different ways that you could have it, why you need all of them together, very specific metrics you could look for, and frameworks to think about what’s going on. You need those frameworks to understand what’s going on before you ask for transparency about it. The thing we’re working on next is more in the integrity design space: what are the best practices for companies, both in building and in measuring that they’re building the right thing?
Sahar Massachi (59m 55s):
We’re going to have a series we’re calling Integrity 101 to 401. So 101 will be sort of the basics for smaller companies with maybe, I don’t know, one to three integrity workers. And then the 401 will be your TikToks, Google’s YouTube, or Facebooks, who have the resources and maturity to actually do the best practices and the emerging best practices. And then we’ll see. I imagine we’re going to do a lot of stuff around the elections around the world this year, monitoring them and creating data sets in different ways. You know, listen, I’m very excited that we hired our first temporary staffer yesterday. And, you know, we’re still at the level where when someone donates $5,000, we throw a party. So we’ll see.

Jeff Wood (1h 0m 35s):
Awesome. Well, Sahar, thanks for joining us. We really appreciate your time.

Sahar Massachi (1h 0m 37s):
Yeah, thank you for having me. This was really fun. It’s an honor to be here.
Jeff Wood (1h 0m 47s):
And thanks for joining us. The Talking Headways podcast is a project of the Overhead Wire and posted first at Streetsblog USA. Thanks to our wonderful Patreon supporters for sponsoring this show and Mondays at the Overhead Wire. You can support the show by going to patreon.com/theoverheadwire. You can sign up for our 15-year-old newsletter at theoverheadwire.com. And you can listen to the show on your podcatcher of choice, including Spotify, Stitcher, SoundCloud, iHeartRadio, and Apple Podcasts. And if you can’t find it there, you can always find it at its original home, usa.streetsblog.org. We’ll see you next time at Talking Headways.