Artsy Engineering Radio

Ethical Software Engineering

January 19, 2022 · Artsy Engineering · Season 2, Episode 2

On this episode, Erik and Kaja talk about ethical questions in software engineering and how the tech world has a responsibility to build good and ethical software. They cover some examples where things definitely crossed the ethical line and try to figure out how to be an ethical software engineer. Talks mentioned in this episode are "The Reasonable Developer" by Joe Corcoran at Isle of Ruby 2018, and "So You Can Sleep at Night: Ethics in IT" by Jonathan Rothwell & Steve Freeman at GOTO Berlin 2017 (https://www.youtube.com/watch?v=xI5qEJ39KMc).

Kaja:

On this episode of Artsy Engineering Radio, Erik and I will talk about ethics in software engineering, and how the tech world has a responsibility to build good and ethical software. We will cover some examples where things went wrong, so stick with us for some good failure stories to take home some learnings. Welcome to our podcast. Today we want to talk about ethics in software engineering, but first, let's introduce ourselves. I am Kaja, recording from Berlin today, and I'm a software engineer at Artsy. How about you?

Erik Stockmeier:

Hi Kaja! I'm Erik, I'm also a software engineer at Artsy. And I'm actually recording from my parents' house in Chicago; I'm visiting for the holidays.

Kaja:

Yeah, it's almost Christmas. And yeah, I know I chose a huge topic to talk about. I'm sure we cannot cover everything today, but let's start with it and see where we end up. Let's talk about when we first encountered this topic: when did we first start to think about ethical versus unethical software engineering? And why do you want to start talking about this?

Erik Stockmeier:

Yeah, to give a little background on me and my journey into software: I'm in my late 30s now. I spent most of my 20s, after graduating college, moving back and forth between different jobs, broadly doing a lot of activism. I was 18, or almost, when 9/11 happened in the United States, and that had a very profound impact on me. And then, immediately after that, in college, there was the failure to stop the Iraq war from happening. So I was politicized at a relatively young age, and that kind of defined my 20s: looking for ways that I could be active in the heart of an empire, looking for ways that I could change that. Eventually I came to an appreciation for popular education, and I was briefly a public school teacher, not in a technical subject, but history. Basically, the labor situation for that in the United States was very bad in the early 2010s, so I wound up leaving that profession. I zeroed in on programming because, for me, it was a place rich with technical problems that were fun to solve, but it also seemed like a very valuable skill that I could, you know, apply in other areas. Generally, making money in the kinds of things I wanted to be doing, the things I thought would be good for the world, just wasn't an option. So I thought, I'll find a skill where I can make money, and that would also be something I could apply ethically in the larger world. Yeah.

Kaja:

Cool, yeah, I'm already seeing so many things that we have in common here. I mean, when I studied, I also had to do a teaching job on the side, under really bad working conditions. But back then I wasn't aware of conditions, or that there are good working conditions; I thought work always sucks. For me, well, I graduated in philosophy, so I come from an academic culture of critically thinking about everything and deconstructing dogmas, and it is kind of my nature to do that throughout everything I do in my life. As for ethical versus unethical software engineering specifically, I stumbled over it when I was asked to speak at a conference of the German army. And that was a shock for me. I didn't know back then that they would have anything to do with software engineering. When you have nothing to do with war or the army, you still have these ancient pictures of war in your head, but actually, nowadays, war includes a lot of technology and software as well. So the German army is also recruiting a lot of engineers, and that's why they're doing these kinds of events and searching for speakers. Apart from declining to be a speaker on a panel, around that time I was also starting to do my first open source stuff, and I added a paragraph to the license of my first piece of open source software saying that the code is only to be used for non-harmful services, you know, not allowed to be used by the military. I mean, it was a bit naive, just writing this little paragraph into the license. But I thought of a story I had heard about an engineer who had open source code out there and was then approached by someone in the military asking if they could use his code. It was a weird story for me, so I thought I'd write that into the license. And then the other thing that happened was that I saw two conference talks that really impressed me. One was at Isle of Ruby in Exeter, in the UK, a talk by Joe Corcoran, I think, about the history of unionizing in engineering. In this talk he showed all these examples of engineers refusing to build things because they were, in their opinion, unethical, often weapons. And in order to be able to refuse, they had to unionize, because their jobs were in danger. Then I saw another talk about ethics in software engineering at another conference in Berlin, a talk with some horrible examples of how software can actually be used in a very, very wrong and harmful way. That was also an eye-opening talk. But yeah, the next thing I would like to talk about is: where would we refuse to build something? What is unethical, and where would we draw that line? How would we judge that? Do you have any idea how you would draw the line there?

Erik Stockmeier:

Yeah, I think... so this is a really tricky question. And one where, especially as engineers, we have patterns for thinking about it, but we aren't used to applying them to our interface with the world outside the computer, I don't know how else to put it exactly. We have tensions between how we see the world and how we talk about it, especially in startup culture and the current tech world, and how it actually functions. The thing I kind of zero in on first is something people would say when I worked in public education: "the schools are failing, the schools are failing." This was always applied to, you know, under-resourced urban public schools. Among people in the teaching profession, we would say the schools are not failing; the schools are working the way they were designed. The schools are designed to under-serve these people, basically lower-class populations, or populations with more people of color, etc. Those schools are "failing" because they're designed to fail. As engineers, we're used to saying the computer works; it's your code that has the bug. And we're used to thinking in our tests: what is the input, what is the output, what are the side effects? We found a bug; oh, there's a side effect in this code that we didn't account for, and we can write a new test for it. But in general, the code we write interfaces with the real world in ways where it's much harder to control for those inputs and outputs. That's what I keep coming back to. To give an example, maybe similar to what you were talking about with working for the armed forces: it's easy to think about ethics in terms of whether you should pull a trigger. It's harder to think about it in terms of automating something that is going to remove the human decision-making process from that. We see that again and again with tech-driven intelligence operations and drone warfare. That's a very extreme example, of course, but in many cases we're dealing with software systems that were built by people, possibly in order to do things more efficiently, or even to do things more ethically, and yet, the way the system works out, it's hard to say "I'm building something ethical" or "unethical," especially when we have a bias in our industry towards believing in technology and its capacity to change the world or make things better. So that's the place where I get hung up on it. But it's a question I'm always asking myself, and I'm always looking for ways to talk about it more with my coworkers and other engineers.

Kaja:

Yeah. Fortunately, I have not yet come across this kind of really bad software in my daily work, where I would have to say: this is unethical, I cannot build something like that. I think it is clear, at least from a German perspective, that stuff like user tracking has to be done consciously, in a way that the privacy of individuals is always secured and not exploited. And then the other thing is, of course, I would never build something for weapons or the army. But I think sometimes it's also not so easy to know up front whether your software is going to be abused in a bad way. There was an example from that talk about software engineering and ethics: an app that was serving as a gay dating platform. It was launched, and then they found out that in repressive states this software was abused by the police to track down gay people and actually put them in prison. That example shows that the software engineers who built it definitely didn't think of this way of using the software. But also, they didn't pay enough attention to protecting user privacy, so that users wouldn't be trackable that way. So there's always this question: I'm building software, and is that software always going to be used in the way that I imagined it? And also, is there a way to actually use it in a harmful way? I think that's something we have to think about when we design our software.

Erik Stockmeier:

Yeah, I had a similar experience, actually, at the start of the pandemic in New York City, where I was living. Things were very chaotic, and people were feeling very isolated. So all around the city different groups were springing up, and some Artsy engineers were involved in several of the ones I know of, basically mutual aid groups that were trying to get groceries to people who didn't feel safe going outside, collecting donations, etc. I immediately wanted to jump in and build, you know, a quick Rails app that would just allow people to ask for something and someone else to kind of take a ticket and pick it up. I jumped into it without thinking, and luckily someone saved me the work and reached out and said: realize you're building a database of people who have publicly listed that they are vulnerable right now. I was reminded of that by the dating app.

Kaja:

Yeah, right. This is why people need to be protected from being so vulnerable out there, with other people able to abuse that information. But I mean, right now the most obvious example in today's software is probably what the company formerly known as Facebook is doing, I guess. Mark Zuckerberg was everywhere in the media with the problems of creating these kinds of dynamics online, through a platform where people were pushed into their own political extremes. Can you think of other examples from today's software?

Erik Stockmeier:

The big one that jumps out for me, certainly, has been the impact of first ride sharing, and then, broadly, the rise of the gig economy, which people sometimes also refer to as the democratization of work, or with lots of other flowery language. I like the example of ride sharing because it's become so ubiquitous; at least in New York City, it's hard to use taxis without an app anymore. And there was a major political crisis that came out of it, certainly not unique to New York. The taxi licensing system, the medallions that people had to buy, basically as a permit to operate a taxi, was controlled as a monopoly to keep the taxi market somewhat from growing out of control. The cost of those medallions had risen to, I believe, almost a million dollars. When Uber came in and kind of circumvented the system, driving a taxi was no longer the lucrative income stream it once was, and the value of those medallions dropped so much that many families who had bought theirs using loans were suddenly hundreds of thousands of dollars in debt. This led to a string of high-profile suicides, a long protest by the cabbies in front of City Hall in New York City, and a hunger strike that finally led to a renegotiation of those loan payments. You know, that's an extreme example. And yet, in virtually any place you look at the impact, there are both the benefits of ride sharing apps and the effects on how people make a living, you know, being treated as employees versus contractors, which in many municipalities and countries has become something that goes up to the Supreme Court level. This is a case where we have something that is beloved by users, but also has a massive impact on the way probably millions of people make money and pay their bills, and has led to a huge transition from stable employment to very precarious, gig-based employment. I think it's something more engineers should be conscious of, and scared of, because you can build something that has a beloved output, and yet side effects that could be hurting or changing the way our whole economy works. One other thing, maybe we'll touch on it more, but you made me think of it earlier when you mentioned the more European attitudes toward privacy: Europe was the main precipitator of the GDPR laws. Something that just nags me often is that the vast majority of the internet content economy has not really figured out how to make money without relying on advertisements yet. Ad tech is a huge industry, but in spite of all the money flowing into tech through venture capital, and all of the content, with so many websites (Facebook, Twitter, etc.) that are too big a part of our society to fail, they have not figured out how to make these things profitable without relying on massive amounts of user data collection, appealing to advertisers, etc. This isn't on any single engineer, and maybe it's not a question of what ethical choice we have in a single thing to build, but the internet does not have an answer for that yet. We have not found a way to democratize access to information in a way that is, frankly, profitable or sustainable, in my opinion.

Kaja:

Yeah, good point. And this example about the taxi drivers is also interesting. Because if I try to imagine myself as an engineer who built this app for a commute service, at what point would I have become suspicious that this might turn into something bad or harmful for anyone? Which is in itself not an easy thing to anticipate. But then, if I had had the suspicion, how would I have voiced it at work? How powerful are we, in our position, to drive the industry into thinking of these things, or into taking into consideration how many people might be harmed by what we're building? And this is the next question after the awareness. I don't know how you feel, but I feel like we can always have some kind of impact on the design of our software and the safety of it, and, you know, also maybe take care of the user experience and things like that. But in the end, if you work for a company and you're not working for yourself, you kind of have to go along with the company's decisions, right?

Erik Stockmeier:

Yeah, that is a really tricky question to talk about here. But I think it's especially hard to answer because every engineer who's working at a company has to negotiate their own way through the culture of that company, its hierarchy and structure, and then, you know, day-to-day business needs, remembering that at the end of the day you're working a job for money, and because of that you have your own economic incentives, or, you know, worries about what happens with that job. There's a free market side that says: if we as engineers are talking about these things, if we make it unfashionable, or however you want to put it, to build these kinds of systems, that's certainly one avenue. And I remember, in the early Trump administration, there was a big petition, I think it was some kind of thing you could sign on to, called something like "Tech Won't Build It." It had to do with saying, I'm not going to work for the security state under the Trump administration, something like that. I know I signed it, and I won't build it, but I can't remember exactly what it is. So that's one thing. And one thing I think you've been great about at Artsy, Kaja, is just speaking up within the tech team for your own concerns; you've been willing to point things out when you see an issue with our culture, and I think everyone on the team has really appreciated that. But it's still an open question. I don't think we have a single answer that has been replicated widely across the industry for how we make these decisions together. Because, in my opinion, it can't be just you, or just individual engineers, making decisions about what is ethical; we need ways to have these conversations openly. And actually, I wanted to respond to one other thing you said. It's taking us back a step, but on the idea of ride sharing apps and the appeal of working on them: that's just such a perfect example, to me, of the traveling salesman problem. In my coding boot camp it was described to me as a famously complicated problem, and when I first started thinking about ride sharing apps as a new programmer, I thought, oh, they must be solving the traveling salesman problem. There's something so appealing to many of us about solving these difficult problems that we can lose sight of the impact. And I wanted to do a reading for this podcast of the scene in Jurassic Park where the mathematician, who's the only skeptic in the Jurassic Park startup, says: you've been so focused on whether you can do it that you didn't think about whether you should. I should have prepared that. But yeah, I'm more into things that can be automated, so formatting a document to read as a script was too much.

Kaja:

I think you already put it into its essence there, and, you know, we all know what you mean. That's a cool example. And I love Jurassic Park; I was a Jurassic Park fan all through my childhood.

Erik Stockmeier:

I'm sorry for changing the subject, because you were asking how we as engineers can make sure to keep ethics in mind when we're building software. So sorry for the diversion.

Kaja:

No, thanks for the diversion. And yeah, you said it's this kind of question between how much the individual should do and how much we as a community should do, which, yeah, is a great point. I think, of course, as an engineer you are in a more privileged situation in most companies than in other jobs. Like, when I was working for a delivery company, I was not the one putting the packages into the truck and driving the truck around; I was the one who made the software that gave the truck drivers the best route. And so the company gave me more privilege, and I was in kind of a good, secure situation there. You know, I could voice my opinion without being afraid of getting fired for something, while I think most of the truck drivers probably didn't feel so safe in their jobs. So I always think of myself as an individual who has a little bit more privilege, and who should also consider the people who have less privilege in my company and speak up for them. But on the other side, one thing that happened here in Berlin, which is pretty cool: we have a really nice Ruby user community in Berlin that is very warm and hearty, and a lot of people know each other. Most of the engineers have worked for some of the companies, and others for other ones, so everybody has this insider information from different companies. And the best thing to do is to talk to each other and, you know, give each other a heads-up: if you apply for this job, you have to know that this is what they do; as an insider, I can already give you some information on that. It is not a union yet, but it's definitely a community that helps engineers support each other on certain decisions. It also makes you feel a little less like you're alone in this and your job is the only thing that matters; you're more in a conversation with a lot of different people and their thoughts. And that's always good; you also get some kind of consulting from more experienced people, and that helps a lot.

Erik Stockmeier:

I have a few thoughts on what you mentioned about using your privilege as an engineer. First off, within your company, the engineering and product teams often enjoy a special status, especially if the rest of the business is relying on their work to enable them. That's one case. There's also the question of who gets to make those business decisions at all, who has a voice in what the tech priorities are. And in many cases, the communities that are most affected by these things have little voice. I often wonder how folks who come from non-English-speaking worlds even learn to write code, since most programming languages are based on English keywords, at least. Although maybe that even speaks to having a certain access to education, to be able to start to learn that sort of thing in other parts of the world. But, you know, I'm picturing huge portions of the Americas, the Spanish-speaking world; I'm guessing it's just a much less common thing. And if there's not even access to learning engineering except among the elites in huge parts of the world, that's exclusionary. So it's always maybe a little bit ethically fraught to say we need to use our privilege. Yes, absolutely, we should, but our goal should also be a world where these privileges, these distinctions, are flattened and don't exist. So those were some thoughts I had. And then, on the Ruby community in Berlin you mentioned, and the whisper network of talking to each other about what the ethical trade-offs of working at different places would be: that's definitely a thing in other places too. And certainly, if you're in a room full of engineers who aren't looking over their shoulder, or thinking about what their employer might have to say about their tech opinions, you can get much more honest takes on the current issues of the day. I think those kinds of networks are absolutely necessary. And then, I think, a general understanding that diverse teams, while they're not going to be a solution to the problem... Having a team that is, frankly, all people who look like me, like white guys with beards in

Kaja:

Brooklyn. Wearing a Christmas hat!

Erik Stockmeier:

A Christmas hat every day? That's because I just walked into my parents' house. But thank you for noticing. I think diversity, broadly speaking, and I know that can be a clumsy term at times, is going to be a necessary piece of being able to have the kinds of discussions about ethics that may go beyond diversity, but that need a diverse set of perspectives.

Kaja:

Yeah, a good example of that is the Apple Health app and the lack of a menstruation tracking feature there. Which is such a huge thing, because almost half of the population of the world kind of needs to be aware of their cycle in some way, maybe not digitally, but in some way, and to track it in order to know about their own health. And it's so funny to forget this key feature for so many humans. And then, if you look at the team who developed the app, you see that there is a reason why they didn't think of it: they're this group of men who probably don't menstruate. It's really a shame. If you think about it that way, the question is also: is that app unethical in itself? No, it's not unethical; it's just that it discriminates, and so the user group suddenly becomes much smaller, which is also a shame for the devs.

Erik Stockmeier:

I feel like we could have a whole separate discussion just on Apple and privacy, because it's such a tangle. Of the big four, or however you want to put it, they're the only company that at least doesn't openly do a lot with data collection, and so I've always felt like, given my options, this is probably the safest place to put my data. But I didn't know, when you started on the menstruation tracking point, whether you were going to say it was good or bad. Because, I mean, your point about a product team that can recognize a need that would perhaps have been flying under the radar otherwise is a great point. I always wonder what different engineers think about Apple, because they so heavily market themselves as the privacy company among the big ones.

Kaja:

Yeah, that's funny. There are so many things. One thing I wanted to say, actually, on an earlier point, and then I think we need to close the conversation, or maybe just continue it in another podcast. One thing I wanted to add is that, indeed, as a non-native English speaker, I had trouble understanding some of the keywords in programming. For example, the word "map": I only knew it as a geographical map, and I was really wondering what that had to do with applying one function to every item in a collection. I really couldn't find the connection there and was so annoyed by that word, because it just made the whole method hard to understand. At some point I realized that "map" can also mean a lot of other things. I guess if I had been a native English speaker, there wouldn't have been this barrier of learning what "map" actually means. But to close our whole podcast, I would say that we cannot really answer any of these questions; we can just continue thinking about them and raising awareness. I think that's the most important thing.
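[Editor's note: a minimal Ruby sketch of the "map" idea Kaja describes here, applying one function to every item in a collection; the values and variable names are just for illustration.]

    # map applies the given block to each element of a collection
    # and returns a new array of the results.
    prices = [10, 20, 30]
    doubled = prices.map { |price| price * 2 }
    # => [20, 40, 60]

    # The name comes from mathematics, where a function "maps"
    # each input value to an output value, not from geographic maps.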

Erik Stockmeier:

Raising awareness is the most important thing. And it's also the one that still lacks a single approach; the implementation details need to be filled in, again and again, by every person or every subset of a team that is having these questions. So we're going to need templates and examples from the past, similar to some of those talks you mentioned in your introduction. But the key question for me today is still: how is the industry as a whole going to raise that awareness in a way that is constructive, that moves the conversation forward, and that actually changes the industry to start introducing these questions at a level where they can affect what we build? I also just wanted to mention a plug for another podcast. I could probably give a few, but the one I listen to the most that touches on this topic a lot is called Tech Won't Save Us; it's part of a Canadian podcasting network. Just a plug for that, for folks who want to do more thinking about this and, you know, maybe plug into a community of people talking about it.

Kaja:

Yeah, cool, check that out after this episode. And yeah, let's go into our holidays and enjoy life. Thank you so much for talking to me, Erik. This was such a great podcast. Thank you so much.

Erik Stockmeier:

Yeah, thank you guys. Really good to see you.

Kaja:

Okay, bye. Bye-bye!

Steve Hicks:

Thanks for listening. You can follow us on Twitter at ArtsyOpenSource and keep up with our blog at artsy.github.io. This episode was mixed and edited by Alex Higgins, and thank you to Eve Essex for the music; you can find her on all major streaming platforms. Until next time, this is Artsy Engineering Radio.