199 | Dealing with the Internet's Most Intractable Problems with Chris Wexler


Our guest on the pod this week is Chris Wexler. Chris is the CEO of Krunam, a technology for good company fighting child sexual abuse material (CSAM) and a number of the internet's other most difficult challenges.

Resources mentioned in this episode:

The Imperfect Show Notes

To help make this podcast more accessible to those who are hearing impaired or those who like to read rather than listen to podcasts, we’d love to offer polished show notes. However, Awarepreneurs is still a startup with limited resources. So we’re not there yet.

What we can offer now is these imperfect show notes via the Otter.ai service. The transcription is far from perfect. But hopefully it’s close enough - even with the errors - to give those who aren’t able or inclined to learn from audio interviews a way to participate.

Chris Wexler Awarepreneurs interview on Stopping Child Sexual Abuse Material

SPEAKERS

Paul Zelizer, Chris Wexler

 

Paul Zelizer  00:01

Hi, this is Paul Zelizer, and welcome to another episode of the Awarepreneurs podcast. This podcast is all about the intersection of three things: conscious business, social impact, and awareness practices. Each episode, I do a deep dive interview with a thought leader in this intersection, someone who has market-tested experience and is already transforming many lives. Before I introduce our guest and our topic today, I just have one request: if you could go over to Apple Podcasts or whatever app you're listening to the show on and do a rating and review, it helps tremendously. Thanks so much. Today, I'm really honored to introduce you to Chris Wexler, and our topic is dealing with the internet's most intractable problems. Chris is the CEO of Krunam, a technology for good company fighting child sexual abuse material and a number of the internet's other most difficult challenges. Chris, welcome to the show.

Chris Wexler

Thank you so much, it's really good to be on. Thanks for having me.

Paul Zelizer

You're doing some really important, really hard work, and I'm thrilled to have you here. It's an honor to get to hear about your work, and I can't wait to introduce it to our audience. But before we do that, Chris, our audience is no stranger to hard conversations and difficult challenges. One of the things that we think is really important, matter of fact, it's one of our core values, is to think well about caring for ourselves as we're out there trying to make the world a better place. So as a way to get to know somebody, we like to ask: what's one wellness or awareness practice that you personally use to help resource yourself for this really important, but not always easy, work?

 

Chris Wexler  01:50

Yeah, I think the most important thing for me is to really practice mindfulness, and really focus on today. Because when you're dealing with something as horrific as child sexual abuse material, you have horror behind you and horror ahead of you, and you can really only make a change today. I struggle the most when I'm not focused on what I can do to change today, but on what has happened or what will happen in the future. So really being in and of the moment is critical when I'm doing this kind of work.

Paul Zelizer

And we're going to talk about what child sexual abuse material is, how prevalent it is, and what you guys are doing to help make some really significant, needed change, as well as some other problems you're working on. But before we do that, just wind back a little bit, Chris. This isn't like everybody's idea of "what I want to be when I grow up." Give us a little background story of Chris's professional journey.

Chris Wexler

I have a long and winding journey to get to where I am. There's kind of a through line, which is trying to change minds and influence people. That's always been there. But, you know, it started, frankly, when I was in school on Capitol Hill in DC. I made the mistake only a 21-year-old can make, which is realizing that people in politics are political. I was like, what? I thought we were here to change the world. So what does a disillusioned young man do? He goes and works on Wall Street for five years. I like to say I had the aptitude but not the attitude; I enjoyed the intellectual work of understanding how markets were going to change and move, but I really didn't like the environment. After a little bit of a palate cleanser, producing theater for a couple of years, I got into marketing at the beginning of the digital revolution, you know, 2001 or 2002, and really started working with a lot of world-class brands, from General Mills to AAA to Microsoft. All of this led up to me going, boy, I'm not really making a change in the world like I wanted to when I was young. So I made a decision to take all the expertise I had, which is in the areas of big data, large internet companies, and just how people interact online, and find a way to put those skills toward actually making the world a better place, versus just, you know, selling another widget or another sports car to a 65-year-old millionaire. So when the opportunity came to work with the co-founders of Krunam to start this amazing venture, it was a no-brainer for me. It's an exciting synthesis of my skills and history and a real pertinent, vital need for people and society. Opportunities like that don't come along very often, so I had to take it.

 

Paul Zelizer  05:35

And the name Krunam has some significance. Tell us about where that name comes from?

Chris Wexler 05:48

Yeah, you know, naming a company is never easy, particularly since everything on the internet is taken. And so we went through a process of really trying to find something that was the core of who we were and we realized that

 

Chris Wexler  05:55

we kept coming back to an inspiration of ours, a woman in Thailand named Kru Nam. That's K-R-U-N-A-M, Krunam, all one word. We named the company in her honor, after asking her, of course. She was a street artist in Chiang Mai, and did a project 20 years ago with the kids on the street, going, hey, I want to show you how to paint, just paint about your life. And she was so shocked at what those kids painted. That's when she found out that a lot of the karaoke bars in Chiang Mai were fronts for child sexual trafficking. And unlike 99.99% of us, her decision was to drop everything and just walk into those karaoke bars and grab those kids and save them, without a plan. And when the traffickers said, if you do that again, we're going to kill you, she took the 20 kids she had saved and went north into the Golden Triangle, which is one of the most dangerous regions of the world, and tried to figure out how to protect them and raise them. Since then, over 20-plus years, she's saved thousands of kids out of dramatic situations. One of the first kids she saved just became the first stateless child to graduate from university in Thailand. She has fundamentally been compelled to make the world a better place, and changed her tactics so she could save more kids. We thought it was such an honor for us to honor what she did by taking what she's doing in real life, bringing it to the virtual world, the digital world, and trying to scale solutions to save more kids. It seemed like a perfect fit for us.

 

Paul Zelizer  07:50

Wow, what an inspiring story. And what a great example. We talk a lot on the show about baking your core values and your mission and your impact statement right into the company. You literally have it right in the name. What a gorgeous example of that. So if somebody isn't quite familiar with this terminology, child sexual abuse material, like, what is it? And when did people start to say, wow, we really have a problem with this on the internet?

 

Chris Wexler  08:24

Yeah, well, you know, most people are familiar with the parochial term of child pornography. We don't use that term, because pornography implies consent, and when there's a child, there's never consent. So the nonprofit and for-profit worlds are really focusing the language around child sexual abuse material, or CSAM for short, to really acknowledge what this is, which is documented abuse. This is something that has sadly been with us as long as there's been photography. But it had largely been stopped by the mid-1980s, because you had a single point of failure for distribution of these materials, which is the post office, and the post office has its own police force and did a really good job stopping these predators. But the internet came around ten years later, and it started growing and growing. Last year alone, there were nearly 70 million reports of CSAM being distributed around the world. Every one of those is a kid, and often children that are now adults who have been re-victimized for 20 or 30 years as these images continue to circulate. So this is a problem that not only documents a single point of abuse, but is re-victimizing these people every day. It's vital for us as a society to address it and clean it up. And, you know, luckily, very few people disagree with that. It's just not an easy thing to do.

 

Paul Zelizer  10:13

Very much an intractable problem. I think we have...

 

Chris Wexler  10:18

Exactly.

 

Paul Zelizer  10:19

70 million acts a year. Think about that before we move on, listeners: 70 million children being abused in a sexual way without consent, because they're kids, they can't consent. Usually it involves either threats of violence or just trying to survive. Just take a moment to breathe that in.

So you and your team, Chris, you say, okay, here's an intractable problem: child sexual abuse material. It's complicated. The post office, you know, made huge progress, and then the internet came in. You're like, well, we did it before the internet, we've got to be able to figure out a way to start to turn off this flow, this massive load due to the internet, and find a way to shut this down. You've developed this three-stage, I don't know if you call it a process or a protocol, I can't remember, but I remember the three stages. Tell us about how you work with this issue, using almost like design thinking: okay, if we do X, Y, and Z, we can scale making this problem start to go away.

 

Chris Wexler  11:33

Yeah, so I think the easiest way to talk about it is to talk about where the technology comes from. Any technology, in and of itself, is fairly abstract, but if you talk about the problem you're trying to solve with it, it becomes a little clearer. Back in 2015, one of our co-founders, Ben Gantz, was a child sexual abuse investigator in the UK, and he was spending 70 to 80% of his time going through confiscated materials to determine: is this illegal content? What kind of illegal content is it? And then maybe 20 to 30% of his time actually investigating and saving kids. He just thought that was terrible; it needed to get flipped. He ran into our other co-founder, Scott Page, at a conference. Scott was talking about the use of computer vision and AI and deep learning, and how it can really help sift through images and video, and Ben said, hey, is there something we can do? Luckily, UK law enforcement had started doing a really smart thing back in 2013: the UK Home Office had the foresight to start keeping a database of all this confiscated material. It's in a cage in their offices, and it's not connected to the internet. This is the worst of the worst; even we at Krunam aren't allowed to see it. So what Scott and Ben did is they said, hey, we'll do some work for free for you, to see if, with computer vision, deep learning, and AI, we can actually start using computers to help classify this. The first time they did it, it was good enough to be encouraging, so they went further. They worked with the Home Office for years refining it, and it finally got deployed in 2018.

So essentially, it's a piece of software that you point at a set of images and videos. It's like holding up a template, a very complex computer-based template: is this like other examples I've seen? On some level, it's like a virus scan. Virus scans look for patterns of code that are malicious; we're just looking for patterns in images and videos that are malicious, and trying to screen them out. Essentially, it goes: okay, I'm 95% positive this is CSAM, or I'm 100% positive this isn't. It gives a confidence score and puts the content into various classes, which then allows the user to make decisions, whether that's sending an image to the authorities for investigation immediately, or, if the system isn't sure, routing it to a human to verify what it is. It takes a lot of the pressure off of humans to do that sorting. And I think that's the dirty little secret of every one of the major internet platforms we all use every day, Google Search, Facebook, Twitter, Instagram, fill in the blank: when one of the users posts something that shouldn't be there, the AI algorithms flag it and go, hey, this shouldn't be here. And then a human has to go look at it. A human has to look at it and go, yep, that doesn't belong here, and they take it off. Well, by definition, that's two people too many that have seen that image. So we're trying to cut it off at its source. When somebody uploads something, you run it through the classifier and flag it before a human even gets exposed to it, either on the platform or otherwise. We really view it as digital protection. We're protecting the communities on these platforms, we're protecting the employees that are getting PTSD from looking at these kinds of images all day, and we're protecting the reputation of these companies. But the nice knock-on effect is that we're protecting kids. Their images aren't being spread, and the cycle of violence is getting broken. In the end, our goal is to break that cycle of violence.
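To make the triage flow Chris describes a little more concrete, here is a minimal sketch in Python of confidence-based routing. It is purely illustrative: the thresholds, class names, and function names are assumptions made for this example, not Krunam's actual product or API.

```python
# Hypothetical sketch of upload-time triage by classifier confidence.
# Thresholds, categories, and names are illustrative assumptions only.

from dataclasses import dataclass
from enum import Enum

class Route(Enum):
    REPORT_TO_AUTHORITIES = "report"   # high confidence the content is illegal
    HUMAN_REVIEW = "review"            # uncertain; a trained reviewer verifies
    ALLOW = "allow"                    # high confidence the content is benign

@dataclass
class ClassifierResult:
    confidence: float  # model's 0.0-1.0 score that the content is illegal
    category: str      # e.g. a severity class assigned by the model

def triage(result: ClassifierResult,
           report_threshold: float = 0.95,
           allow_threshold: float = 0.05) -> Route:
    """Route an image or video based on the classifier's confidence score,
    so a human only ever sees content the model cannot decide on."""
    if result.confidence >= report_threshold:
        return Route.REPORT_TO_AUTHORITIES
    if result.confidence <= allow_threshold:
        return Route.ALLOW
    return Route.HUMAN_REVIEW

# Example: an upload the model is 97% sure about is routed automatically,
# with no human ever exposed to it.
print(triage(ClassifierResult(confidence=0.97, category="class-A")))
```

The design point is the one Chris makes: the automated path handles the confident cases at both ends, so human reviewers only see the narrow band of content the model cannot decide on.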

 

Paul Zelizer  16:29

Boy, there's so much I could say, Chris. Two quick thoughts. One is, listeners, we've had quite a bit of talk about algorithms, and oftentimes some of the negative impacts that algorithms have. For instance, we did a fabulous episode with Lisa and Renee Hall on some of the problems of racist algorithms, and some of the other consequences of algorithms being used, sometimes unconsciously and sometimes a little more consciously, in ways that wouldn't align for most listeners of this podcast. I'll put a link in the show notes so you can get a sense. So it's fascinating, Chris, that you and your team have flipped that. Yes, algorithms can cause problems, but they can also bring tremendous good. The second point I was thinking about as I was listening: before we hit the record button, Chris, you made a statement that stopped me in my tracks. You talked about the toxic waste of our connected life. Usually, people talk about environmental toxic waste: the minerals and how they're mined for the batteries in our phones, and what happens when our phones are obsolete and get thrown in landfills, et cetera. We think about that kind of toxic waste, and those are big problems, and we need to work on them. But you were talking about the people who are doing the actual screening: is this CSAM, is this a violent image, is this promoting hate crimes or recruiting for a white supremacy group? Usually those are humans. Tell us about those humans. What's their life typically like? These jobs of screening flagged content, or whatever they call it at Facebook, tend to get outsourced to certain types of people in certain types of communities. Is that fair to say?

 

Chris Wexler  18:37

Yeah, I think it's in a process of evolution right now. Up until four or five years ago, this was largely done in the US and other developed countries where these companies were; they just hired people to do it. And then, one by one, they got hit by huge workers' comp settlements, in the tens of millions of dollars.

 

Paul Zelizer  19:01

What's the average lifespan of somebody in this job? Nine months. Nine months, and then they're just totally...

 

Chris Wexler  19:08

Burned out and gone? Yeah. The people that can really hang in there and build an expertise and be really good at it are, you know, kind of digital first responders. The people that are really good at this, we need to honor, because they're making all of our lives significantly better. But it's a brutal, brutal job. So when these companies got hit by these huge lawsuits, they didn't go, oh, what's a better way to do this? They went, all right, we'll move this to Indonesia, or the Philippines, or somewhere else in the developing world. They exported the risk and they exported the misery to poor countries, which is unfortunately how large corporations often do this. So here we are, a country that is benefiting monetarily from these giant platforms, and when the bad stuff happens, what do we do? We send it to those that are most desperate for work. It's not a long-term solution. And now we're seeing lawsuits hitting these companies even in places like Indonesia, where workers' rights are pretty weak; thankfully, there are people there defending those workers as well. Eventually, we want to get this technology good enough that humans don't see this at all. I'll be honest, none of the technology is there yet. I think we can limit it dramatically right now, and a lot of people, not just Krunam, a lot of different companies and people, are working hard at this problem. I think it's a vital human rights issue that we do this well. That's a really critical thing.

I think it's also interesting that you brought up algorithms doing poorly, encoding racism into our everyday lives, while in our case we're using them for good. I think it's important for all of us to realize what machine learning and AI are: a reflection of humans, and we're all flawed. The reason racism gets encoded into machine learning and algorithms is that there's racism in our society that's reflected in the data. There's racism and classism and fill-in-the-blank-ism encoded into our hiring practices and our education system, which puts certain people into the positions where they're doing this work, so they just don't understand properly how to mitigate those problems in the data. But the technology itself is morally neutral. A hammer isn't good or bad; what's good or bad about a hammer is how you use it. It's the same with algorithms and machine learning. I think too often, early adopters of any technology are closer to an engineer's mindset of, hey, can we do that? I think we're just getting to the age of technology where we're asking the right questions, which are, should we do that? And how should we do that? We can connect all the world, like Facebook has, and make it really easy to share. Now we need to ask how we do that and have healthy relationships in it. And that's a much harder problem, particularly if you have years of encoded technology driving those interactions and promoting various things. Unfortunately, the way human minds and our attention are built, we can't help it: we love things that are exceptional. And when I say exceptional, I don't mean good, but an exception to the rule.

Our minds scan patterns: okay, I see that pattern, I see that pattern, ooh, that's different, I'm going to look at that. The problem is, what that means is that typically what I'll look at is something horrifying. It's kind of like when you drive by an accident on the freeway: you can't help but look. That's just ingrained in human nature. The problem is that algorithms typically amplify that, because they only care about what you look at. The paradox is that we're drawn to things we don't necessarily like, but the algorithm doesn't know whether we're looking because we're horrified or interested. All it knows is that we're looking. As a result, I think it's critical for the people who are running those communities to be thinking about the intention of that content. In the case of what we're fighting at Krunam, it's child sexual abuse material, but it could be incitement to violence or actual threats. It's critical for these communities to build the policing of that into their infrastructure, because it's a critical responsibility of theirs. I like to think about it in terms of the First Amendment here in the United States. We talk about the First Amendment a lot in terms of free speech, and free speech is vital, it's absolutely important, but it's not absolute. If you look at the years of jurisprudence, the government has shown that there are classes of harmful speech it has the right to police, because they are harmful to people: threats, incitement to violence, blackmail, obscenity. All of those things are harmful to the people that speech is directed at. We've focused so much on enabling speech; now we're at the point with these online communities where we have to start building in the guardrails to protect people from harmful speech. And we're just in the nascent days of that, the beginning days. Do I wish we had been at the beginning days five or six years ago, when it became pretty obvious this was going to be a problem? Yes. But there's no profit motivation, by and large, for a lot of these organizations to do that. What we're seeing now is that they're responding for the first time in many moons, largely because they see the downside of users turning on them for not acting on this. There's now a downside because of reactions from you and me: I'm not going to go on Facebook anymore, because it's just a toxic waste pit. That is going to impact them. They always respond to the consumer. We, as consumers, need to put more pressure on each of these platforms to build in these guardrails to protect people as well as enable speech. It's two sides of the same coin.

 

Paul Zelizer  26:42

How long, Chris, has Krunam been around? And at this point, what would you say you're doing well, in terms of the technology and the company's values and your outcome goals, your impact goals? And what are some things you're working on these days that you want to see get better?

 

Chris Wexler  27:01

You know, the technology has been in use by certain arms of law enforcement going back to 2018. Just recently, we started pushing into the more commercial, for-profit space, meaning talking to the big technology platforms. We've been really encouraged; there are a lot of really hard-working people at each of these companies, often in what's called trust and safety, the departments where this is what they try to fix every day. When they hear about our technology, they're really open to it, and we're getting far along in talks with a lot of different companies whose names you would know. But, you know, there's always that balance. So to answer your question: we have good traction with law enforcement. We're in early days with corporate America, but it's actually going well, and you really see that these companies are taking it seriously, which is encouraging.

 

Paul Zelizer  28:14

So let's do this. I'm going to take a break and hear a word from our sponsor. When we come back, Chris, we want to put on our social entrepreneur glasses and understand a little more how this works as an enterprise. Our listeners will be like, yes, this needs to get done, and we're so glad you're doing it. But how do you run a company like this? How do you fund it? What kinds of tech companies are saying yes, and what do they pay you for? All that stuff. But first, a word from our sponsor.

Do you have a business that's about making the world a better place, and you want to increase both your impact and your income? Not to buy a private jet plane, but to live a good life while you're making the world a better place? If so, I'd like to talk to you about podcasting for a moment. One of the things I love about podcasting is the listeners themselves. When my mentor started helping me learn who listens to podcasts, it turned out to be a pretty exceptional group. In particular, three things really stood out. One is that podcast listeners are early adopters: we like new ideas, and we tend to incorporate them into our lives and encourage other people to do the same. So if you've got a new idea, this is a great audience and a great way to get in front of them. Number two, podcast listeners are natural leaders: people in our work lives, our community groups, our families turn to us and say, you've always got creative ideas, how would you handle fill-in-the-blank? We listen to podcasts to stay up to date on those new ideas. And the last thing is that podcast listeners tend to make more money, not just a little bit more, but quite a bit more. When you put those three together, that's a pretty interesting confluence of things if you're a social entrepreneur. So if you have an idea and a business that's about making the world a better place, and you want to learn how to start your own podcast, or be the host of a podcast like I get the honor of doing today, or be a great guest and really help move the needle, like Chris, what a fabulous guest Chris is today, Awarepreneurs has a Podcast Success Team. We'll walk you through every step of the way: what kind of technology you need, how you get it out there, and how to apply to be a guest in such a way that hosts of popular podcasts will actually say yes. It's called our Podcast Success Team, and you can find out more at awarepreneurs.com forward slash podcast dash success. And thank you to everybody in the Podcast Success Team who sponsors this podcast.

So Chris, as I was saying, in the second part of the show we like to unpack things a little, get a little more granular. Give us a sense: you're a global company, or at least you're not just US-centric. You're talking about people in the UK and this woman who inspired you, who lives in Thailand. How many people work for the company right now? And are you everywhere, or do you tend to be in certain markets more than others?

 

Chris Wexler  31:23

We're not a huge company right now; we're about 10 people. We have a technology team in London, and we have a spread-out sales and operational team, frankly, from London to San Francisco, and depending on the day, it could be anywhere; it's really a global, traveling team. So our team is fairly broad now. I expect that group to grow pretty dramatically in the coming months; we're hiring right now. But I'm trying to remember the rest of your question, I'm sorry, Paul.

 

Paul Zelizer  31:59

Yeah, so, where you're located, the size of your team.

 

Chris Wexler  32:03

So yeah, we have a team in London and in San Francisco, and we have a group in Minneapolis as well, but also, you know, kind of all around the world. Our head of operations is going to be relocating to Sweden to help out another company as well as us. So we're truly a global company. The internet's global; we can operate the same way.

 

Paul Zelizer  32:27

And when you look at it from a revenue perspective, what do people pay Krunam for? And what kinds of organizations tend to hire you? You mentioned law enforcement, do they pay you? You mentioned tech companies. Who's hiring you, and who's allowing this company to grow and sustain itself as you do this really important work?

 

Chris Wexler  32:47

Yeah, I mean, I think the best analogy for the business model is virus scan protection. We all pay, you know, McAfee or fill in the blank, $100 a year to make sure our computers don't have viruses on them. We operate on a very similar model, but at a much larger scale. So instead of the 20 emails you get a day and the 10 files you might download, we're talking about the millions or billions of images and videos that companies are processing every day. It's a software-as-a-service process: based on the volume of images and videos scanned, we take a small fee. And the nice thing is, because we're driving efficiency in these organizations, outside of the social impact, they need fewer humans moderating, and they're having a better experience for their community. It's typically a net savings for them when they deploy us. Or in the case of law enforcement, when they deploy us, it frees up resources to actually go and protect more kids. So we probably fit in as kind of an efficiency tool.

 

Paul Zelizer  34:14

So you've got this positive impact part. And we're going to talk in a second about what you do in addition to CSAM, because there are many, many intractable problems you want to make a dent in and really help with. But you've come out of the gate saying, okay, this is a horrible problem, child sexual abuse material, and we're going to start right here, and we're going to show the world, if I can paraphrase a little bit, you'll probably say it more eloquently than I, Chris, that it's possible to do something about this. And once people understand that you can use technology for good in this way, we're going to take on some other intractable problems. Is that fair to say?

 

Chris Wexler  34:58

Yeah, absolutely. We really view this as a jumping-off point. CSAM is one of the hardest, because there's a data problem: it's illegal for companies to hold this data. Some people start at the easiest problem; we started at the hardest. Having a great engineer as our CTO is, I suppose, one of the great things: you start at the hardest and go to the quote-unquote easy. We started here because if Facebook finds a bunch of these images, they're legally not allowed to keep them to train their own algorithms; they have to delete them. Because we're in partnership with the government in the UK, we're able to access that information in a privacy-safe way and then train our algorithms. So we've solved a problem the marketplace can't solve for itself in dealing with illegal content. That's why we started there.

When you look at the next ring out from this, one piece is just applying it more broadly. Live streaming is a critical area for the abuse of children, unfortunately, particularly in the age of COVID. What we're seeing is that because people are stuck in their homes during a pandemic, they've lost access to children, or they're not traveling globally to abuse children, which, I mean, those are good things, right? But they've adapted their techniques to start using live streaming, where they pay for private shows of abuse. The problem with live streaming is that there's no documentation of it: it's a crime that happens, the child is abused, and it disappears. There's no evidence, and there's no way to stop it. So we're building a tool to be able to scan for that, again like a virus scan, so that we're not looking and listening in on people's live streams. What we would be able to do is essentially ask, is this CSAM? Nope, okay, move on. So that's kind of an inner-ring thing we can add on. But we're also going to be looking at violence detection, like blood detection, as we grow and build this, and at ways to deal with political extremism on the left and on the right, anything that leads to terror and violence. I think that's critical; we've seen the negative effects of that extremism be a major problem in the world for the last 20 years. As we look at all those kinds of harmful speech, there are ways to combat them with different technologies. Our goal is to eventually stand up teams that battle each one of those, and provide tools for companies and governments and fill-in-the-blank to really help protect their users and employees.
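As a rough illustration of the live-stream idea, the sketch below samples frames from a video stream and hands each sampled frame to a classifier, so no person ever watches the stream itself. This is a hypothetical sketch, not Krunam's tool: it uses OpenCV only for frame capture, `classify_frame` is a stub standing in for a trained model, and the sampling rate and threshold are made-up parameters.

```python
# Hypothetical sketch: machine-only scanning of a live stream by sampling
# frames, in the spirit of the "virus scan" analogy. classify_frame() is a
# stub for a real model; nothing here reflects Krunam's implementation.

import cv2  # OpenCV: pip install opencv-python

def classify_frame(frame) -> float:
    """Placeholder for a trained classifier; returns a 0-1 risk score."""
    return 0.0  # stub: treat every frame as benign

def scan_stream(url: str, sample_every_n: int = 30,
                threshold: float = 0.95) -> bool:
    """Sample one frame out of every N and flag the stream if any
    sampled frame scores above the threshold."""
    capture = cv2.VideoCapture(url)
    frame_index = 0
    try:
        while True:
            ok, frame = capture.read()
            if not ok:
                break  # stream ended or dropped
            if frame_index % sample_every_n == 0:
                if classify_frame(frame) >= threshold:
                    return True  # flag for interruption / escalation
            frame_index += 1
    finally:
        capture.release()
    return False
```

The point of the design is the one Chris makes: the classifier, not a human, inspects the stream, and only a high-confidence hit triggers any intervention.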

 

Paul Zelizer  38:10

It's important work you're doing. So widen back a little bit, Chris. A lot of our listeners are deeply values-oriented social entrepreneurs, and they're in that process of figuring out both how to optimize their impact and how to really make it work as a business, to come up with a business model. And there's something really interesting here that you've found. Listener, if you're somebody who's not outraged by child sexual abuse material, go find another podcast, we don't want you, right? Our listeners would be like, this is horrible, and you guys are making a positive impact that way. But you've also found a way to sync it up with what companies want: their profitability, and how their customers and the public at large view them, because this is a major problem for them. So winding back, not from this specific problem, but from what you've learned attending to CSAM: for the social entrepreneurs, and we have a lot of them, who are trying to sync up their values with what the marketplace also values, any words of advice, or anything you've learned as a team about how to do that well, that you can pass on to a listener who's still trying to get a little more effective at answering that question?

 

Chris Wexler  39:33

Yeah, I'm going to use a word that you probably don't expect right here. The most powerful tool in your emotional toolbox is empathy. And I'm not talking about just empathy for the people you're trying to help, but for the businesses you're approaching and working with. That's the critical thing, because there are things that we at Krunam care about that our customers don't. Or let me take that back: that is not their remit. Their remit is to make sure they have a safe community. Our remit is that we want to protect more kids, but we can do that by making their community better. So even though our motivation and our customers' motivation are two different things, and that's something we have to remind ourselves of every day, we have to walk a mile in our customers' shoes. I learned this back in my marketing days. I remember sitting in meetings with Harley-Davidson; I was lucky enough to work on Harley-Davidson for years. They were so excited because they had, I don't even remember how many layers, quadruple-coat paint that was super strong and durable, and they wanted to talk about how they did it, and this and that. And we marketers just looked at each other and went, I don't think anyone's going to care. I just don't think anyone riding bikes is going to care that much about how we paint the bikes. But when we talk about it as, if you dump your bike, your paint won't scratch, then they go, yeah. Because that's the benefit to them. So for us, and for any social entrepreneur, it's critical to be selling the benefit of your product to whoever you're talking to, not your intention. Your intention, as good and as valid as it is, is not necessarily the reason someone's going to do business with you; they're going to do it for a benefit to them. And the benefit could be that they'll feel better, or that they'll have a better brand story to tell, but it's much harder to sell that than if you can find a spot where a social good aligns with a business need. That's one of the exciting things about Krunam: we found a social good that aligns with a business need. That's the critical element when you're selling, to a consumer or otherwise. In the end, you know, I've done lots of research on the motivations of consumers, and brand really works, and this really works, but in the end, price works the strongest, no matter what. I might be willing to pay a couple of pennies more if it's humanely raised, or if it's fill in the blank, better this or that. That might matter a lot to you, but it doesn't matter as much to the consumer. It's a lesson I learned working with Chipotle; I was on the marketing team there back in 2009, when we decided to externalize a lot of the amazing decisions they were making about revolutionizing the industrial food complex. They were really at the forefront of scaling organic, local, non-GMO foods to a scale where they could feed 1,000 to 2,000 restaurants. We didn't do that by talking about organic and non-GMO; we talked about it in the context of taste, that this is going to taste better. So enough consumers and customers came in and went, yeah, this is a really good burrito. Then we told them the story of why it tasted better, and then they got emotionally attached to why they liked the Chipotle burrito better than a burger from McDonald's.

We led with the benefit to them, which was taste; it was always about how great it tastes and how it's personalized for you. That's the critical element. Then we followed up with the story of why, and then you had them for life. So I think it's really critical that social entrepreneurs separate their intention from their benefit, and understand when to message each of those things.

 

Paul Zelizer  44:09

You're talking my language. Listeners, you know about my spiritual highlighter; I like to circle things, you know, imagine if we were whiteboarding what Chris just shared in some big bright color. I like to call it empathy-based marketing. I literally just had a back-and-forth with one of my private clients about it. She has this fabulous service for a very particular type of client that we're both very excited about, and when I was reading the page describing the service, it was all bullet points. And I was like, we need a paragraph or two at the front that is just about who they are and what it is that they're struggling with. Then you can tell them all these things, but right now it just looks like a list of features, and all the features are awesome, but you're not going to make many sales. Right? And yeah, I can't agree more with what Chris just said, listeners; I'm just circling it in some bright fluorescent orange, maybe that's today's highlighter color. So anyway, thanks for that.

 

Chris Wexler  45:13

Orange is my favorite color. How did you know, Paul? That's amazing.

 

Paul Zelizer  45:16

Is that really true?

 

Chris Wexler  45:17

That's truly empathetic.

 

Paul Zelizer  45:19

You seem like an orange guy. So when we were getting ready for this podcast, I asked you, well, what do you want our listeners to do, what's the call to action, if you want to use marketer language? Like, how do we help you? You've been so generous, and you're like, just tell people to learn about Krunam. But in particular, there was an audience you wanted our listeners to nudge. And listeners, I just want to encourage you, this is such important work that Chris and his team are doing. If you've got connections at Facebook, go tell them to fast-track hiring Chris and his team, because they really need it, and so do other companies, and all that sounds like it's in the works and going well. But you talked about politicians. What do you want our listeners to know about CSAM, what you're up to, and politicians?

 

Chris Wexler  46:18

we're at an inflection point in the regulation of online communities. Up until literally right now, particularly here in the United States, we have given carte blanche freedom to encourage growth in this space, and frankly, lots of benefit has come from that. Some people may have heard Section 230 floating around in the political ether, spoken about completely illogically. It essentially absolves all of the platforms of responsibility for the content on their platforms. That will be reformed eventually, and we expect it to be reformed relatively soon; there's a kind of bipartisan understanding that freedom from responsibility for what's on your platform is just not going to be a long-term solution. There's also a lot of talk about online privacy, and those are two separate issues. So I would say this to people: if you truly care about protecting kids and helping fight illegal content, talk to your politicians, both local and national, and say, I really care that companies are held responsible to police their communities, particularly in the area of CSAM, but also in the areas of other harmful speech. Much like a factory has to put scrubbers on a smokestack, they need to address this and have that capability in place, and that should be their legal responsibility, not just their moral and brand responsibility. And when we're talking about online privacy, particularly end-to-end encryption, I think we all initially cringe when you go, what do you mean you want to look at my private messages, my private this or that? But if we push hard toward complete privacy on the internet, you might have your communication with your sister-in-law and your mother and your best friend protected from hackers, but you're also giving safe harbor to the worst of humanity. We need to find a balance of online privacy and detection of illegal activities. So I think it's critical that when we think about online privacy, we think about it in a nuanced manner, that there are ways for governments and these companies to really police this communication. Otherwise, we're just giving a superhighway to traffickers and evildoers all around the world. By no means do I think we should open everything up and have the government look at it all, but there needs to be a nuanced approach to online privacy. And I know how well nuance sells in politics today, but I think more and more people need to be thinking that way, because otherwise we're going to be allowing the powerful and the evil to run roughshod over society.

 

Paul Zelizer  49:45

Chris, this is such an important topic, and I hope we've hit some of the notes you were really wanting to hit. If there's something we didn't yet get to that you'd like to bring up as we're starting to wind down, or something you want to leave our listeners with as we start to say goodbye, what would that be?

 

Chris Wexler  50:06

I think, and this is a little off of some of the topics I've covered, but one of the reasons I left corporate America to do this was that I wanted to be an example of a successful, socially driven for-profit organization. Too often, for some reason, since the 1970s or so, we have falsely separated helping the world from business. One of the great things about how business works is that it scales solutions; it can handle big problems. And as a result, people who want to do big things often end up on the nonprofit side, where there just aren't enough tools in the tool chest. So I want to encourage people, when you think about solving problems, don't get caught in a for-profit versus nonprofit mindset; we need to use every means necessary to make the world a better place. And I don't think there's any shame in asking businesses to pay for what they need to do to make the world a better place. We can attract more talented people and sustain better lifestyles for people that care about, you know, the quote-unquote right things. We need to find that third way that brings both together. So I hope I can inspire people to do that. I probably haven't done enough yet to inspire anybody, but at the very least I can give words of encouragement and go, let's find ways to not worry about labels and really try to make the world a better place.

 

Paul Zelizer  51:59

Preach it, Chris. I ran a nonprofit for seven and a half years, and it was great, and I was very grateful for that work. But the language I used felt disconnected from the conversations, the tables, and the engine that drives so much of what happens on planet Earth, the tables where the real decisions were getting made. And then people trying to do good work in the community would fight for the scraps. I got exhausted and burnt out, and I said, I'm going where the engine is.

 

Chris Wexler  52:31

Yeah, no, I think that's exactly right. I mean, on some level, good was put into a ghetto, and as a result, business was allowed to do whatever it wanted. We need to break down those walls, because there are a lot of good people in corporate America. We just need to change the way we think about it, so we're not just focused on profits but on multiple stakeholders. I think we're on the way, the tide is turning, but it's early days.

 

Paul Zelizer  53:01

It's been fabulous to have you here, Chris, thank you so much for being on the show today.

 

Chris Wexler  53:06

It was a great conversation. Thank you, Paul, I really appreciate you having me. Thank you.

 

Paul Zelizer  53:10

So there'll be links in the show notes; go check out Krunam and everything they're doing. And another big, bright orange highlight all around what Chris shared. I started this podcast because not only am I grateful to be in these conversations and work with clients, but I wanted to scale who our listeners could have access to, people doing great things. So go check out what they're doing at Krunam. It's a fabulous social enterprise doing things at scale on a really intractable issue. Before we go, I just want to say thank you so much for listening. And a reminder: we love listener-suggested topics and guests. If you have an idea for a show, go take a look at the Awarepreneurs website and go to our contact page. You'll see three simple criteria; they're kind of the lenses we use to see who's a fit and who we invite on from our listeners and Awarepreneurs community members. We try to make it really simple, and we try to be really transparent; it's right there, so go check it out. If you have an idea that fits, please send an email through the site. For now, I just want to say thank you so much for listening. Please take really good care as you go about the important work you do, and thank you for all the positive impact that you're having.

Paul Zelizer