John Buchanan:
Hello, everyone, and welcome to our first Speaker Series of 2021. I hope everyone had a great holiday season and got a chance to rest and recharge coming into the New Year. This will be the first of our thought-provoking series discussions with different folks. Tonight, we're joined by Jeff Orlowski, a two-time Emmy Award-winning filmmaker and founder of the production company Exposure Labs. Jeff directed and produced the Sundance Film Festival award-winning documentary films Chasing Ice and Chasing Coral, among others.
His latest film, The Social Dilemma, has created an awful lot of buzz. It is streaming on Netflix, where in September it became the first documentary ever to be the platform's most popular movie in a given month. I hope you've seen it; if you haven't, you should. But regardless, we are going to talk about it today. It takes an important and somewhat dark look at social media platforms and the technology behind them. Jeff founded Exposure Labs to maximize the impact of film, creating a company dedicated to both quality storytelling and powerful campaigns. Chasing Ice won the 2016 Doc Impact Award, recognizing documentaries that have made the biggest impact on society. And Chasing Coral won the 2018 Fast Company World Changing Ideas Award.
In 2016, Jeff was the first recipient of the Sundance Institute Discovery Impact Fellowship for his focus on protecting the environment. And in 2017, he received the Champions of the Earth Award, which is the United Nations' top environmental advocacy honor. And finally, he also traveled on tour representing the Sundance Institute, President Obama's Committee on the Arts and the Humanities, and the National Endowment for the Arts. So please join me in welcoming Jeff Orlowski. Hi, Jeff, and thanks for joining us tonight.
Jeff Orlowski:
Thanks, John. That's an embarrassing intro but it was great to hear all those flashbacks too.
John Buchanan:
I know. I was reading it because it is so long and there's so many honors in there, but we're really pleased to have you tonight.
Jeff Orlowski:
Thank you.
John Buchanan:
I'm just going to jump right in here. Your first two documentaries were about climate change. How did you get the idea to create The Social Dilemma?
Jeff Orlowski:
I think I've always just been interested in the big issues affecting humanity. I spent over a decade working on climate change, and a lot of time thinking about systems-level structures, problems, and challenges. Climate is a systems-level problem, and it's complex. And then, in 2017, I was introduced, through some friends from college, to what was going on in tech. And I had so many friends… I went to Stanford, and my friends went and worked at Google and Facebook and Twitter. So many friends ended up working in Silicon Valley at different companies. I almost went down that path. I was really passionate about tech. I used to work at Apple, actually, as a campus rep. I just loved the computers, loved the technology.
So when a friend of mine from school started talking about a problem with the way the technology is designed, that there's actually a fundamental systems-level challenge with how we've designed this technology and the impact it's having on society, that was completely new and foreign to me. It made no sense. I couldn't wrap my head around it: "What are you talking about? All of our friends work there. These are awesome companies." And so, coming from that perspective, in 2017 we started talking to more and more employees and insiders at companies to learn what exactly is going on. What are the challenges with the business model?
It became clear. One of our subjects used a phrase: they've created a climate change of culture. The technology that a handful of designers in Silicon Valley make is being exported out to billions of people around the planet. And those handful of designers can change the code that then affects and puppeteers billions of people. Think about that asymmetric power. If you design something just slightly off, think of the exponential impact it will have downstream, and the domino effects. What really set us down this path was wanting to understand more about what technology is doing to society.
John Buchanan:
I did some reading last night, and obviously everybody knows Facebook is everywhere, and Google. But the numbers truly are astounding: Facebook has a billion seven users on a regular basis, and probably even more for Google. It's easy to get your hands around why climate change poses such a threat, but your film presents the technology that's driving social media as almost equally an existential threat, at one point calling it "checkmate on humanity." Talk a little bit about that and why it is such a threat.
Jeff Orlowski:
No, it's a great question. And this is, I think, one of the challenges from when I first started making the film. It's like, "Wait, you're talking about this app on my phone destroying civilization?" It seems like a huge leap. I mean, climate change seems like a huge leap in many ways too. Why is it that we're seeing so much political polarization? Why is it that we're seeing so much misinformation going viral? Why are rates of teen suicide and self-harm going up?
For a long time, I think, everybody looked at these as separate, isolated problems or it was unclear as to what the source was of these tensions. But from the subjects in our film, they were drawing a thesis around, "These are all coming from the same source." And it takes a couple of steps to kind of explain everything and piece all the things together. But at its simplest form, I look at this as a breakdown of our information ecosystem, just like climate change is a breakdown of our natural ecosystem. These platforms are now breaking down our information ecosystem.
So just in a couple steps here. The business model is advertising, micro-targeted advertising. They collect as much information about you as they possibly can so that they can target you, John, specifically. I need to know everything I can about you so that I can show you the right pair of glasses that you're looking for. Because I know you like this particular style of glasses, I can advertise to you more accurately that way. We as passive consumers think, "Oh, that's a great experience. I'm getting the thing that I like." But the problem is how the algorithms work to reinforce those systems.
So if I show you all the information available in the world and I find that you like these things, now I'm going to show you those things. And of those things, you now like these other things, so I show you a whole bunch of things like that. I can segment it: of all the information available in the world, these are the things that John resonates with, and these are what bring him back to the platform. The more I get you to come back, the more you see, the more you engage, the more relevant the content is.
So now, I'm giving you this very small slice of the big picture, over however many years you've been on Facebook, however many years you've been using Google. They've created a model. They actually have hundreds of thousands of data points about us. And there is a model that exists for each and every one of us on each and every one of these platforms. And that model is constantly being tested on: what's going to work on John today? How can I get John to spend a couple more minutes today? Are you going to spend 12 minutes in the middle of the day, or can I get you to spend 14? Can I show you more advertising? What kind of advertising? How relevant is the advertising?
So all of these things are being done, not by humans, right? It's not an engineer sitting there trying to figure it out. These are automated algorithms that are reverse engineering each and every one of us to figure out what makes us tick and how these platforms can monetize us. Now, you have fallen into this particular bubble on the spectrum, but somebody else has fallen into this other bubble on the spectrum. This is a phrase that Eli Pariser coined about a decade ago: filter bubbles. We've each been living in our own filter bubble. So if I have a set of information that I really think is true, and you have a different set of information that you really think is true, and we've been living in these silos for a decade, just recognize how hard it is for us to come together and agree on something that we might disagree about.
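To make that feedback loop concrete, here is a minimal, purely illustrative sketch in Python. It is not any platform's actual code: the topic names, the 60% engagement chance, and the 1.2 reinforcement factor are all invented for the demo. The point is only that once engagement feeds back into what gets shown, the mix of content a user sees narrows on its own.

```python
import random

# Purely illustrative toy recommender; topics, the 60% engagement
# chance, and the 1.2 reinforcement factor are invented for the demo.
TOPICS = ["politics", "sports", "cooking", "travel", "tech"]

def recommend(weights):
    """Sample a topic in proportion to its learned weight."""
    return random.choices(TOPICS, weights=[weights[t] for t in TOPICS])[0]

def simulate(days=1000, seed=42):
    random.seed(seed)
    weights = {t: 1.0 for t in TOPICS}   # no preference at sign-up
    for _ in range(days):
        shown = recommend(weights)
        if random.random() < 0.6:        # user engages with what's shown...
            weights[shown] *= 1.2        # ...and engagement feeds the weights
    total = sum(weights.values())
    return {t: round(w / total, 3) for t, w in weights.items()}

print(simulate())
# Typically one topic ends up dominating the mix: the filter bubble
# emerges from the loop itself, not from any editorial decision.
```

Run it with different seeds and a different topic wins each time, but some topic almost always wins; the narrowing is a property of the loop, not of the content.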
I'm just referencing one of the slices of a myriad of examples here. But for me, I think the political polarization, the breakdown of truth, the increase in viral misinformation, all of these things are not separate symptoms. These are not separate problems. These are all coming from the same source. This is the same business model and structure that is causing these problems that we're seeing.
John Buchanan:
And I think, as we were talking about earlier, one of the things that you state, and we kind of know, is "if you're not paying for it, you are the product." Can you talk about that?
Jeff Orlowski:
It's considered almost trite in the tech industry now, because the tech industry has known it for a long time. We almost didn't put it in the film. I thought it might have been overused. If you're not paying for the product, you are the product. I find that it's a really, really great and simple way to get a sense of, "Am I the customer?" We paid for a Zoom account, somebody paid for this account, so that we can talk for an extended period of time. You pay for the phone that allows you to send messages or FaceTime.
There are other services that we don't pay for. We don't pay for Facebook or Twitter, Instagram, Snapchat, Google. And these are the services where we're finding so many of these big challenges, because their business model is our attention. Their business model is our eyeballs. They have designed the techniques to figure out how to get us to spend as much time on the platforms as possible. They figured out the techniques to grow the network as fast as possible. They're not designed around what genuine, authentic human connection means for you. How can you deepen the relationship with your friends and family? How are you going to find satisfaction and contentment in your life? None of these platforms are designed for that. They're designed to figure out what you might most likely be interested in that's going to get you to spend more time, so that they can make more money passively as you're scrolling through.
John Buchanan:
And it is interesting because, obviously, conceptually, we've been advertised to for 150 years, in newspapers, on the sides of buses, on TV and so forth. Facebook obviously came out with a statement about The Social Dilemma, which probably was a good thing for the film, and their point was exactly that: we're just advertising. But it's switched now to social and political and other causes, and that's not happening on television.
Jeff Orlowski:
It's like we went from a BB gun to a nuclear weapon. These are completely different levels of manipulation. If you look at a billboard driving down the highway, there's a static image there and everybody's seeing it. If there's something inappropriate, it gets called out, and we all see the same thing. There's something really meaningful about everybody seeing the same thing. That's not how these platforms are designed. Intentionally, inherently, their entire existence is built around everybody seeing different things.
We've moved from one shared culture, where we all can have the water cooler conversation, to infinite subcultures, where everybody gets 10 different things based on their tastes. And it's breaking down the ability to have a shared conversation. And if I get advertised something that's completely different from you, it goes so far beyond that, because an advertiser can now test thousands of different versions of a particular message on lots of different people.
We know of cases of political manipulation where 60,000 versions of a political ad were sent out with slight variations to lots of different people, to figure out what's most effective. And what's most effective on you might be different than what's most effective on me. And we can see different political messages about the same candidate and have completely different takeaways, because we're getting different messages. We're not getting the billboard on the highway where we're all seeing the same thing. These are totally, totally different forms of manipulation.
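The mechanics of testing that many variants are simple to sketch. Below is a toy epsilon-greedy loop in Python, a common automated-experimentation pattern; the variant count, click rates, and impression budget are all invented, and this is not a claim about any specific ad platform's implementation.

```python
import random

# Toy automated ad-variant testing (epsilon-greedy). All numbers invented.
random.seed(0)
N_VARIANTS = 200   # real campaigns have reportedly run tens of thousands
true_ctr = [random.uniform(0.01, 0.05) for _ in range(N_VARIANTS)]

shows = [0] * N_VARIANTS
clicks = [0] * N_VARIANTS

def measured_ctr(i):
    return clicks[i] / shows[i] if shows[i] else 0.0

def pick_variant(eps=0.1):
    """Mostly show the best variant found so far; sometimes explore another."""
    if random.random() < eps:
        return random.randrange(N_VARIANTS)
    return max(range(N_VARIANTS), key=measured_ctr)

for _ in range(20_000):                # every impression is one tiny experiment
    v = pick_variant()
    shows[v] += 1
    if random.random() < true_ctr[v]:  # simulated user response
        clicks[v] += 1

best = max(range(N_VARIANTS), key=measured_ctr)
print(f"winner: variant {best}, measured CTR {measured_ctr(best):.3f}, "
      f"true CTR {true_ctr[best]:.3f}")
# No human compares 200 versions; the loop discovers what works on whom.
```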
John Buchanan:
And also, I came home early tonight to do this, and I was talking to my wife. She just told me a story about two people that we know: one posted something on Facebook and the other disagreed with it. And they had a little spat. And then the person cut her off and unfriended her. So it just shows that not only does it feed you that bubble, but then it gives you the mechanism to say, "I don't even want to have any more conversations about this."
Jeff Orlowski:
And one of the ways we actually got into the film in the first place was because of our past work on climate change. We had made two films about climate change. The science was known and understood and established, and we were trying to go out, like, "Wait, we need to solve this issue. We need to get more people to see it. We need to get more people to talk about it." And we would come across Americans all around the country who denied climate change. They had completely different facts. And I was there banging my head against the wall, like, "How is it that people are coming with different information?" They have a different belief system that is not associated with the science, not associated with what we know from academia and what the research says, yet they're absolutely convinced of an alternate reality. And it's because that's what they're being sold on a daily basis.
One of the examples that we put in the film: if you do a Google search, even Google, I'm not even talking about Twitter or Facebook giving you polarized, customized feeds. If you go to Google and do a search for "climate change is," you will get different autofill suggestions based on what Google assumes about you, based on who you are, what you searched for in the past, where you live in the country or the world. And all of these results have nothing to do with what is true. They have to do with what Google assumes you might be interested in, based on what's trending, based on all these other factors.
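Here is a minimal sketch of that idea in Python. It is emphatically not Google's actual system: the candidate completions, the profile signals, and the scoring are all invented. It only shows the shape of the mechanism, completions ranked by an inferred-interest profile, with nothing anywhere representing truth.

```python
# Invented candidate completions for the query "climate change is",
# each tagged with a made-up topic signal. Nothing here models accuracy.
CANDIDATES = {
    "a hoax":           "skeptic",
    "not real":         "skeptic",
    "real":             "science",
    "caused by humans": "science",
}

def autofill(profile):
    """Rank completions by how well their tag matches the user's
    inferred interests (a stand-in for search history, location, etc.)."""
    return sorted(CANDIDATES,
                  key=lambda c: profile.get(CANDIDATES[c], 0.0),
                  reverse=True)

# Two hypothetical users, profiled from past behavior.
user_a = {"science": 0.9, "skeptic": 0.1}
user_b = {"science": 0.1, "skeptic": 0.9}

print("user A sees:", autofill(user_a)[:2])
print("user B sees:", autofill(user_b)[:2])
# Same query, different people, different suggestions: the ranking
# optimizes for assumed interest, and truth never enters the score.
```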
So we have a system that inherently gives different information to different people. And when you scale that out... one of our subjects, Tristan Harris, always referenced it like this, and I always love thinking about it: zoom back out and look at the ant colony that is planet Earth. You've got planet Earth, and we're looking down and we see seven, eight billion people zipping around. And now you put something in each and every one of their pockets that can whisper a different message to somebody who is leaning a little bit to the left and somebody who is leaning a little bit to the right. And you let that run. I mean, it's compound interest, right? You are feeding an algorithm over and over and over again, and it's reinforcing whatever you feed into it. And now, after a decade, you get something far left and far right from two people that might have been very, very close years ago. What does Thanksgiving dinner look like for countless people? It's getting harder and harder.
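The compound-interest framing can be made literal in a few lines of Python. This is a toy model, not anything from the film or a real platform: the 1% daily nudge and the ten-year horizon are invented, and "leaning" is just a number on a [-1, 1] spectrum.

```python
def drift(lean, days=3650, rate=0.01):
    """Each day the feed nudges the user slightly further in the
    direction they already lean (reinforcement proportional to lean)."""
    for _ in range(days):
        lean += rate * lean
        lean = max(-1.0, min(1.0, lean))  # clamp to the spectrum
    return lean

print(drift(-0.01))  # barely left on day one  -> -1.0, pegged far left
print(drift(+0.01))  # barely right on day one -> +1.0, pegged far right
# Two nearly indistinguishable people, compounded apart over a decade.
```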
John Buchanan:
Yeah, I guess the fundamental question would be: did they discover that it's a better business model to feed people these untruths and conspiracy theories, versus, to your point, whispering in everybody's ear, "You should love your neighbor and be together"? That was not a model that was going to make them money. Did they realize that, or did the algorithm just take it over?
Jeff Orlowski:
I think there's a blend: there are certain things that were intentional and designed, and other things that were discovered and accidental. And there's kind of a big gray zone. The intentions seem to be very clearly around growing the advertising business model, growing users, growing revenue. And from a business perspective, you can understand that: they were just trying to be a successful business. So in some cases, there were a lot of unintended consequences of that original pursuit.
A lot of these companies never expected to get as big as they became, either. Twitter started off as an art project and grew; to think that now Twitter is the center of global politics in some ways. But one of the things that they discovered is that extreme emotions and negative emotions tend to track more favorably than positive ones. Something that gets you angry and pissed off and ruffles your feathers is something that people spread very, very easily. And so that's one of these unintended side effects: in terms of what types of emotions are more viral, the negative ones tend to win. We're not seeing messages of love going as viral as, "Can you believe what so-and-so said?"
And this is one of the sort of discoveries that I think unfolded over the course of time, yet has had massive and exponential consequences around what type of content is spreading on these systems.
John Buchanan:
Anybody who reads the comments on any stories on the internet knows it's a very depressing activity, because you see that 80% of them are nasty, negative commentary.
Jeff Orlowski:
One of our subjects shared this with us: these platforms have sold us on the idea of connection, but they're not really connection, they're connectivity. And so, you and I right now, I can see your face, you can see mine, we're responding to each other. There are mirror neurons in the back of our brains that are engaging, and we're creating empathy in the course of a conversation. This is one of the reasons why film is such a powerful tool, because we have a psychological empathy response. Take away synchronous video, audio and video at the same time, and you break that down. How much can you convey in a still photograph? Break that down even more. How much can you convey in text? Break that down even more. How much can you convey in one hundred and forty characters?
So, we have these systems that have in many ways decontextualized the nuances of life, the gray zone. So much can be so easily misconstrued from a tweet. And then I retweet it and decontextualize it even more, and then somebody else adds on top of it. That's the same thing you're talking about in the comments thread. And everybody wins points for how aggressive you can be, how snarky you can be, how witty you can be. Because the more you rip somebody else apart, the more affirmation you get. And now, we have a whole system that is affirming ripping other people apart.
John Buchanan:
Yeah, fascinating. I'm going to switch gears a little bit: can you talk about social media addiction and describe the design techniques behind it?
Jeff Orlowski:
So, part of the problem here is that the engineers are just so good at their jobs, and they built these algorithms that were incredibly effective. A big shift happened when, if you remember the platforms in the old days, everything was chronological. If I posted something on Facebook, it would show up on your wall in chronological order. But at a certain point, and I forget the exact years, they moved away from that system. They figured, "Well, wait a second, how can I keep John as engaged as possible?" And it's not chronological order that is the most engaging. What if I switch the order and let a computer figure out what resonates with John: "Here's a post that your wife made yesterday. Here's a post from another family member. Here's a post from some interest that you have."
And if I show them in a different order, in a different way, based on all these factors that we know about you, because of everything we've collected about you, I can increase your usage. You used to spend 28 minutes a day on the platform; now I got you to spend 34 minutes a day. Hey, you're using it more, so that must be better, because you like the product. Great, you're using it for an hour, you must love our product. And that was the early impetus around the story that I think a lot of the engineers convinced themselves of, that they were building tools for the user. The engineers made this assumption that time usage was a metric for quality, for improvement in life, for what the user wanted.
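That shift, from timestamp order to predicted-engagement order, is easy to see in miniature. The Python sketch below is purely illustrative: the affinity and interest numbers are invented stand-ins for what a real system would learn from collected data, and this is not any platform's ranking code.

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    topic: str
    timestamp: int  # arbitrary units for the demo

# Hypothetical per-user signals; a real system would learn these.
AFFINITY = {"wife": 0.9, "family": 0.7, "acquaintance": 0.2}
INTEREST = {"hiking": 0.8, "news": 0.5, "ads": 0.3}

posts = [
    Post("acquaintance", "news",   timestamp=3),
    Post("wife",         "hiking", timestamp=1),
    Post("family",       "ads",    timestamp=2),
]

def chronological(feed):
    """The old wall: newest first, same for everyone."""
    return sorted(feed, key=lambda p: p.timestamp, reverse=True)

def engagement_ranked(feed):
    """The shift: order by an invented predicted-engagement score."""
    def score(p):
        return AFFINITY.get(p.author, 0.1) * INTEREST.get(p.topic, 0.1)
    return sorted(feed, key=score, reverse=True)

print([p.author for p in chronological(posts)])      # newest first
print([p.author for p in engagement_ranked(posts)])  # stickiest first
```

Same three posts, two different feeds; only the second one is optimized to keep you there.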
And it was just a metric that was easy to count, and that also made them more money. It was a way to sell and place more advertising. So the problem was that we were attaching the design to this business model in a way that wasn't ideally suited to the user. So now, we could spend hours and hours a day scrolling infinitely. One of the subjects that we met and worked with, Aza Raskin, invented the infinite scroll. Before, when you would scroll and get to the bottom of the page, you had to click to the next one. And there was a change in the technology that allowed them to skip that and just let you scroll forever.
And we know from the great psychology research on the bottomless bowl: if you're given a bowl of soup with a tube feeding more soup into the bottom, you'll eat a huge amount of extra soup, because it just keeps feeding you more and more. That's what they discovered with infinite scroll. So, they kept designing all these little techniques that let it grow really, really fast. And this is one of the challenges. They use growth tactics, different tactics to get the size of the network to grow faster and to get more and more people to spend more time on it. Those are very, very intentionally done.
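Mechanically, the change is small, which is what makes it interesting. Here is a hedged sketch in Python, a toy, not anyone's production code, contrasting a paged feed, where continuing requires a deliberate click, with an infinite one, where the stopping cue is designed away.

```python
import itertools

def server_stream():
    """Stand-in for a backend that can always return another post."""
    return (f"post #{i}" for i in itertools.count())

def paged_feed(stream, wants_next_page, page_size=10):
    """Old model: a page ends and the user must actively click 'next'."""
    while True:
        yield from itertools.islice(stream, page_size)
        if not wants_next_page():  # a natural stopping point every 10 posts
            return

def infinite_feed(stream):
    """Infinite scroll: the next batch loads automatically as the user
    nears the bottom, so the stopping cue never appears."""
    yield from stream  # no seam, no decision, no end

clicks = iter([True, True, False])  # user clicks "next" twice, then stops
paged = paged_feed(server_stream(), wants_next_page=lambda: next(clicks))
print(sum(1 for _ in paged))        # 30 -- reading ended at a deliberate choice

endless = infinite_feed(server_stream())
print(len(list(itertools.islice(endless, 1000))))  # still going at 1000...
```

With the paged feed, stopping is the default and continuing costs a click; with infinite scroll, continuing is the default and stopping costs willpower, the bottomless bowl in code.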
John Buchanan:
And so, in terms of this concept of addiction, in kids in particular, based on your research and filming, what advice are you giving parents in terms of social media?
Jeff Orlowski:
Right. Let me answer your question in just one second, because of the way we've been talking about this... What works on you, what an algorithm can reverse engineer about you and get you to spend more time on a platform, might be different than what works on me, and it's different than what works on a 16-year-old girl. The algorithm knew that I was really interested in politics. During the 2016 election, politics just had me coming back all the time. But for a young teenager who is just learning to develop social skills and interact with friends at school, and all of those friends are now on these social platforms together, it has figured out what works on that mind.
And it's not necessarily the politics, it's so much of the social connective tissue that we are literally learning at that developmental stage in life. There's an entire new layer being applied to teenagers, where they're not only learning social skills to navigate the world, they're having to do that on these digital platforms that have gamified this entire world. So there's the social pressure for a young teenager: "How many likes did your post get? How fast did the likes come in? Who did the likes come in from?" If it didn't get enough likes, you delete it.
The entire notion that you have to change who you are to get approval, literally put the filters on, beautify your face, change yourself so that you get more affirmation from your friends, is what we're training an entire generation of youth to do. My mindset is: wait till high school at the earliest, or don't even get on to social media. There's a big difference as well between screen time and social media. The big problems we're seeing amongst teenagers, the emotional challenges we're seeing among teenagers and Gen Z, are more associated with the social media platforms than with screen time itself. As a reminder, YouTube also falls under social media.
These are addictive platforms that change the way we think and see the world and see ourselves. I am all for technology, I love technology. The fact that we can be in two different parts of the same city right now, and have this conversation efficiently, documented for anybody around the world to see, is a massive application of technology. But once again, here we are the customers, using it for our intentions and our purposes, and it's not some hidden, secret financial business model puppeteering the whole system.
And this is one of the exciting things: if a parent wants, there are all of these new technology platforms and phones coming out that can help a kid stay connected without social media, without news, without ads, without any of that stuff. You still want to be able to call your kid or send them a message to figure out when you're picking them up from school. Those are valuable technological features. Yet we don't want to blend them in with all the dark sides of social media.
John Buchanan:
Right. Okay, so where do we go from here, to a certain extent? So, the business model... privacy doesn't pay yet, but it could in the future. And you think about Apple, which is clearly on that side of, "No, we create things and people pay for them." Because it's hard for me to understand how you're going to see Facebook shrink until, I mean, obviously there are the government and regulatory issues, but they need a competitor, maybe a platform that says, "No, you can monetize yourself, but you're actually going to get paid for it instead of Facebook."
Jeff Orlowski:
And I'm hearing of those platforms. Literally, I had a call earlier today with a platform that's trying to develop a model like that, where users own their data, where users can be paid for their data. I think we're going to see a huge shift. I think the micro-targeted, surveillance-driven advertising business model is not going to last long, I hope. I certainly, certainly hope so, because in terms of the asymmetric negative impacts happening on the planet, it's one of the places where this particular business model is causing countless societal ills.
And I really do hope we can figure out a way off of it. And that's either going to come from regulation, or it's going to come from users leaving the platforms, or it's going to come from, as you're saying, somebody inventing a better thing. Social media can be so much better. Imagine a platform that really made you feel deeper and more connected to your close friends and family after you used it. Social media leaves most people feeling vapid and empty after they use it. You feel more depressed after you use it because you just saw everybody else's highlight reel. That's how I look at it.
It's like everybody posting their best selves, and you're there at home in your pajamas thinking, "Oh, I'm not on vacation right now." One of our subjects shared this with us: on any given day, somebody that you know might be on vacation, but you're not on vacation every single day.
And so, when you look at it from that perspective, I want technology that's going to deepen the connection with my friends and family, where I have stronger ties. Technology that makes me feel more informed about the world. That gives me a better sense and a better understanding of what's going on, that increases my trust in truth as opposed to breaking it down.
John Buchanan:
Sure. Just a question about the storytelling. You use a fictional storyline in the film, which I hadn't seen in a documentary before, though I don't know if that was brand new or you had seen it elsewhere. But ultimately it worked. I have to say, you start off the film and you're like, "Wait, what is this?" And it's kind of quirky. But by the end, you're pulled into it. How did you come up with that idea?
Jeff Orlowski:
Yeah, thank you. And it was a big gamble for us, in large part because it hadn't really been done in this fashion before. There are documentary films that have used recreations or reenactments, but in most cases that's trying to literally show what the subjects are talking about. This was a different approach, where we were having a separate storyline that was running parallel to the documentary. The hope here was, "How do we make the story as accessible as possible?" We easily could have had a wall-to-wall hour and a half, or 10 hours, of subject matter experts talking nonstop.
And at a certain point, you get fatigued when you just hear the nonstop talking. When we were doing all the test screenings and looking through the material, it was like, this is really, really dense, and we have ways that we can bring this to life and visualize it. So those are some of the seeds that helped shape the thinking. The Big Short was a big inspiration for me, where Adam McKay took a really complicated subject and made it accessible through the story. And there were things that they needed to explain to the audience, so they broke the fourth wall and had somebody talk directly to the camera and explain, "This is what this financial apparatus is." And then they went back to the narrative.
And we kept referencing: what is the documentary version of that? If you take that and flip it upside down, we have a documentary that can give you levity, break up the heady ideas, and make the thing more accessible, and at the same time, can we visualize it? Can we show people: what is an algorithm? Nobody knows what an algorithm is. Nobody knows what the code is and how it operates, and yet it is constantly testing on each and every one of us. Can we give somebody something to look at that brings that concept to life? What's the visual metaphor for that?
And so, I feel like our team was able to really rise to the challenge and figure out ways to communicate those very complicated ideas and make them a little bit more relevant. I hope that audience members can watch the film and then pick up their phone afterwards and, picturing Vincent Kartheiser on the other side of it, go, "Why does this notification come right now? Why am I seeing this thing? What exactly is going on?"
John Buchanan:
One of the things, I don't know if you know the answer to this, but I assume they're also listening, because you talk about, whatever, Japanese steak knives, and all of a sudden they show up in your feed on Instagram.
Jeff Orlowski:
In some cases they are, in some cases they aren't. And the scariest cases are the ones in which they're not. So Facebook and Instagram, from everything I've heard, do not listen. Yet their ability to predict what you're talking about, or know what you're talking about, is so scary. And the scary part is that they don't even need to listen. They know that 10,412 people just like you are talking about Japanese steak knives right now. And I think the odds are, like, 82.6% likely that you're going to be interested in these Japanese steak knives, so let me show them to you as well.
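That "people just like you" logic is, at heart, nearest-neighbor prediction. Here is a tiny illustrative sketch in Python; the behavior vectors, the purchase flags, and the use of cosine similarity are all assumptions for the demo, not a description of Facebook's or Instagram's actual models.

```python
import math

# Invented toy data: per-user behavior vectors (e.g., topic engagement
# on cooking, sports, home goods). Nothing here comes from a real platform.
USERS = {
    "you": [0.9, 0.1, 0.8],
    "u1":  [0.8, 0.2, 0.9],
    "u2":  [0.9, 0.0, 0.7],
    "u3":  [0.1, 0.9, 0.1],
}
BOUGHT_KNIVES = {"u1": True, "u2": True, "u3": False}

def cosine(a, b):
    """Standard cosine similarity between two behavior vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def predicted_interest(target="you", k=2):
    """Average the behavior of the k most similar users ('people like you')."""
    others = [(cosine(USERS[target], v), u) for u, v in USERS.items()
              if u != target]
    top = sorted(others, reverse=True)[:k]
    return sum(BOUGHT_KNIVES[u] for _, u in top) / k

print(f"P(interested in steak knives) ~ {predicted_interest():.0%}")
# No microphone needed: people whose behavior looks like yours were
# interested, so the system bets that you will be too.
```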
And that's what's actually going on: they're using big data to create these correlations around who you are. And this is where it goes deep and gets kind of dark for a second. One of our subjects said algorithms don't predict the future, they cause the future. You give algorithms historical data, and they reproduce what they learned from that data. Algorithms perpetuate the status quo as opposed to transcending it.
So, some dark examples. There were hiring algorithms built for companies on the idea of, "We want to hire people like those who have done well at our company." When you feed in the data, some algorithms recommended that a particular company hire white men named Jared who played lacrosse, because historically, white men named Jared who played lacrosse did well at that company. So it recommended hiring more people just like that.
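A minimal sketch of how that happens, using an invented "model" that just counts which features co-occurred with past success. The records and the scoring are made up, and real hiring tools are more elaborate, but the failure mode is the same: the training signal is the historical pattern itself.

```python
from collections import Counter

# Invented historical records: who was rated "successful" at the company.
history = [
    {"name": "Jared", "sport": "lacrosse", "successful": True},
    {"name": "Jared", "sport": "lacrosse", "successful": True},
    {"name": "Maria", "sport": "soccer",   "successful": False},
    {"name": "Jared", "sport": "lacrosse", "successful": True},
    {"name": "Priya", "sport": "tennis",   "successful": False},
]

# "Training": count which feature combinations co-occur with past success.
success_profile = Counter()
for rec in history:
    if rec["successful"]:
        success_profile[(rec["name"], rec["sport"])] += 1

def score_candidate(name, sport):
    """Higher score = more like the people who historically did well."""
    return success_profile[(name, sport)]

print(score_candidate("Jared", "lacrosse"))  # 3 -> strongly recommended
print(score_candidate("Maria", "soccer"))    # 0 -> filtered out
# The model has learned nothing about ability; it simply replays the
# historical pattern, perpetuating the status quo it was fed.
```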
So if the algorithm knows that John likes this particular type of Japanese steak knife, it's going to keep showing you these particular things, and it's going to keep pushing you towards things more like that. And this is where, in some ways, we are already living in the Matrix; we don't even realize it, but we already live in the Matrix. I've been sharing this: if you get your news and information from an algorithmic feed, I am not going to trust your understanding of the world as much, because the information you're being fed has been pre-filtered by an algorithm for years now. It's giving you, in my mind, a warped sense of reality. And apply that to everybody who's getting their news from social media; that is sort of the state that we're in.
John Buchanan:
I was reading something about a study Amazon had done. They basically figured out that if they just send you stuff, people will keep 75% of it, because they know what people need.
Jeff Orlowski:
And it's also more work to send the thing back.
John Buchanan:
Right, that's true. So, Jeff, we're kind of coming to the end here. I have a couple more things. One is what's next? What are you working on now?
Jeff Orlowski:
As you were sharing earlier, our team is really focused on impact campaigns. One of the things that we do at Exposure Labs is we make films, and then we do campaigns that go hand in hand with those films. Here's where this comes from, and I'll tie it all together in a second. As we released our earliest films, Chasing Ice and Chasing Coral, we were quickly seeing that the people who understood what was going on with climate change were the people who came and saw those climate change movies.
It's the same filter bubble problem: who's going to spend money to go to a movie theater to watch a movie that you disagree with? So what we were realizing was that the people who needed to see our movies weren't the people who were naturally coming to us. We had to do the hard work to take the movies to other places and to engage the conversation in other places, and that sort of set our team on this path of: you've spent all these years making the movie, how do you maximize the impact that can come from that film? And that's different for every scenario.
But in the case right now with The Social Dilemma, our team is investing heavily in how we leverage this film and the story to have maximum impact. In our mind, that means changing the way the technology is used, the way it's regulated, and the way it's designed. So, engaging with the public to build a better understanding of what's going on, so people can protect themselves in how they use their own technology. There are a bunch of resources and tips on the website, thesocialdilemma.com.
How do we change the way the technology is regulated? Engaging with politicians to better understand what's at stake, what exactly is going on. How do we fix it? How do we regulate? How do we move off of this business model? And then lastly, how do we actually change the way the technology is designed, and what influence can be had in Silicon Valley itself? How do we move these companies off of these business models? So, our team is really focused on: how do we get into schools, how do we educate youth? How do we help teens who are struggling with mental health issues understand the role that technology is playing here? How do we engage with politicians? All of these questions are what our team is working on. So that's a real, real big focus for us.
John Buchanan:
That's great. I was watching a congressional hearing, and you had a congressman talking to one of these tech executives, literally using an easel with a poster board behind him. Just the dichotomy of that was mind-blowing.
Jeff Orlowski:
One of my personal favorite things: during one of the last Senate hearings, Senator Lindsey Graham asked Mark Zuckerberg and Jack Dorsey if they had seen The Social Dilemma and encouraged them to watch it. I was like, that's an awesome win.
John Buchanan:
That's a win. Yeah, for sure. That's very cool. Well, thank you for being here. Just one last thing on a totally different note: it says in your bio that you have two Guinness World Records, and that piqued our interest. Can you tell us what those are?
Jeff Orlowski:
So, for Chasing Ice, it was a film documenting how glaciers were changing. And there was a point during the summer of 2008 when my friend Adam LeWinter and I lived on the side of a glacier for the summer, and we were just filming this chunk of ice all day, every day. Literally, the two of us, 24/7, waiting for something to happen. And what we happened to capture, it's in that film and you can find it online as well, was the largest calving event that's ever been documented, which was also the longest calving event. It was over a 90-minute event that we filmed. And the volume of ice was equivalent to lower Manhattan breaking off and tumbling, except even taller than the buildings. So it was a monumental event to be able to witness. We shared that in the film, it's online, and Guinness recognized it.
John Buchanan:
Oh, that's very cool. That's awesome. Well, thank you so much. Again, Jeff Orlowski, The Social Dilemma. We encourage everybody to have a look at it. It's an important film and fantastic. So, thanks a lot, Jeff. And we'll see everybody else next month. Thank you.
Jeff Orlowski:
Thank you so much.
John Buchanan:
Thanks.