Unlearning the World: Through the Looking Glass

The Dark Side of Social Media: How Algorithms Shape Our Truth

Richard, Beki, and Shannon Season 1 Episode 4

What’s a moment in your life that changed how you see the world?

Episode Summary
In this episode, we dive into the powerful and sometimes dangerous influence of social media and digital platforms on democracy, free speech, and how we perceive truth. Richard, Beki, and Shannon discuss the role of media in shaping our perspectives and emotions, from the content we consume to the algorithms that guide our feeds. We explore the fine line between empowering voices and spreading misinformation, and how critical thinking is essential to navigating today's fast-paced media landscape. Join us as we examine the impact of social media on our world, from TikTok to Twitter/X, and why it's crucial to be mindful of the information we consume.

Key Points Covered This Episode

  • The impact of social media algorithms on our perceptions of truth
  • The fine line between free speech and misinformation
  • How media can empower voices or fuel division
  • The importance of critical thinking in an age of instant information
  • The dangers of consuming content without questioning its source
  • Exploring the balance between connection and manipulation on platforms like TikTok and Twitter/X
  • Why the speed of media and news dissemination makes us more susceptible to misinformation

Takeaways

  • Media and social platforms have a profound influence on how we see the world, often shaping our beliefs and emotions without us realizing it.
  • While social media can empower voices, it also amplifies misinformation, requiring us to be vigilant about what we consume.
  • Critical thinking is crucial when navigating the vast sea of information on the internet, especially with the rise of AI and deepfakes.
  • Balance is key: use social media thoughtfully, be aware of your information sources, and always question what you see and hear.
  • As media evolves, it’s essential to stay informed, question narratives, and ensure that we're not passively accepting the information being presented to us.

Thank You for Listening!
If you enjoyed this episode, subscribe, leave a review, and share Unlearning the World with someone who could use a new perspective. Together, we can learn, grow, and see the world differently.

See you next time!

Support the show

Unlearning the World: Through the Looking Glass is produced by Crowned Culture Media LLC.

Original theme music by The Dj Blue.

[00:00:00] Richard: There was a time when I felt overwhelmed by sadness, yet I couldn't figure out why I was feeling so low. Then a friend pointed out something I hadn't considered. My playlists had been dominated by Amy Winehouse's Back to Black, Adele's 21, and the emotional, soulful sounds of Nina Simone. While my sadness wasn't entirely because of my music choices, it was clear they amplified the heaviness I was feeling.

[00:00:27] This moment made me realize something profound. Media, whether it's music, social media, news, or entertainment, holds incredible power in shaping how we view the world and ourselves. The messages we consume shape our emotions and perspectives in ways we don't always recognize. Today, with the speed at which media moves, the impact is even more profound.

[00:00:53] Information that used to take hours and days to reach us now arrives instantly, right in the palm of our hands. [00:01:00] Social media has become a critical platform for free speech, a tool for spreading information, and a powerful connector for people across the globe. But with this power comes responsibility, both for those creating content and those consuming it.

[00:01:16] It's no longer just about choosing what to post or share. It's about being discerning about what we allow into our minds. Welcome to Unlearning the World: Through the Looking Glass, where we explore moments, learnings, and experiences that changed how we saw the world forever. I'm Richard, and I'm joined by Beki and Shannon.

[00:01:38] In this episode, we dive into the profound influence of media on democracy, focusing on the dual role of social media. On one hand, it's an essential tool for spreading knowledge and empowering voices. On the other, it can amplify misinformation, distort perspectives, and reinforce bias. As we explore the complex landscape of modern [00:02:00] media, we also examine how it connects us, shapes our truths, and challenges the way we think, sometimes without us realizing it.

[00:02:08] The need for critical thinking has never been more urgent. Let's get started. With the way that all the social media platforms are being run right now, I mean, even TikTok, everybody was sad to see it go, but the way it came back, I think it turned a lot of people off, myself included. I agree. I worry that it won't be the same platform that it has been.

[00:02:31] And then you talk about, I'm definitely not going to, I never touched Truth Social. That's just, that's just the truth. That's the actual truth, or you're gonna call that a fact. And I haven't messed with X since it's been Twitter. And everything with Meta, the stuff that Mark Zuckerberg has been doing, you know, it's kind of like one of those things. You were never necessarily crazy about the dude and the stuff that he did, but then you kind of hit that line.

[00:02:57] I'm at that line. I really wanted to delete [00:03:00] all of my platforms and not participate in any of them. But you think about having a business, you kind of need them. You kind of need them to promote, and there are certain people that I only talk to through social media. So do I leave these platforms and take my voice away because I might be pushed down for the things that I'm saying?

[00:03:22] Do you leave it completely, or do you stay on it and be a part of the cog, where you're still helping make these companies money? I don't know. I'm really at a crossroads. I'm somewhere between kind of staying on a couple of them and just deleting everything.

[00:03:38] Shannon: Listen, no, I'm, I am keeping all of my socials.

[00:03:43] This is just my own opinion, because I manage other people's socials. I manage someone's Twitter/X, so I can't remove myself. This is how I stay astute and attuned to what's happening with algorithms and trends. So part of that is [00:04:00] in my job description. So that's one. So I can't, but also I think it's not about leaving.

[00:04:06] It's just how you show up in these spaces, right? Because I think that if you remove yourself, there are people that listen to your voice. There are people that follow you. There are people that connect with you on these platforms. You don't rob them of that. You just watch what's going around, right?

[00:04:25] Watch what's happening on TikTok. See how it's showing up. I don't think it requires exit. I think it requires pivot. I think you need to pivot how you utilize it, pivot to what you put on it, what you showcase. Like, I don't always want to post my kid anymore. So there's just certain avenues I won't do. I don't post my whole life on there, you know, but I'm not leaving.

[00:04:46] I, I like to learn and I like to observe. And part of observing is keeping this platform to see what's going to happen. That's just my personal anecdote or opinion. 

[00:04:56] Beki: Yeah. I think I pulled back really [00:05:00] far a long time ago. And I don't have a real deep and abiding desire to dive back in. There are a couple of accounts that I may delete just because I'm out there maybe once every four to six months.

[00:05:18] And I don't think that's the point of social. The only concern I have about that is international family and friends, that that is a mechanism to stay in touch. And I don't really worry about giving people too much money through those. I'm not enough of a participant that I'm feeding anybody's need financially, um, because I'm not paying for the account and I'm not participating in the revenue generating things that at least that I'm aware of.

[00:05:47] But I do know that I will go deeper and deeper into observation only and intentional scrolling. Many of the apps have been off my phone [00:06:00] for a very long time because I just don't want it to be that convenient. 

[00:06:06] Richard: Man, I see. It's, it's almost like running away. I think if we took out like the ad revenue portion of it, it would be completely different for me.

[00:06:16] But it's like I almost feel like I'm abandoning and giving up. And that is my knee jerk reaction. It's the kind of, you know, hey, I'm taking my ball and I'm going home. When it's people who are not going to leave the app because they're not aware, or they don't care about the things that are going on. And if I am a voice that somebody listens to, why take away a source of good and give in to evil?

[00:06:41] Not saying that it's good or evil necessarily in those contexts, but You know, just, you know what I'm saying? So, so like why give in and take my voice away, a voice that's trying to speak goodness, it's, it's just really hard. Cause it's almost like one of those things to relate it back to something else.

[00:06:59] When a [00:07:00] musician does something that you find appalling, but you love their music. It's like, I stopped listening to their music because I feel like, by listening to their music, I'm condoning that behavior. So, I don't know. I mean, there's so many morally gray areas. It's like, I stopped listening to this music I love, like Kanye West's first and second album and even the third one to an extent are some of my favorite pieces of music of all time.

[00:07:30] But the rhetoric and the things that he's done over the years is, I can't even rock with it. So it's like, I don't, it's not even like, I don't listen to the new stuff. I don't listen to the old stuff either. 

[00:07:42] Shannon: I will say this because I'm always, I'm always this. There has to be a balance, right? You, you want to go into politics.

[00:07:49] You want to go, there's the left and the right, but there's a good amount of people that are in the middle. And I think it's important that that middle, that central group, that balance that [00:08:00] can see both sides, or just really wants to connect with somebody that feels the same way, that they're represented.

[00:08:06] And so I think it's important that those people don't go away on these platforms, because then it really will be like the Wild Wild West on these platforms. So I think come up with your own boundaries of how you want to utilize it and show up. And if you want to remove it, cool. But for me, it was just some of them I would not be on if I wasn't, like, professionally required to be.

[00:08:33] Um, but in general, I think there is power in the balance of people staying on that are in the middle and that could provide insight and input and information, um, that do have the facts, that do have the information that sets the tone, right? Cause there, everyone has to coexist. And so I think if people pulled away, that would make it easier for, like, this, [00:09:00] you guys know what I'm saying, negative stuff to happen, left or right.

[00:09:04] Beki: Right. Well, and I think there's the other side of that too, which is that it comes back to your point about personal choice. Part of it is, you know, when, when I grew up and they talked about stop, drop and roll with fire, I kind of think about that as stop, block and scroll, where it becomes this idea of stop looking at it as much as you did block the people who, or, you know, stop following them.

[00:09:31] I guess that would be a second stop. The people who are writing things that you find incredibly offensive, because you can't not know whether it's music, whether it's a post, whether it's whatever. And if it has offended your core for a long period of time, stop them or block them. And then scroll through the things that do fill up your life.

[00:09:58] This is supposed to be [00:10:00] something that fills us up and gives us something. It isn't meant to rip us down. It was, my belief is it was never constructed, at least the ones I'm on, were not constructed to shred people's hearts. 

[00:10:15] Richard: A lot of the perceptions that we have about different things, Shannon talked about it in the election episode about the proximity.

[00:10:22] And so many of us are connected to people that we might not even ever have met in person, uh, through social media. So knowing how much social media can influence people, it's scary to see what's been going on with some of these platforms. TikTok was banned briefly. It was just for a short amount of time.

[00:10:42] And the CEO of TikTok made a TikTok video that played when you turned it on right before the ban, saying that they're only going away for a little while. And he specifically called out that he's happy to work with President Trump to get this taken care of. So to hear him say [00:11:00] that was really, like, it caught me off guard.

[00:11:02] I was like, really? So TikTok actually did turn off for maybe, well, less than 24 hours. And then when it came back on, there was a message that said, welcome back, and specifically called out President Trump for getting TikTok back online. The best comment that I've seen was a user posted a TikTok saying, and I quote, coming to remind people that if somebody sets your house on fire, then puts the fire out, you still punch them in the face.

[00:11:33] They set your house on fire. You don't need to thank that man, end quote.

[00:11:40] So boom, let's start there. 

[00:11:42] I can 

[00:11:45] Beki: absolutely say I would. Maybe I wouldn't punch him in the face, I'd be too afraid they'd punch me back, but I certainly would not be thanking someone for lighting my house on fire. I think that that would be fair. 

[00:11:55] Shannon: See, I'm always, again, I don't know if it's [00:12:00] age or life experiences, I always like to look at both sides, right?

[00:12:04] And I'm still of the generation that there was a time in my life where there was not social media. Okay. So there was a time in my life that I, I, young, there was MySpace, but still, like, before that there was nothing. There was AIM and then that was it. So I was before AIM.

[00:12:24] Richard: I wasn't going to call it. I remember when I signed up for my first AOL account. I was like, yes. Yeah. And it's still a long time

[00:12:31] Shannon: ago. Yes. But I had an existence before I was checking a page. You know, several, several times a day before I, there was a life that I had outside of social media. And there is a group of people at this point in time that don't know anything else.

[00:12:50] And I think specifically with TikTok, because I'm going to be honest, I have a TikTok, but it's like my Google for house cleaning and recipes. That's all I [00:13:00] really used it for. I didn't really post on it. Um, but there are people that live and breathe it. Like, I can tell you, someone close to me was like, I'm emotionally not okay that TikTok is going away.

[00:13:11] And so when you have that type of connection to an electronic or social media platform, it breeds room for people to take advantage of that. And I think that politics and power go very much along with how you get to those people. And this is not a plug, but just think about it, the marketing tactic and the strategy of the timing of when TikTok was brought back on and the write-up of thank you, all of that is strategy.

[00:13:41] So like, I, I need people to get off of, like, "social media is truth and everything on this platform is real," to understand that a lot of stuff behind it, a lot of the things that are there, it's all strategy. And people that haven't lived in a world outside of social media, I think, are very much struggling and will [00:14:00] struggle for the rest of their lives because it's, it's foreign to them and it's alarming to me, um,

[00:14:06] Beki: to see that.

[00:14:08] And I guess I, I think about this. In terms of what is it that creates that level of drama, trauma, concern, whatever it might be, that this would go away. And especially when it's a single platform, which actually perplexes me because I guess it's because of where it's owned. I, I answer my own question, but it's just one of these elements where I'm like, what draws that sort of fix that goes with navigating that feed?

[00:14:45] And I've gotten lost. I actually do not have a TikTok account specifically. I am actively considering the idea of reducing my social media footprint anyway, just because, similar to what I'm about to [00:15:00] say, I find that I lose a lot of time in being in that space and not getting a lot of value for it. I said to someone earlier today, I can go online and look for videos of kittens and puppies anytime I want to.

[00:15:16] And essentially that's how I got started on another social media: I wanted to kind of follow Formula 1 and I liked the cute videos, and those two things I could probably let go without any drama associated with that. And at the same time, I do also appreciate that when I added the otters and my workout training stuff and all of those other things that I started following, I had more of an affinity, but I can fill those things elsewhere if I want to.

[00:15:52] Richard: When I think about people feeling emotional about the ban, and I admit I was [00:16:00] feeling emotional about the ban, but it had less to do with TikTok and more to do with the U.S. government seemingly wanting to control the flow of information. So TikTok was not, it's not an American entity. Well, it might be by the time this airs, at least partially, as far as what's going on now, because it's very fluid.

[00:16:24] The thing that scared me is that that had been a place for free speech. It's people that you know, online through social media. Now, if you see the kids, the younger kids, they don't even give each other phone numbers. They say like, what's your Insta or what's your Snapchat? They exchange social media profiles.

[00:16:40] So I think people being emotional about TikTok and other platforms of that nature, it takes me back to the pandemic, when it was people that you communicated with, when the only way that you could get a hold of actual personal connections was doing it through social media. So I can kind of see it there.

[00:16:59] So [00:17:00] that's just one part of it. But I think the other part was that TikTok had been such a bastion of free speech. People were getting educated and learning about things that were happening all over the world that traditional news media didn't cover. So it allowed you to see that stuff and it allowed a lot of people to shape different opinions about different things because they were getting more information, and sometimes from multiple different sources.

[00:17:24] So I think that that was, it's kind of like you can't put the genie back in the bottle. I remember going without social media, and there's times where I do, like, I'll call it a social media fast, where I turn off all my social media, because I have had times where it has been overwhelmingly negative, especially, like, if you're having a struggle in your life and everybody's posting their highlights in their life and you're just seeing everybody's happy, and it starts to, it starts to really warp your perception of the world.

[00:17:50] Shannon: Yeah. 

[00:17:50] Richard: So, so I definitely understand that. But what social media has done as a whole is connect people that might've not ever been able to connect otherwise. [00:18:00] You think about before cars and planes. If you went to California from Michigan, then chances are you weren't coming back to Michigan.

[00:18:10] Right? You might not see, you're not going to see somebody, cause it's going to take you a long time to get to California using a wagon, like, talking about Oregon Trail. Right? So social media allows us instant communication with people all around the world. And I think that's more of the emotional thing.

[00:18:27] That I had to, to connect with. And with the way that everybody's been moving, you think about Elon Musk bought Twitter. Now X, your mama named you Twitter, I'm gonna call you Twitter, but Elon Musk has that. Uh, at one time, now-President Trump, he was kicked off of Twitter, before Twitter was owned by Elon Musk, and he went and he made something that ironically is called Truth Social.

[00:18:54] And so you think about the President of the United States being so closely tied to Elon Musk, now it's like, I can't [00:19:00] trust Twitter and I can't, I wouldn't touch Truth. And then you think about Mark Zuckerberg, who has Facebook and Instagram and WhatsApp, him cozying up to Donald Trump as well.

[00:19:15] So yeah. So, when we talk about consumption, it's very important to be able to trust what you're being fed, because YouTube is my favorite social media app of all time, over everything. Yeah. Sometimes I forget that it's social media. And I even have to be careful with some of the videos that it starts to feed me, because you have that autoplay, and sometimes you get into videos and it's like, hold up, this is veering off course.

[00:19:40] And if you're not paying attention, you can start to take in things that might shift your, your thoughts in a different direction if you're not guarding your own thoughts.

[00:19:50] Shannon: Yeah, I think to go back to something you said, so TikTok started off right in this thing of free speech, and it [00:20:00] was a space where people could speak freely and it connected internationally, right?

[00:20:07] But I think the switch happened. And I think this will always be, especially for America, the switch happened during COVID. So if you look at his first time around the block, President Trump, he didn't like it because it was positioning information, right? Maybe some information he didn't like. Free speech, right?

[00:20:26] Then COVID happened and people started making money off TikTok. Okay, you got TikTok famous. You were able to make some money off of that. Well, from an American standpoint, that really, honestly, is founded upon money. That's where the power is, right? Money. Example, Elon Musk. Now people are making a profit off this.

[00:20:47] We need to look into this. We need to figure this out. It's not about banning it. Now I want in on it. And fast forward, look where we are. I think it has to come to a point where, [00:21:00] when power and money start to come into it, you have to start questioning the context of what you're consuming and the content that you're consuming, because now things can be placed, now content can be made by anybody.

[00:21:12] It was bad at the time, but now that they're in on it, of course, it's great. Of course. So free speech is beautiful, but it also could be very negative. Free speech means that Donald J. Trump can say, hey, this is what it is. It's your responsibility to do your due diligence. And a lot of people in the social media area do not.

[00:21:32] They're like, the media is putting out false information, but where, where's your information? Where is that coming from? Right. And social media has shortened and quickened that length of time, so that a lot of people will just say, hey, because my algorithm is showing me 10 things that I want to hear, because again, that's what it is.

[00:21:51] The algorithm is based off of your perspective of the world. Everyone else is liars. The world is. [00:22:00] That's not right. That's misinformation. 

[00:22:02] Beki: And there is a difference between misinformation and disinformation. Yep. And then there's the whole aspect of just apathy, where it's the "well, you can't believe what anybody says" kinds of arguments that come up.

[00:22:17] And I absolutely agree with you that there is a responsibility to consider, wow, does that make sense? And here's the thing. I think that's, that's true in any kind of conversation that you're having with someone is to use those critical thinking skills and say, wow, that seems weird. And help me understand how that can be true.

[00:22:42] I don't care if it was news on television 40 years ago. I don't care if it's newspapers where it was printed. What I think that we're really kind of hitting on here with social media is anyone can post anything at [00:23:00] any time, and they don't need to have expertise in that area. They get to post opinions as though they're fact, and the fact is, the idea behind

[00:23:13] social media and consuming that social media is to remember that it's mostly opinions and perspectives. It doesn't make it true. Just because someone believes something doesn't mean it's true. It's just that belief has far more power than truth does.

[00:23:34] Richard: That's a great point. That's 

[00:23:35] Shannon: that, that, that whole thing.

[00:23:37] Exactly. 

[00:23:39] Richard: That's something you don't even think about because As much as it's great to be connected to so many people, it's also a negative. Something that made news recently was when Mark Zuckerberg came and he said that Facebook was moving away from content moderators and moving to a system like X.

[00:23:56] When he said that, he had lost me immediately. When he said he's moving to [00:24:00] something more like X, he completely lost me right there, um, to have people moderate the content themselves. And that's like, so you're having the people who make the content check the content, which is, you know, kind of a scary thing.

[00:24:13] Fox in the 

[00:24:13] Beki: henhouse, one might say. 

[00:24:16] Richard: So you think about how fast you can get information, and for so long it's been so good. And now you think about misinformation, disinformation, just wrong information, like people reporting, trying to report super fast and instantly. And not having all the facts and a lot of times the first thing that's out is the truth to most people.

[00:24:38] That 

[00:24:39] Richard: is, that is what it is. Like as soon as, as soon as it's out, that information travels so fast and the more people pile on to it, by the time the retraction comes, it's way too late. In most people's mind, guys, it is what it is. 

[00:24:54] Beki: There's science for that, though. I mean, that is literally a cognitive distortion, that whole [00:25:00] anchor trap, where, when you think about when you were a kid, and we all had siblings, right?

[00:25:06] And so you and your sibling are in trouble. What's the first thing that you want to do? You want to get your story to a parent or some kind of parent figure, right? Because as soon as you tell your version of the events, which may or may not be true, this is the first disinformation that ever existed, back in childhood, this whole idea of, okay, so if I tell.

[00:25:30] Then the other person who comes in has to overcome the story that I just told. And that's because our brains go, well, I know this now. And so now everything, it's also this cognitive, uh, confirmation bias, where now that I think I know this, it's hard for me to actually recognize that I'm trying to collect information, consciously or unconsciously, to affirm what I think I [00:26:00] already know.

[00:26:01] And so, yes, absolutely, that's why they want to be first to press in each of these situations: because that's the story that holds people, because when the story keeps on recycling, people start to tune out. Oh, I already know, so I don't have to listen to that. And

[00:26:20] Shannon: also, we live in a world where instant gratification is just, like, as the generations come, the shorter the attention spans are.

[00:26:31] I think there was a study of this. And they did different ages and how, like how quick it is. I think it's like three seconds, seven seconds, like it's gotten progressively shorter as the generations have gone along. So I think even that perspective of like social media has turned into like a gossip train, if you think about it, right?

[00:26:51] Right. Where, before, when it first launched, it was social. It was about connecting. It was about community, right? And then you had the media and [00:27:00] politicians and celebrities that were like, oh, what's this social media? I can make money off this. I can promote all this. And now it's like a whisper that turns into, like, a howl, right, where you just have to plant the seed and then move, because it will just, it will just go.

[00:27:18] And I think the fact that it's instant gratification, I could read one post and the algorithm's like, oh, you're interested in this? Let me, let me show you 30 more saying exactly what they're saying. And then you're like, there's nothing else. What are you talking about? It's not like I'm turning on the TV. Half the people don't even have cable anymore.

[00:27:36] No, no one's looking to say, hey, am I, am I seeing this anywhere else? It has become an engulfed kind of cycle of: if this is where you're putting it, if this is what you're interested in, if these are the people you follow, your world and scope of the news and your free speech literally revolve around what you want, right?

[00:27:56] And so there is no conflicting information. So when you hear [00:28:00] it, you're like, the media makes that up. That's no, that's not the full story. They're, they're altering this footage. And so it becomes very easy for the mind to be focused on one thing, because one, you got instant gratification. You got the information, you got it quick.

[00:28:18] And then everything else is supporting that information because your algorithm is set up for you. 

[00:28:24] Beki: Right, right. It, it, it feeds the beast that's already there. And I think one of the things that really stood out to me as you were saying that Shannon is that there's such a difference between the freedom of speech and the accuracy of the speech that you use.

[00:28:44] The freedom of speech doesn't actually promise that people are telling the truth. And that's where that critical thinking, that I was talking about before, comes into play. That I can say [00:29:00] anything, nearly, there's exceptions, right? You can't yell fire in a theater is a very common one. And that's because that spreads panic, which I might argue some of the information that is out there sometimes also drives panic, but not quite in the same way all the time as fire.

[00:29:20] in a closed space where people might end up trampling on each other, right? And so I don't expect everything that I'm consuming to be true. And just because the internet said so never made it true. Quote, Abraham Lincoln, talking about the internet. I'm just going to go with, he wasn't talking about the internet.

[00:29:43] Richard: I mean, but when you think about it, there are some notable, I will call them social media news people, that I like to get information from. The main one that I like to listen to is Philip DeFranco, who's been doing his shows for years, but it's so many of them popping up. When you think [00:30:00] about how social media has changed and become the place where most people get their news from, the social media news people do not have to go through fact checking and sources as rigorous as

[00:30:13] the regular news media used to. I say used to, because I'm not really crazy about the regular news media right now either. And the pseudo news media is one particular channel where some people strictly get their information from, but you know, I think it's a little bit, I think it's a little bit more dangerous.

[00:30:36] You think about now, so if you say it and you say it believably, a lot of people will be like, hmm, that makes a lot of sense.

[00:30:46] Shannon: I mean, Beki, I think you said it right. I love free speech. I love that America has given us the opportunity to say whatever we want to say without me getting [00:31:00] locked up, because probably I would have already been in jail, right?

[00:31:03] Okay. So thank you for the free speech, but free speech could be opinion based, right? Like most of it is. Well, yes. So again, it just gives me the opportunity to say whatever the heck I want to say. So that's why I said I find it interesting: part of free speech, don't yell at me, is that hate speech is under that.

[00:31:25] Right? Because if I'm saying something that's hateful, harmful, hurtful, right? Unless we're in, like you said, unless you're in a movie theater inciting some type of thing where it's going to cause someone to get crushed or, you know what I mean? It is open, because that is, quote unquote, what America's for.

[00:31:44] I think the social aspect of it now, where it becomes dangerous, is it's, it's so insidious and it, it spreads, it spreads faster, right? So I could be, you know, talking to my 10 friends, saying [00:32:00] something wild, crazy, out of my mouth, right? If I go on to a podcast, if I go on to, you know, social media and put it and it gets repost, reposted 10 times, or go on TikTok and it goes viral, right?

[00:32:12] Then someone could say, that's hate speech, they're targeting a separate group. But again, where's the line of free versus, because free speech is protected. I can say whatever I want, as long as I'm not harming, physically harming or whatever. So I just want to make sure, in the world that we're living in, free speech is fine.

[00:32:36] But when you're using it as factual information, which it's not. It just gives me the right to say what I want to say in the United States of America. It's not me saying I'm a professional, I've done my due diligence, I've gone to school, I have all the fact points, all the receipts, whatever you want to call it.

[00:32:55] And I think the line is so blurred at this point in time. [00:33:00] And whether it is nice things, hate speech, whatever, it's not news. It's not like, do you know what I'm saying? Well, I think you said 

[00:33:10] Beki: it before. I think you said it before when you said how much of it is gossip. Yes. And if it is couched as gossip, I mean, I probably won't follow it, but it just gets boring to me quickly.

[00:33:24] That said, then people know how to understand it. I'm sitting here thinking, when I think about hate speech in particular, I do think that there are maybe some states that have passed some hate speech legislation, and that's not the only place where there's other action that could potentially be taken in these situations.

[00:33:51] I mean, yeah, we have the freedom to say whatever we want to say, which creates a consequence, which we [00:34:00] also get to freely receive. I mean, if we defame someone, if we slander someone, then there's a legal consequence to something like that. I'm not saying it's easy to get someone convicted based on that kind of thing.

[00:34:16] I actually don't know. See, this is that whole thing where I say I am not an expert, but it comes back to this place of, it's not just freedom of speech. That's the bell that we ring, because that First Amendment is a powerful piece of what we really believe is important. This idea of freedom of religion.

[00:34:42] Yes. Freedom of speech, things like that, that are really an identity piece within the United States. Reasonable people can disagree. Right? Like back to Shannon, where you were saying, I could see both sides. [00:35:00] I can see both sides as well. It's one of the things that I actually enjoy the most in my life is being able to poke the bear, be it my friends or family, by playing devil's advocate and choosing a side I don't believe just to see what people will say.

[00:35:16] And. I have the right to do that. I can criticize a president if I choose to do that. What I can't do is make comments about what he's done that aren't true. Um, demonstrably not true so that, you know, he ends up getting into trouble because I actually would potentially. And so there's this layering of all of our legal system, which admittedly.

[00:35:47] is having a bit of an identity crisis right now,

[00:35:50] Richard: but we 

[00:35:51] Beki: really do need to think about the distinctions in these things. 

[00:35:55] Shannon: Listen, I'm trying not to be the cynical person [00:36:00] that thinks that eventually the righteous and the right thing and everything will prevail. 

[00:36:08] Richard: That's cynical. But 

[00:36:09] Shannon: just looking at trends.

[00:36:11] That's the opposite, right? You 

[00:36:12] Richard: know, that's hopeful. Just 

[00:36:13] Shannon: look at, no, no, no, 

[00:36:15] Richard: that's, 

[00:36:15] Shannon: no, I'm saying, I would love to, that was, that was, that was, that was the groundwork.

[00:36:22] But realistically, I'm just looking at where we are and where we're going. And it doesn't seem like we're going to get to the rainbow, or the pot of gold under the rainbow. It looks like we are just going left

[00:36:38] Beki: or right. 

[00:36:40] Shannon: I was going to I was not even going to say that, but I'm just 

[00:36:47] Beki: here to help.

[00:36:48] Honestly, at this point, 

[00:36:51] Shannon: I don't even think that exists. I think we are just in a messy, convoluted [00:37:00] mess. And I think that the United States of America, because I'm only going to speak of that, is a self-serving, selfish nation, and we are self-preserving to, to, to a fault. And looking at TikTok, right? And any other social media platform, right?

[00:37:25] If it's causing issues of misinformation and disinformation, instead of fixing that, we just weaponize it and make it worse. 

[00:37:34] Richard: I interviewed somebody and they said that facts are the information that's presented, and truth is how you interpret those facts. If you think about it and information in that sense, that means different people can get the exact same information and have separate truths from it.

[00:37:51] Shannon: We were just talking about this prior to recording and I'm like, yes, that's my point. 

[00:37:56] Richard: Yeah, so it's so many different angles. I think when [00:38:00] it comes to social media, the thing that scares me most now, because I think one of the biggest problems with social media is the same biggest problem with AI, and a lot the same thing that's wrong with our, our United States government, in my opinion, is that the people that are in office, most of them are super out of touch with what's going on.

[00:38:22] And they're, they're out of the generation. They don't, they don't know what's going on. They don't understand the technology, from seeing Mark Zuckerberg being on trial and the questions that they asked him, not understanding the way Wi-Fi works, to, to all of those other tech trials that were going on.

[00:38:38] It's clear to see that we have a government that doesn't necessarily understand technology, or at least most of them don't. And technology moves incredibly fast. So that's one of the things with. Facebook and Instagram and all the social media and the internet to a certain point, it's moved so fast that we haven't been able to put regulations in [00:39:00] place to control what's happening.

[00:39:01] It's moving faster than the regulations are moving. So the thing that really scares me right now is that it's moving so fast that it's basically, it's basic things that we still haven't figured out yet. Like how to make sure that misinformation doesn't spread at light speed. Uh, how to keep, uh, you even think about how some people say that they watched videos on YouTube and it radicalized them to different causes.

[00:39:29] So it's a lot of that going on and we haven't figured it out. I think that the scariest thing is instead of trying to figure that out, it seems like we have so much consolidation. We have Mark Zuckerberg's Meta, who started out with Facebook, which was the biggest thing, and then they bought Instagram, and then they bought WhatsApp, and then they were allegedly lobbying against TikTok, because I guess TikTok was competition, and they don't want competition, they want the whole market.

[00:39:58] And then you think about the [00:40:00] other social media platforms that we have, which have become the bastion of news and free speech, right? AOC put up a TikTok slash Instagram reel where she was talking about, do you see how things are moving? So Meta controls so much, Elon controls some, and actually Donald Trump controls some of it too.

[00:40:23] You have all of those people together controlling that information stream and the way that it's presented to people. When we think about back to that truth statement. If you feed people a certain amount of things in a certain kind of order and don't show some stuff, show certain stuff, then that truth is going to start to become a reality.

[00:40:42] Yeah. 

[00:40:42] Richard: I think that is the biggest thing that scares me right now, is that we have potential bad actors who could manipulate, who may already be manipulating, uh, the algorithms to feed information to certain people. And then I'll throw on, on top of that, [00:41:00] thinking about that Facebook experiment where they were feeding you content to try to control your mood, and they got caught doing that.

[00:41:07] And that big payout. So I think we're in great hands guys. 

[00:41:10] Beki: Well, here's the thing that I would say about that. And I'm not known necessarily for being overly optimistic. So this is a bit out of character. I, I hear you say that there are bad actors who are doing that, potentially, potentially, allegedly, and I think that, I actually believe that that's true.

[00:41:32] I think that there are bad actors who are out there. I also think that there are people who are unconsciously doing that. Perhaps not some of them that you have named, but they are people who are out there just posting what they think. That's still the facts as they know them, which then become the truth based upon how someone has interpreted that information.

[00:41:57] And so we all have our own little filters that we run [00:42:00] things through, right? Just based on our experiences, all the things that we talk about on this podcast, right? Like, this is my perspective, this is why my perspective exists. So it doesn't even require, and that's what actually concerns me more, it doesn't require a bad actor, because really, so much of that we do ourselves, because it's how we interpret those facts.

[00:42:25] And when we don't have all of the facts, frankly, it's challenging to really get to anything close to what I'll call a common truth, because the three of us having this conversation could take, okay, very different experiences away from having this conversation. There's still a common truth. There's what happened during the conversation.

[00:42:49] Shannon: Yeah. I just want to say, cause I feel like, in my mind, I love social media. Hey, I got all of them at this point. [00:43:00] Mostly, mostly all of them. And while I can see the benefits of using them, right, from a, from a, from a selfish standpoint, cause selfishly, I like to connect. I like to communicate. I like to see what's going on in my friends' lives across the country.

[00:43:17] Like, I like those things, right? There are positives to having platforms such as a TikTok, such as Instagram and Facebook, right? There are benefits. But to what you said, Richard, yes, it is concerning. I think it's more concerning. This is just me and I feel like I'm going to sound very old, but it's concerning for the generation under millennials.

[00:43:42] Right? The ones that literally live and breathe for social and are in the world. Like you said, Richard, everything's moving fast. They've never had like, everything is this quick, quick, quick, quick, quick. Right. [00:44:00] And so I am concerned for them because if this is their main source of information and whatever they're shown is true.

[00:44:10] And I know some young kids that will yell at me and have arguments about a TikTok video and do no other research outside of that. Those are the individuals that I'm concerned with because your mind is being molded based off of one platform. Or two platforms, whatever the case may be, but nothing outside of that world.

[00:44:30] And those are the ones where I would be concerned of what you're saying. Uh, there is no regulation. So whatever information they're getting fed, that's their truth. That's their fact. That's their news. That's their everything. Versus someone, and I'm going to speak about myself, cause I can only speak for myself.

[00:44:48] I use social media for, I do follow news outlets, but more so it's a social thing for me. It's, I have a balance. Not everybody has a balance. And I think that's where the [00:45:00] concern for me specifically comes in. And for those that don't have balances and go around this world, thinking that what they see is fact without any type of background, those are the ones that I'm concerned about because that, if that's what the future is going to be, I'm very much concerned.

[00:45:17] Beki: Yeah, and it makes me think I'm actually really hopeful that they would argue with me about the TikTok reel and not because I, I do love debate, but it's not just for that. It's actually because then that pushes them. To think about what they were looking at and I'm, I'm all full applause on something like that.

[00:45:41] So if they have people who are in their life or who they could add to their life, who would be willing to have that conversation with them, then it becomes something to discuss. I can't say that 

[00:45:54] Shannon: that happens all the time though. I will tell you they were like, you're stupid. You're stupid and you don't know what you're talking about.

[00:45:59] Beki: Yeah. [00:46:00] Yeah. Yeah. Well, and also it becomes, it just reinforces their belief in some instances as well. And please understand, I don't debate because I think you're going to agree with me. Right. Right? I debate because I think it's an interesting thought exercise and I want you to think about it. It's not necessarily that I even need or want you to agree with me.

[00:46:24] I, I am perfectly happy with people having a different point of view. And I would also say that there's stuff on social media that's actually to the degree it can be true. Yeah. It's not just this evil place where all things are false and no one should believe anything because if that were true, it could be gone.

[00:46:46] It's, I think what we're really talking about is when you only get the algorithm feeding you a single source of truth and a single angle or lens on truth, that's [00:47:00] where the challenge becomes, you know, real, because you don't have the ability to see that that video was edited and eliminates a really salient point from that content.

[00:47:17] Richard: That was well said. Thanks. So what both of you were talking about is completely on point. I think the thing that really starts to blur the line even more now is AI. When you have a technology that can replicate the way that people look, the way that people talk, you can make fake images and put people in them that were never in those images.

[00:47:36] You can take their voice and you can make it sound directly like them. And the video stuff, it's getting really, really hard to distinguish what's real and what's fake. And I think that muddies the water even more because, what's that old saying, believe half of what you see, none of what you hear, it's becoming way more true now because I can go in Photoshop and make something [00:48:00] that looks real.

[00:48:00] And it's something that never even existed, it just existed in my mind, but it looks photorealistic. I think that is scary when you start to see, because truth, what's truth? Now I think everybody has plausible deniability. Oh, that video was doctored, I was never there. Like, how do you, how do you compete with that?

[00:48:21] Beki: I agree with that. And I think that it, it really comes back to, I need, I need more, right? And, and I'm, I'm subject to analysis paralysis, and I, I know this about myself. But I, I don't, I don't know of much that I would say that I am 100 percent convinced of. I try not to use words like never and always, because it's just too hard to get me to believe to that degree.

[00:48:55] There are a couple of things that I'm pretty polarized on, but [00:49:00] I think that we have to be in a place where: does it make sense? And I agree that AI is a whole nother technology layered onto social media that creates an even harder thing for us to legislate. But I do think that that means that our critical thinking skills just need to be sharpened up.

[00:49:21] Just that little bit more. 

[00:49:22] Shannon: Yeah. I think it all comes down to balance. I think as we move forward, there's going to be more innovation and there's going to be more technology that comes. And it's not to say that all social media is going to get bad or worse, but it's really just to say there needs to be a balance, right?

[00:49:38] This, you can't utilize one source as your main source of information, or misinformation or disinformation, right? There needs to be, like you said, some critical thinking and some outside resources that you use in order to gather information, right, and understand the [00:50:00] world in general, not just one platform.

[00:50:03] Um, but it is a great tool, but just understand it's always going to constantly be changing. So it's really having the balance and understanding how to utilize it in a way that works best for you.

[00:50:13] Richard: It's crazy because whenever something is said, I say, always be skeptical of things, like depending on who it was coming from, even from people that you trust. And I remember my dad would even give me money sometimes and I would count it.

[00:50:28] It's just like, you don't trust your dad's counting. No, it's just like, I need to check it for myself. And we walk around with these very expensive computers connected to all of the information in the world. At all times, we have it with us. And a lot of times, we won't even take the time to go and do a Google search.

[00:50:48] Like I said, with AI, it's getting a little bit harder, because Google's gonna have some fake images on there too. AI is starting to be more prolific. But I think that is one step. You [00:51:00] can't always do your research. But I think what you said, Shannon, was really on point. There's a news app that I like to use.

[00:51:07] And what it does is it gives you a centrist view of what the topic is, and then you can scroll over it to give you what the left is saying about it. And then you can scroll over and you can see what the right is saying. And the craziest thing is, being in marketing, I understand this, but I don't know if everybody does.

[00:51:24] If you change some wording and you change some perspective, you can make it say something completely different. It's going back to that, that truth and facts thing. You can make the facts sound completely different if you say them in a certain way; it's the way that things are presented. I could present something in one way.

[00:51:41] And a lot of times when you do that, you're trying to lead somebody to a certain conclusion, and that's how we do it in marketing. And if you don't believe that the news is marketing to certain people and trying to get you to believe certain things, hate to tell you, it's happening.

[00:51:57] Beki: Yeah. Yeah. [00:52:00] See, also, critical thinking.

[00:52:01] Richard: Yeah, critical thinking is important. But, and my fear now is that we're not teaching it the way that we should. Yeah. Not teaching the kids critical thinking skills. 

[00:52:11] Beki: Yeah, that gets us into a whole nother path. That's another path. That's another... And that's another story.

[00:52:20] Richard: Thanks for joining us on this journey.

[00:52:22] Remember, the world looks different through every lens. Before you go, we'd love to hear from you. What's a moment in your life that changed how you see the world? Use the link in the show notes to share your story with us. Your voice helps us grow, and your stories keep the conversation going. I'm Richard, and on behalf of myself, Beki, and Shannon, thanks for listening.

[00:52:47] Until next time.


Podcasts we love

Check out these other fine podcasts recommended by us, not an algorithm.

Short Story Long (Beki Fraser)

Still Talking Black (Richard Dodds)