The kids aren't alright online


A person lies in bed in the dark, scrolling on a phone, their face obscured by the device.

In the weeks since two young men used semiautomatic weapons to murder children in Uvalde, Texas, and Black shoppers in Buffalo, New York, the media has rushed to identify and analyze the shooters' online behavior, looking for clues about their personalities and motives.

What we've learned so far is that the two alleged 18-year-old shooters turned to social media platforms and message boards as an outlet for their disturbed, conspiratorial thinking. The evidence should be the latest wake-up call that when young people who are emotionally or psychologically unmoored use digital tools in ways connected with dehumanization and violence, the consequences can be tragic. Most of the deadliest mass shootings of the past few years were committed by men who were 21 or younger and who documented aspects of their violent behavior online.

While it would be easy to focus only on the shooters' digital footprints, we also know that one assailant had an audience with a Discord chat group where he discussed getting shot while wearing body armor, and that the other interacted via social media with young women who were distressed by his behavior and found there was little recourse to stop it, or felt it was just a normal, if disturbing, part of online culture.

These dynamics suggest that too many young people, whether they're participants in or witnesses of repugnant behavior, feel utterly lost about what ethical behavior and respectful treatment look like on platforms that are often stripped of humanizing context. And even when they can personally live by these values, they may not know how to demand the same from their peers and the platforms that host them. They need and deserve thoughtful guidance about how to shape their online lives that goes far beyond protecting themselves from catfishing or bullying. They should learn how to recognize when digital tools and platforms are contributing to someone else's dehumanization, how to understand the mechanics that manipulate their worldview, how those experiences are connected to the broader culture, and how to talk about or draw attention to them.

The goal of such guidance isn't necessarily to prevent future mass shootings, but rather to help young people gradually shift online culture in meaningful ways toward inclusion and justice. This might sound idealistic, but it could steer some young people away from targeting others, online or off. Reaching young people at home and in their classrooms with sophisticated media literacy education that both their teachers and parents understand is an important first step.

Critical media studies scholar Whitney Phillips lays out this kind of approach in a forthcoming, untitled book about digital ethics written for younger teens, which will be published next year by MITeen Press and Candlewick. Phillips, who co-authored the guide with Ryan Milner, an internet studies scholar, says the guide focuses on wellness and mental health, and draws connections between how young people are feeling and what and how they share online.

She argues that there's a spectrum of digital behavior we need to engage with, "not just the really extreme articulations of violence." Fundamentally, online tools can "reduce people to fetishized objects" wherein they lose their humanity, and social media offers little incentive or space for an empathetic response.

Think, for example, of how quickly someone else's embarrassing moment can become a meme. Often that prompts laughter rather than reflection about whether the subject is an "actual person with feelings who might not want to be reduced to a digital punchline," says Phillips, an assistant professor in the School of Journalism and Communication at the University of Oregon. This dehumanization is only exacerbated when the person laughing holds disrespectful or extremist views about the person being humiliated.

Salvador Ramos, whom authorities say killed 19 children and two adults at Robb Elementary School, had a history of alluding to violence or making vague threats against young women he messaged. For Ramos, direct messaging apps turned young women into avatars at which he could hurl insults or threats.

Payton Gendron, the suspect accused of attacking a Buffalo grocery store frequented by Black shoppers, livestreamed the killings. In personal writings he initially kept private on the instant messaging platform Discord, he also said he didn't consider himself racist until he immersed himself in 4chan, an anonymous message board known for trolling and for promoting racist and antisemitic memes. Shortly before Gendron shot and killed 10 Black people, he shared that diary with certain contacts. For Gendron, Black people weren't human beings but characters featured in a conspiracy theory about "replacing" white people.

While misogyny and racism exist first and foremost offline, digital tools have made it far easier to engage with the violence of these ideas, find like-minded communities who share those views, and target vulnerable individuals online or offline with aggressive or violent behavior. 

"You have an online space that makes it so much easier to streamline" these beliefs, says Phillips.

But she also argues that we tend to focus on extreme cases without examining problematic online behavior that anyone, teens and adults alike, might engage in because they have no intention of harming others, because they're just having fun, or because the content seems lighthearted or ephemeral. Phillips says that culture encompasses Twitter pile-ons and digital blackface, for example. The casual vilification of Amber Heard in memes, TikToks, and other types of digital commentary is another instance.

To counter such dehumanization, Phillips recommends talking to young people about what she describes as "the winds of social media." She defines these as affordances, or the tools and functions that allow us to manipulate and share information; the attention economy; algorithms; and assumptions about information, like the false notion that correcting misinformation makes it go away. These unseen forces affect our online experiences — and conclusions about the world that people draw as a result. 

Affordances, which include functions like retweeting, commenting, and meme creation, are part of what makes the internet fun, but they also make it easy to decontextualize just about anything. In turn, people consuming that content may not grasp the consequences of sharing it; they might not even believe they should be concerned about related ethical issues, because they haven't set out to hurt anyone. They also might not fully understand what they're encountering, which can trigger reactions like rage, alienation, or confirmation of personal biases. When misogynist and racist memes elicit cheers in the echo chamber of an anonymous messaging board, that amplifies and reinforces the conviction that such views are widely held or acceptable, particularly when the content goes unaddressed by the platform or authority figures, says Phillips. 

She says that helping young people open up their "narrative aperture" means teaching them about how digital communication tools and platforms incentivize provocation and polarization, pull them toward certain themes and accounts, and otherwise influence consumption choices that may seem like they're freely made but are instead guided by invisible forces.

Phillips believes that everyone, including young people, needs the skills to understand their online environment through an ethical framework. This means actively considering the relationship between themselves, technological tools, and others. Right now, Phillips argues, people view slices of a full picture as if they were the whole story and lack critical awareness of where they are or what's happening around them. This is disorienting for adults, but imagine how bewildering it can be for teens who experience such environments as normal.

Ideally, the burden wouldn't be on young people and their caregivers or teachers to navigate digital ecosystems that aren't ethically designed. But in the absence of successfully holding companies accountable for building products that incentivize dehumanization, we can't pretend that kids are alright online. They shouldn't expect to shrug off violent threats, hateful screeds, or more mundane acts of humiliation as typical.   

The sooner they hear that from trusted adults, the faster we can attempt to shift online — and importantly, offline — culture toward something more humane. 

If you want to talk to someone about how disturbing online content or mass shootings are affecting you, Crisis Text Line provides free, confidential support 24/7. Text CRISIS to 741741 to be connected to a crisis counselor. Contact the NAMI HelpLine at 1-800-950-NAMI, Monday through Friday from 10:00 a.m. – 10:00 p.m. ET, or email info@nami.org. You can also call the National Suicide Prevention Lifeline at 1-800-273-8255.


