A Parent’s Guide to ‘Sadfishing’: What You Need to Know
I have three teens: two daughters (13 and 15) and a son (17). And if you’re in a similar boat, you’ve probably experienced some drama from time to time.
TikTok: the short-form video app built on the backs of dancing, lip-syncing teenagers that has stirred up Congress and, unsurprisingly, social media itself.
Teens, quite understandably, love the app. Pew Research estimates that 63% of American teenagers use it regularly. But that poses a bit of a problem for parents. Should you allow your kids to download the app and, thus, allow it to potentially track their data? Should you block them from using it in case they accidentally come across inappropriate content?
I won’t tell you what to do: there are pros and cons with every form of social media, including TikTok. But if you do let your kids use the app, I strongly urge setting up some guidelines and restrictions. Enter Plugged In’s “Parental Controls: TikTok.”
In the tutorial, I break down everything you need to know about the app and its capabilities, demonstrate how to connect a parent account with a teen one, detail the different parental control options and explain the differences between a regular teenage account and a view-only under-13 account. So mosey on over to Plugged In to figure out exactly which TikTok parental controls are right for your family.
If you’re a parent—particularly if you’re a parent of a teenager—you’ve probably heard about the impact of social media on mental health. You may be familiar with the benefits or harms of allowing your child to have a social media account. And you’ve likely made decisions about whether or not your family will use social media and, if so, you might’ve even thought about how, when and where you’ll use it.
But Surgeon General Vivek Murthy worries that parents may not have the complete picture. In an opinion piece for The New York Times, Dr. Murthy called for congressional legislation to “shield young people from online harassment, abuse and exploitation and from exposure to extreme violence and sexual content.” According to Adam Holz for Plugged In, “He further advocated waiting until high school before giving a child access to social media.”
Social media is such a touchy subject. Bring it up at your next Bible study and you’re likely to get a swarm of opinions ranging from “it’s totally harmless” to “it should be banned for everyone under 18.” You’ve probably heard some of the social media horror stories. You might even know someone who has been personally impacted by the effects of social media.
But will Dr. Murthy’s suggestion—to require a surgeon general’s warning label on social media platforms, stating that social media is associated with significant mental health harms for adolescents—scare people away?
“Probably not,” Holz wrote. “But the Surgeon General is hopeful that such a high-profile warning becomes another important component in our culture’s engagement with social media. And just as we saw happen with public sentiment regarding smoking and its health hazards, perhaps our contemporary conversation about social media’s hazardous effects will follow a similar trajectory.”
We have apps for nearly everything now: finding nearby businesses, scheduling appointments, answering emails, video-chatting, even brewing your morning coffee. You can find a mental health professional that way, too. And now, some experts are hoping to bring artificial intelligence into the mix.
Bob Hoose writes for Plugged In that the Substance Abuse and Mental Health Services Administration has declared that nearly 58 million Americans are affected by mental health issues. But even with the seeming ease of access mentioned above, the nonprofit Mental Health America estimates “that more than 28 million of those potentially struggling folks, young and old, don’t receive any kind of treatment at all. And in many cases, that’s because they live in federally designated ‘mental health shortage areas,’ where good docs are in short order.”
Researchers are now trying to bridge that gap. Dartmouth College’s “Therabot” is an AI chatbot capable of “compiling related health data; analyzing patient patterns; and then providing personalized advice or recommendations.” The goal, of course, is to extend mental health care to folks who might not otherwise seek it, whether because of finances, a lack of “good docs” or lingering stigmas surrounding mental health.
“Human doctors,” Hoose writes, “warn that AI still hasn’t gotten to the point where it can totally replace a flesh and blood caregiver.” After all, although we can program an AI to analyze information, we can’t program it to read nonverbal cues, empathize with a patient’s feelings or navigate more complex human emotions.
Your teenager may feel more at ease, and more anonymous, texting a chatbot. It may even help ease them into working with a human counselor. But for the reasons stated above, you should still encourage them to seek help from a human, not a robot. And, of course, you can also connect with a Christian counselor or mental health professional through Focus on the Family.
Have you ever unintentionally signed up for an email list, or bought something online only to discover it cost far more than the advertised price? You may have fallen victim to a dark pattern.
“Coined in 2010 by Dr. Harry Brignull,” writes Kennedy Unthank for Plugged In, “dark or deceptive patterns are ‘tricks used in websites and apps that make you do things you didn’t mean to, like buying or signing up for something.’” And a 2022 report by the Federal Trade Commission’s Bureau of Consumer Protection found that dark patterns are only becoming more common.
According to Dr. Jenny Radesky, an associate professor of pediatrics at the University of Michigan, children are more susceptible to these tactics due to underdeveloped self-control and decision-making skills, among other things. And the FTC has begun cracking down on companies employing deceptive patterns. (Unthank reports that Epic Games, maker of the popular video game Fortnite, was forced to pay $520 million in restitution after violating the Children’s Online Privacy Protection Act with dark pattern tactics.)
Savvy parents can learn about dark patterns and teach their kids to avoid them. Read more about these tactics at Plugged In’s website, but here are a few common ones, drawn from Brignull’s own catalog of deceptive patterns, to get you started:
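• Hidden costs: Fees that don’t appear until the final step of checkout, long after you’ve committed to the purchase.
• Sneaking: Extra items or subscriptions slipped into your online cart without your clear consent.
• Confirmshaming: Guilt-laden wording on an opt-out button (“No thanks, I don’t like saving money”) designed to shame you into saying yes.
• Forced continuity: A “free” trial that quietly rolls into a paid subscription unless you remember to cancel.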
Hopefully, this kickstarts your education about dark patterns and helps you build your own tactics for avoiding them. And as Unthank puts it, “just as a parent might teach their child how to interact with such salesmen, helping your child navigate these dark patterns is key to helping him or her become a screen-savvy individual.”