Dear Class,
Please choose ONE of the following two articles, with the corresponding ethical theory designated, to write your Part B Evaluate Business News Ethics Item paper, due Jan 5, 2023.
FOR EXAMPLE, the first article is assigned the Utilitarianism theory. You would therefore refresh your memory on that theory from your readings and videos, then use that theory ONLY and discuss what may have happened if the company had applied it in resolving the decision/situation described in the article. For example, if TikTok had applied Utilitarianism when deciding to use algorithms that promote self-harm content, would the end result still be the same? Why or why not?
Also, make sure to read the instructions for this assignment that are included within the assignment itself in your Module 2.
Report: TikTok Boosts Posts About Eating Disorders, Suicide
Associated Press. “Report: TikTok Boosts Posts About Eating Disorders, Suicide.” Local News 8, 14 Dec. 2022, https://localnews8.com/news/2022/12/14/report-tiktok-boosts-posts-about-eating-disorders-suicide/.
TikTok’s algorithms are promoting videos about self-harm and eating disorders to vulnerable teens, according to a report published Wednesday that highlights concerns about social media and its impact on youth mental health.
Researchers at the nonprofit Center for Countering Digital Hate created TikTok accounts for fictional teen personas in the U.S., United Kingdom, Canada and Australia. The researchers operating the accounts then “liked” videos about self-harm and eating disorders to see how TikTok’s algorithm would respond.
Within minutes, the wildly popular platform was recommending videos about losing weight and self-harm, including ones featuring pictures of models and idealized body types, images of razor blades and discussions of suicide.
When the researchers created accounts with user names that suggested a particular vulnerability to eating disorders — names that included the words “lose weight” for example — the accounts were fed even more harmful content.
“It’s like being stuck in a hall of distorted mirrors where you’re constantly being told you’re ugly, you’re not good enough, maybe you should kill yourself,” said the center’s CEO Imran Ahmed, whose organization has offices in the U.S. and U.K. “It is literally pumping the most dangerous possible messages to young people.”
Social media algorithms work by identifying topics and content of interest to a user, who is then sent more of the same as a way to maximize their time on the site. But social media critics say the same algorithms that promote content about a particular sports team, hobby or dance craze can send users down a rabbit hole of harmful content.
It’s a particular problem for teens and children, who tend to spend more time online and are more vulnerable to bullying, peer pressure or negative content about eating disorders or suicide, according to Josh Golin, executive director of Fairplay, a nonprofit that supports greater online protections for children.
He added that TikTok is not the only platform failing to protect young users from harmful content and aggressive data collection.
“All of these harms are linked to the business model,” Golin said. “It doesn’t make any difference what the social media platform is.”
In a statement from a company spokesperson, TikTok disputed the findings, noting that the researchers didn’t use the platform like typical users, and saying that the results were skewed as a result. The company also said a user’s account name shouldn’t affect the kind of content the user receives.
TikTok prohibits users who are younger than 13, and its official rules prohibit videos that encourage eating disorders or suicide. Users in the U.S. who search for content about eating disorders on TikTok receive a prompt offering mental health resources and contact information for the National Eating Disorder Association.
“We regularly consult with health experts, remove violations of our policies, and provide access to supportive resources for anyone in need,” said the statement from TikTok, which is owned by ByteDance Ltd., a Chinese company now based in Singapore.
Despite the platform’s efforts, researchers at the Center for Countering Digital Hate found that content about eating disorders had been viewed on TikTok billions of times. In some cases, researchers found, young TikTok users were using coded language about eating disorders in an effort to evade TikTok’s content moderation.
The sheer amount of harmful content being fed to teens on TikTok shows that self-regulation has failed, Ahmed said, adding that federal rules are needed to force platforms to do more to protect children.
Ahmed noted that the version of TikTok offered to domestic Chinese audiences is designed to promote content about math and science to young users, and limits how long 13- and 14-year-olds can be on the site each day.
A proposal before Congress would impose new rules limiting the data that social media platforms can collect regarding young users and create a new office within the Federal Trade Commission focused on protecting young social media users’ privacy.
One of the bill’s sponsors, Sen. Edward Markey, D-Mass., said Wednesday that he’s optimistic lawmakers from both parties can agree on the need for tougher regulations on how platforms are accessing and using the information of young users.
“Data is the raw material that big tech uses to track, to manipulate, and to traumatize young people in our country every single day,” Markey said.
***Analyze the above TikTok article using Utilitarianism Theory
Facebook Parent’s Oversight Board Criticizes ‘Cross Check’ Program That Protects VIP Users
New report faults Meta Platforms for giving some users unfair deference and misleading the public
Meta Platforms Inc. has long given unfair deference to VIP users of its Facebook and Instagram services under a program called “cross check” and has misled the public about the program, the company’s oversight board concluded in a report issued Tuesday.
The report offers the most detailed review to date of cross check, which Meta has billed as a quality-control effort to prevent moderation errors on content of heightened public interest. The oversight board took up the issue more than a year ago in the wake of a Wall Street Journal article based on internal documents that showed that cross check was plagued by favoritism, mismanagement and understaffing.
The board’s report doesn’t take issue with the value of a secondary review system for moderating posts from high-profile or sensitive accounts. But the board said the program in practice has put Meta’s business interests over the program’s stated goals of protecting public discourse, and it noted that the highest levels of protection are generally reserved for accounts that might stir up trouble for Meta’s senior leadership.
“Cross check is currently neither designed nor implemented in a manner that meets Meta’s human rights responsibilities and company values,” the report said. “Despite significant public concern about the program, Meta has not effectively addressed problematic components of its system.”
In response to the report, Meta noted that it had already made some changes in line with the board’s recommendations and that it had sought the board’s advice on additional improvements. The company has agreed to provide a more robust response within 90 days.
The Journal article last year found that under the cross check program, also known as “XCheck,” some VIP users were “whitelisted,” meaning they were exempted from some or all penalties for violating platform rules. Other accounts were given “remediation windows” in which they could remove violating posts without penalty or received watered-down punishments for misconduct.
In total, more than five million accounts received some level of enforcement protections, and because of understaffing, allegations of misconduct by those accounts were routinely never reviewed, the Journal reported.
“We are not actually doing what we say we do publicly,” an internal review of the program stated in 2019, the Journal reported last year. That review declared the company’s actions “not publicly defensible.”
Meta executives last year said criticism of cross check was fair but that the company was working to address issues with the program and that much of the material cited by the Journal was outdated.
“There’s no news here, despite how much the WSJ wants to make it seem like there is,” the company’s vice president of integrity wrote on Twitter at the time.
Still, Meta asked the oversight board to review cross check last October and committed to addressing the group’s questions. The board is a group of professors, lawyers, human-rights activists and others from around the world that Facebook created to evaluate cases that have the potential to guide future content decisions and rules. Meta says the board’s content decisions are binding, but it isn’t obligated to follow its recommendations more generally.
The board’s report doesn’t discuss the entities that received protection in specific terms. In an interview on Monday, oversight board member and British journalist Alan Rusbridger said Meta had declined to provide that information.
“We were sort of operating in the dark without having a clear idea of who was in the program and who wasn’t,” he said, noting that Meta acknowledged there were 5 million accounts. “I personally felt they could have gone further.”
The oversight board credited Meta with making improvements to the program since its details were revealed last year. During the period of its review, the board noted, Meta committed to performing annual reviews of cross check, created a version that wasn’t exclusive to public figures and published clearer rules for how protections are added and removed.
But the board’s opinion faulted the company for the program’s continued understaffing, opacity and unfairness.
“Meta has repeatedly told the board and the public that the same set of policies apply to all users. Such statements and the public-facing content policies are misleading,” the report states.
Because the company takes five days on average to review content from accounts protected by cross check, the board noted, the program fails to stop harmful content from going viral in a relevant time frame. In some instances, the review process takes weeks or even months.
“Delaying any enforcement while taking weeks to reach a decision results in functionally exempting entitled entities from the rules,” the report stated.
The oversight board urged Meta to make 32 changes to the program. One suggested improvement would be to separate the process of granting protections on public interest grounds from the process of granting protections to Meta advertisers. Another would be to prevent likely violating content from high-profile accounts from going viral before Meta reviews it. A third would be to insulate the program from the influence of Meta’s public policy team and other executives.
Mr. Rusbridger said he doesn’t expect Meta to accept all of the board’s recommendations and considers the current program’s emphasis on heading off reputation-damaging moderation errors to be understandable. But given the board’s stated purpose of holding Meta accountable to its own values, he said, a failure to significantly alter cross check would be worrying.
Horwitz, Jeff. “Facebook Parent’s Oversight Board Criticizes “Cross Check” Program That Protects VIP Users.” WSJ, 6 Dec. 2022, www.wsj.com/articles/facebook-parents-oversight-board-criticizes-cross-check-program-that-protects-vip-users-11670324402?reflink=desktopwebshare_permalink. Accessed 17 Dec. 2022.
***Analyze the above Meta article using Rawls’ Justice Theory