Information Technology

Facebook hosts creators workshop in Lagos

On Thursday, July 18, 2024, Facebook hosted its first-ever creator workshop in Lagos, Nigeria. The event brought together over 20 Facebook creators to reinforce the app's commitment to the region's creative community. The workshop equipped creators with the knowledge and resources needed to earn from the engaging content they share, whether it's a Reel or a longer video.

Facebook creators who attended the workshop included Mark Angel, a comedian, Facebook content creator and video producer with over 22 million followers; Taaooma, a comedian who has amassed a large following with her unique brand of humour; Crazeclown, an actor, content creator, doctor and comedian; Kie Kie, a content creator, brand influencer and actor; Aproko Doctor, a health influencer and actor; Gina Ehikodi, a food content creator and TV host; and Chukwuebuka Emmanuel (Brain Jotter), an award-winning Nigerian comedian.

Other notable creators in attendance were MC Shem, a comedian; Oluwadolarz, a comedian; Kenzy Udosen, a digital creator known for his iconic Madam Theresa persona; Omoye Isabota, an award-winning food blogger; Chivera Media, a video creator; Juliet Godwin, artist, community manager and principal admin of the Naija Graphics Designer FB community; and David Obi, founder of the Yorochi Facebook Community.

Meta recently announced the availability of two new monetisation features on Facebook (In-Stream Ads and Facebook Ads on Reels) for eligible creators in Nigeria, enabling them to earn money by crafting original videos and cultivating community.

#FacebookCreatorsNG

Photo captions from the event:

Dr. Chinonso Egemba, Aproko Doctor, a health influencer and actor.
Enitan Denloye, Regional Director, Africa; Kezia Anim-Addo, Communications Director, Africa, Middle East & Turkey (AMET); Oluwasola Obagbemi, Corporate Communications Manager for Sub-Saharan Africa; Betty Ansah, Product Communications Manager (EMEA); Marie Cubeta, Communications Manager (Facebook); and Rof Maneta, Strategic Partner Manager, Global Partnerships, Sub-Saharan Africa.
Gina Ehikodi, Foodies&Spice, a food content creator and TV host.
Oluwasola Obagbemi, Corporate Communications Manager for Sub-Saharan Africa.
Enitan Denloye, Regional Director, Africa.
Betty Ansah, Product Communications Manager (EMEA).
Bukunmi Adeaga-Ilori, Kie Kie, a content creator, brand influencer and actor.
Rof Maneta, Strategic Partner Manager, Global Partnerships, Sub-Saharan Africa.
Abiola Abdulgafar, Cute Abiola, a prominent Nigerian comedian and actor.
On screen: Noman Ali, Product Manager (Facebook).
David Obi, founder of the Yorochi Facebook Community, and Dr. Emmanuel Ogonna Iwueke, Crazeclown, a medical doctor, content creator, comedian and actor.
Juliet Godwin, artist, community manager and principal admin of the Naija Graphics Designer FB community.
Maryam Apaokagi, Taaooma, an award-winning Nigerian comedian and content creator.
Ayodele Aguda, MC Shem, a comedian, and Mark Angel, the most-followed Nigerian comedy content creator on Facebook and a video producer.
Chukwuebuka Emmanuel, Brain Jotter, an award-winning Nigerian comedian, and Dr. Emmanuel Ogonna Iwueke, Crazeclown, an actor, content creator, doctor and comedian.
Bukunmi Adeaga-Ilori, Kie Kie, a content creator, brand influencer and actor, and Moji Delano, an online media aficionado, communications strategist and entrepreneur.
Omoye Isabota, Omoye Cooks, an award-winning food blogger, and ChiVera Obiajulu, Chivera Media, a video creator.
Oladapo Adewunmi, founder of Apollo Endeavor Limited, a creative director, movie producer and talent/content agent, and Halimat Olatunji, Regional Manager, Social Media and Influencer Marketing at Jumia Group.
Dr. Chinonso Egemba, Aproko Doctor, a health influencer and actor, and Ekene Mfoniso Nna-Udosen, Kenzy Udosen, a digital creator.
Ogunleye Olamide Babatunde, Oluwadolarz, a comedian; Ayodele Aguda, MC Shem, a comedian; and Mark Angel, the most-followed Nigerian comedy content creator on Facebook and a video producer.


New Instagram campaign to raise awareness and help protect teens from sextortion scams

Takeaways

● Instagram is announcing measures to further protect people from sextortion, including hiding follower and following lists from potential sextortion scammers, preventing screenshots of certain images in DMs, and rolling out our nudity protection feature globally.
● These updates, part of a campaign informed by NCMEC, Thorn and Childnet, also aim to help parents feel better equipped to support their teens in avoiding these scams.

Sextortion is a horrific crime in which financially driven scammers target teens and young adults around the world, threatening to expose their intimate imagery if they don't get what they want. Today, we're announcing new measures in our fight against these criminals, including new safety features that build on the protections already in place to help prevent sextortion on our apps.

New safety features to disrupt sextortion

Meta is announcing a range of new safety features designed to further protect people from sextortion and make it even harder for sextortion criminals to succeed.

First, we're making it harder for accounts showing signals of potentially scammy behavior to request to follow teens. Depending on the strength of these signals – which include how new an account is – we'll either block the follow request completely or send it to the teen's spam folder.

Sextortion scammers often use their targets' follower and following lists to try to blackmail them. Now, accounts we detect as showing signals of scammy behavior won't be able to see people's follower or following lists, removing their ability to exploit this feature. These potential sextorters also won't be able to see lists of accounts that have liked someone's posts, photos they've been tagged in, or other accounts that have been tagged in their photos.
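As a rough sketch, the tiered handling of follow requests described above (block outright when signals are strong, route to the spam folder when they are weaker) might look like the following. Every signal name, weight and threshold here is invented for illustration; the article only confirms that account age is one of the signals, and Meta has not published its actual scoring.

```python
from dataclasses import dataclass

# Hypothetical signal model: these fields and thresholds do not come from
# Meta; they only illustrate the tiered response described in the text.
@dataclass
class AccountSignals:
    account_age_days: int
    reports_received: int
    follows_sent_last_hour: int

def score(signals: AccountSignals) -> int:
    """Crude additive risk score over a few illustrative signals."""
    s = 0
    if signals.account_age_days < 7:       # very new account
        s += 2
    if signals.reports_received > 0:       # already reported by others
        s += 3
    if signals.follows_sent_last_hour > 50:  # mass-following behavior
        s += 2
    return s

def triage_follow_request(signals: AccountSignals, target_is_teen: bool) -> str:
    """Return 'deliver', 'spam_folder', or 'block' for a follow request."""
    if not target_is_teen:
        return "deliver"
    s = score(signals)
    if s >= 5:
        return "block"        # strong signals: drop the request entirely
    if s >= 2:
        return "spam_folder"  # weaker signals: route to the teen's spam folder
    return "deliver"
```

A production system would of course learn such weights from data rather than hard-code them; the point is only the two-tier outcome the text describes.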

Soon, we’ll no longer allow people to use their device to directly screenshot or screen record ephemeral images or videos sent in messages. This means that if someone sends a photo or video in Instagram DM or Messenger using our ‘view once’ or ‘allow replay’ feature, they don’t need to worry about it being screenshotted or recorded in-app without their consent. We also won’t allow people to open ‘view once’ or ‘allow replay’ images or videos on Instagram web, to avoid them circumventing screenshot prevention.

We're constantly working to improve the techniques we use to identify scammers, remove their accounts and stop them from coming back. When our experts observe patterns across sextortion attempts, such as commonalities between scammers' profiles, we train our technology to recognize those patterns. This allows us to quickly find and take action against sextortion accounts, and to make significant progress in detecting both new and returning scammers. We're also sharing aspects of these patterns with the Tech Coalition's Lantern program, so that other companies can investigate their use on their own platforms.

Finally, after first announcing the test in April, we're now rolling out our nudity protection feature globally in Instagram DMs. The feature, which will be enabled by default for teens under 18, blurs images that we detect contain nudity when they are sent or received in Instagram DMs, and warns people of the risks associated with sending sensitive images. We've also worked with Larry Magid at ConnectSafely to develop a video for parents, available on the Meta Family Center's Stop Sextortion page, that explains how the feature works.

This campaign and these new safety features are in addition to our recent announcement of Teen Accounts, which gives tens of millions of teens built-in protections that limit who can contact them, the content they see and how much time they spend online. Teens under 16 aren't able to change Teen Account settings without a parent's permission. With Instagram Teen Accounts, teens under 18 are defaulted into stricter message settings, which means they can't be messaged by anyone they don't follow or aren't connected to. In the EU, we will start placing teens into Teen Accounts later this year; in the rest of the world, Teen Accounts will be available from January.

Taking action against sextortion criminals

Last week, we removed around 1,600 Facebook Groups and accounts affiliated with Yahoo Boys that were attempting to organize, recruit and train new scammers. This comes after we announced in July that we'd removed around 7,200 Facebook assets engaging in similar behavior. Yahoo Boys are banned under Meta's Dangerous Organizations and Individuals policy, one of our strictest policies, which means we remove Yahoo Boys' accounts engaged in this criminal activity whenever we become aware of them. While we've been removing violating Yahoo Boys accounts for years, we're putting new processes in place that will allow us to identify and remove these accounts more quickly.

We'll continue to evolve our defenses to help protect our community from sextortion criminals. This includes helping teens and their families recognize these scams early, preventing potential scammers from reaching their targets, and working with our peers to fight these criminals across all the apps they use.

Helping Teens Avoid Sextortion Scams

Takeaways:

● Meta has worked with the National Center for Missing & Exploited Children (NCMEC) to expand Take It Down to more countries and languages, allowing millions more teens to take control of their intimate imagery.
● Meta has also partnered with Thorn to update our Stop Sextortion hub, offering new tips and resources for teens, parents and teachers on how to prevent and handle sextortion.
● Meta is supporting safety organizations and creators around the world to help raise awareness of sextortion scams and what teens and parents can do to take back control.

Having a personal intimate image shared with others can be devastating, especially for young people. It can feel even worse when someone threatens to share it if you don't give them more photos, sexual contact or money — a crime known as sextortion. That's why, this Safer Internet Day, we're announcing new efforts to help combat this kind of criminal activity. These include giving more teens control over their intimate images, helping teens — and their parents and teachers — feel better equipped against those trying to exploit them, and supporting creators and safety organizations around the world as part of a global campaign to raise awareness of sextortion.

Expanding Take It Down to More Languages and Countries

Take It Down is a program from NCMEC, supported by Meta, designed to help teens take back control of their intimate images and help prevent people — whether scammers, ex-partners or anyone else — from spreading them online. First launched last year in English and Spanish, the platform is now being expanded by Meta and NCMEC to many more countries and languages, making it accessible to millions more teens around the world.

There are several ways people can use Take It Down to find and remove intimate imagery, or to help prevent people from sharing it in the first place:

● Young people under 18 who are worried their content has been, or may be, posted online
● Parents or trusted adults acting on behalf of a young person
● Adults who are concerned about images taken of them when they were under 18

Take It Down was designed to respect young people's privacy and data security. To start the process, people can go to TakeItDown.NCMEC.org and follow the instructions to assign a unique hash — a digital fingerprint in the form of a numerical code — to their image or video, privately and securely from their own device. Teens only need to submit the hash, rather than the intimate image or video itself, which never leaves their device.
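The on-device step described above (hash locally, then submit only the hash) can be sketched as follows. This is a simplified illustration using a standard cryptographic hash (SHA-256), which only matches byte-identical copies; the article does not specify Take It Down's actual hashing scheme, and real image-matching systems typically rely on more robust perceptual hashing. The function names are illustrative, not the service's API.

```python
import hashlib

def hash_image(path: str) -> str:
    """Compute a SHA-256 digest of a file locally.

    Only the resulting hex string would ever be submitted; the image
    itself never leaves the device.
    """
    h = hashlib.sha256()
    with open(path, "rb") as f:
        # Read in chunks so large videos don't need to fit in memory.
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def matches(submitted_hashes: set[str], upload_path: str) -> bool:
    """Platform-side check: does an uploaded file match a submitted hash?"""
    return hash_image(upload_path) in submitted_hashes
```

A participating platform could hash each newly uploaded file and compare it against the set of submitted hashes, removing matching content without ever having received the original image from the teen.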
Once the hash has been submitted to NCMEC, companies like Meta can find copies of the image, take them down and help prevent anyone who's threatening the teen from posting them in the future.

"Making Take It Down available in 25 languages is a pivotal step towards safeguarding children from the horrors of online exploitation all over the world," said John Shehan, a Senior Vice President at the National Center for Missing & Exploited Children. "We aspire to ensure that every child, regardless of language or location, has the opportunity to reclaim their dignity and privacy by having their illicit content removed from participating platforms."

Take It Down builds on the success of platforms like StopNCII, which helps prevent those seeking to exploit people from sharing adults' intimate images online.

New Resources for Teens, Parents and Teachers to Help Prevent Sextortion

Being targeted by sextortion can be upsetting and isolating, especially for young people, who may feel too scared to ask for help. That's why we've worked with Thorn, a nonprofit that builds technology to defend children from sexual abuse, to develop updated guidance for teens on how to take back control if someone is sextorting them. It also includes advice for parents and teachers on how to support their teens or students if they're affected by these scams. The new resources can be found in our updated Sextortion hub within Meta's Safety Center.

Kelbi Schnabel, Senior Manager at Thorn, said: "Our work with Meta to provide targeted, robust sextortion resources has helped Thorn significantly enhance our efforts in combating sextortion. Our joint initiative is already empowering parents and teens to understand the risks and take action, which is a testament to the power of collaborative action in tackling complex challenges like sextortion. The result of our collaboration underscores the importance of accessible, comprehensive resources in the digital era."

To help make sure teens and parents everywhere know about these scammers and what they can do to avoid them, Meta is launching a global campaign, supporting safety organizations and working with incredible creators around the world to help raise awareness.

Our Work to Help Protect Teens From Sextortion on Our Apps

Today's updates build on the work we already do to help young people know there are steps they can take if someone has shared, or is threatening to share, their intimate images. We show Safety Notices to people on Instagram when they're messaging someone who has shown potentially scammy or suspicious behavior. These Safety Notices urge people to be cautious, encourage them to report any account that threatens to share their private images, and remind them that they can say no to anything that makes them feel uncomfortable. We also direct teens to Take It Down at relevant moments on Facebook and Instagram, such as when they report someone for sharing their private images, for nudity, or for sexual exploitation.

We also work to help protect teens from unwanted contact in the first place. We default teens under 16 (and under 18 in certain countries) into private Instagram accounts when they sign up, which hides their follower and following lists, and we restrict adults over 19 from messaging minors who don't follow them. Last month, we announced stricter default message settings, meaning teens under 16 (and under 18 in certain countries) won't receive messages from anyone they don't follow or aren't already connected to, providing more protection against potential scammers.

Introducing Community Chats: Connecting Your Community in Real Time on Messenger and Facebook

We’ve helped more than a billion people connect with loved ones in a trusted space on Messenger, and we know that people want to engage in real time with larger communities over shared interests. Today, Mark Zuckerberg announced we’ll begin testing the ability for people to start Community Chats in Messenger in the coming weeks, allowing people to create a Facebook Group, start chats and audio channels, and invite others to join their new group all within the app. We’ll also be expanding Community Chats to even more Facebook Groups.

Why Community Chats?

Community Chats let people connect more deeply with communities in real time around the topics they care about in multiple formats, including text, audio and video. The experience seamlessly blends Messenger and Facebook Groups to allow people to connect when, where and how they want. Admins can now start a conversation about a topic and get in-the-moment responses instead of waiting for people to comment on a post. And, rather than navigating multiple topics in a single Messenger group chat, the person who creates the Community Chat can organize chats into categories so group members can easily find what’s most interesting to them. For example, a band’s fan group could have a “Breaking News” category with chats dedicated to new album drops, tour dates and group activities.

How It Works

Check out how Vanessa Yaeger, admin of the Women Who Surf Facebook Group, uses Community Chats to bring her members together in real time for a spontaneous IRL meetup.

Admins can choose from several options to help their communities connect. They can start a chat for group members around a specific topic, an event chat for an outing or meetup, a view-only broadcast chat for admins to announce group-wide updates and an admin-only chat to collaborate with admins and moderators. Admins can also create audio channels so group members can share live commentary or receive real-time support. Participants also have the option to enable video once they’re in the audio channel. For example, the admin of a Facebook Group for chemistry students could create audio channels for study groups during finals season, and participants can turn on video for live tutoring. Community Chats are only accessible to members of a group. To learn more about how to use Community Chats, check out the Facebook Community Blog.


Given the more public nature of Community Chats, we've developed a robust suite of tools to help admins easily manage both chat and audio experiences. This includes moderation capabilities like blocking, muting or suspending group members, and removing members or messages. It also includes Admin Assist, which lets admins set custom criteria that automatically suspend users, remove reported messages, and block messages from ineligible authors, or messages containing violating content, from being sent. Members of Community Chats can also report messages to group admins or Meta, block users, or leave a chat at any time. Learn more about privacy and safety controls in Community Chats.



What’s Next?

We’re committed to building messaging experiences that help people connect with their communities, friends and families. As Community Chats rolls out to more people and groups around the world, we’ll continue exploring new features and capabilities to make it easier to connect with one another.
