Facebook is working on a new feature that will automatically alert you if it detects that another user is impersonating your account by using your name and profile photo.
Facebook’s Head of Global Safety, Antigone Davis, explains that when Facebook detects that another user may be impersonating you, it will send an alert notifying you about the profile. You will then be prompted to confirm whether the profile in question is impersonating you by using your personal information, or whether it belongs to someone else who is not impersonating you.
Davis said the feature, which the company began testing in November, is now live in about 75% of the world, and Facebook plans to expand its availability in the near future.
She said the impersonation alerts are part of ongoing efforts to make women around the world feel safer using Facebook. The company says it has been hosting roundtable discussions around the world with users, activists, NGOs and other groups to gather feedback on how the platform can better address issues around privacy and safety.
Davis also revealed that the company is testing other safety features as a result of the talks: new ways of reporting nonconsensual intimate images, and a photo checkup feature.
She explains that under the test, when someone reports nudity on Facebook, they’ll have the additional option of not only reporting the photo as inappropriate, but also identifying themselves as the subject of the photo. Doing so will surface links to outside resources — like support groups for victims of abuse, as well as information about possible legal options — in addition to triggering the review process that happens when nudity is reported.
Davis said initial testing of these reporting processes has gone well, but the company is still gathering feedback and research before rolling them out more broadly.
Facebook already has fine-grained privacy controls in place, but users — particularly those in India and the other countries where the feature is being tested — aren’t necessarily familiar with how to use them, Davis said. The photo checkup is meant to bridge that gap by walking users through a step-by-step review of the privacy settings for their photos. The tool is live in India, as well as in other countries in South America, Africa and Southeast Asia.