
Meta combats celebrity scam ads with face recognition tech



An anonymised man in a suit looks at a mobile phone (Getty Images)

Facebook and Instagram owner Meta is to introduce facial recognition technology to try to crack down on scammers who fraudulently use celebrities in adverts.

Elon Musk and personal finance expert Martin Lewis are among those to fall victim to such scams, which typically promote investment schemes and crypto-currencies.

Mr Lewis previously told BBC Radio 4’s Today programme that he receives “countless” reports of his name and face being used in such scams every day, and has been left feeling “sick” by them.

Meta’s ad review system already uses artificial intelligence (AI) to detect fake celebrity endorsements, but the company is now seeking to beef it up with facial recognition tech.

It will work by comparing images from ads flagged as dubious with celebrities’ Facebook or Instagram profile photos.

If the image is confirmed to be a match, and the ad a scam, it will be automatically deleted.
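Meta has not published the technical details of this matching step, but the general idea, comparing a face found in a flagged ad against a celebrity’s profile photo, can be sketched with the open-source face_recognition library. The file names and the 0.6 threshold below are illustrative assumptions, not Meta’s implementation.

# Minimal sketch of face matching between an ad image and a profile photo.
# Hypothetical file names; this is not Meta's actual pipeline, which is not public.
import face_recognition

# Load the public figure's profile photo and the image from the flagged ad.
profile_image = face_recognition.load_image_file("celebrity_profile.jpg")
ad_image = face_recognition.load_image_file("flagged_ad.jpg")

# Compute 128-dimensional face embeddings for any faces found in each image.
profile_encodings = face_recognition.face_encodings(profile_image)
ad_encodings = face_recognition.face_encodings(ad_image)

if profile_encodings and ad_encodings:
    # Compare each face in the ad against the profile face; smaller distance = closer match.
    distances = face_recognition.face_distance(ad_encodings, profile_encodings[0])
    is_match = any(d < 0.6 for d in distances)  # 0.6 is the library's default tolerance
    print("Likely the same person" if is_match else "No match found")
else:
    print("No face detected in one of the images")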

Meta said “early testing” of the system had shown “promising results” so it would now start showing in-app notifications to a larger group of public figures who had been impacted by so-called “celeb-bait.”

Deepfakes

The problem of celebrity scams has been a long-running one for Meta.

It became so significant in the 2010s that Mr Lewis took legal action against Facebook, but he ultimately dropped the case when the tech giant agreed to introduce a button so people could report scam ads.

In addition to introducing the button, Facebook also agreed to donate £3m to Citizens Advice.

But, since then, the scams have become more complex and significantly more believable.

They are increasingly powered by so-called deepfake technology, where a realistic computer-generated likeness or video is used to make it seem like the celebrity is backing a product or service.

Meta has faced pressure to do something about the growing threat of these ads.

On Sunday, Mr Lewis urged the government to give the UK regulator, Ofcom, more powers to tackle scam ads after a fake interview with Chancellor Rachel Reeves was used to trick people into giving away their bank details.

“Scammers are relentless and continuously evolve their tactics to try to evade detection,” Meta acknowledged.

“We hope that by sharing our approach, we can help inform our industry’s defences against online scammers,” it added.

Social media

A graphic from Meta illustrating the new features: a video selfie prompt with a button marked “recover”, a woman holding a padlock-style security symbol, and a notification alerting a celebrity to the added protection (Meta)

The tech will also be used for unlocking social media accounts

Meta has also announced it will use facial recognition tech to help people who find themselves locked out of their social media accounts.

Currently, unlocking Instagram or Facebook accounts involves uploading official ID or documents.

But video selfies and face recognition are now being tested as a way to prove who a person is and regain access more quickly.

The material provided by the user will be checked against the account’s profile image to see if it is a match.
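Purely as an illustration of the kind of check described, a frame from a video selfie could be compared against a profile photo using OpenCV and the face_recognition library, as in the rough sketch below. The file names are hypothetical, and this is not Meta’s actual verification system, which also encrypts the selfie and deletes the facial data afterwards.

# Rough sketch of checking a video selfie against a profile photo for account recovery.
import cv2
import face_recognition

# Grab a single frame from the submitted video selfie.
capture = cv2.VideoCapture("video_selfie.mp4")
ok, frame = capture.read()
capture.release()

if ok:
    # OpenCV returns BGR frames; face_recognition expects RGB.
    rgb_frame = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
    selfie_encodings = face_recognition.face_encodings(rgb_frame)

    profile_image = face_recognition.load_image_file("account_profile_photo.jpg")
    profile_encodings = face_recognition.face_encodings(profile_image)

    if selfie_encodings and profile_encodings:
        # True if the selfie face is within the library's default distance tolerance.
        match = face_recognition.compare_faces([profile_encodings[0]], selfie_encodings[0])[0]
        print("Likely the account holder" if match else "Face does not match the profile photo")
    else:
        print("Could not detect a face in one of the inputs")
else:
    print("Could not read a frame from the video selfie")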

However, the widespread use of facial recognition is controversial – Facebook has previously used it, before ditching it in 2021 over privacy, accuracy and bias concerns.

It now says that the video selfies will be encrypted and stored securely, and won’t be shown publicly. Facial data generated in making the comparison will be deleted after the check.

But the system will not initially be offered in areas where permission from regulators has not yet been obtained, including the UK and EU.


