People love talking to AI, and some a little too much. According to contract workers for Meta who review people's interactions with the company's chatbots to improve its artificial intelligence, users are a little too willing to share personal, private information with Meta's AI, including their real names, telephone numbers, and email addresses.
Business Insider spoke with four contract workers hired through Alignerr and Scale AI-owned Outlier, two platforms that use human experts to train AI, and the contractors said that unredacted personal data was more common in the Meta projects they worked on than in similar projects for other Silicon Valley clients. According to these contractors, many users shared very personal data across Meta's various platforms, such as Facebook and Instagram. Users spoke to Meta's AI as if they were talking to friends or even romantic partners, sending selfies and even "explicit photos."
To be clear, people growing too attached to their AI chatbots is well documented, and Meta's practice of using human contractors to evaluate the quality of AI assistants and improve future interactions is hardly new. Back in 2019, The Guardian reported that Apple contractors regularly heard extremely sensitive information from Siri users, even though the company had "no specific procedures to deal with sensitive recordings" at the time. Similarly, Bloomberg reported that Amazon had thousands of employees and contractors around the world manually reviewing and transcribing audio clips. Vice and Motherboard reported that Microsoft contractors reviewing recorded voice content often heard audio captured on users' Xbox consoles through accidental activation.
Meta, however, is a different story, especially in light of its track record over the past decade when it comes to reliance on third-party contractors and the company's missteps in data management.
Meta's checkered record on user privacy
In 2018, The New York Times and The Guardian reported how Cambridge Analytica, a political consulting firm backed by Republican hedge fund billionaire Robert Mercer, harvested data from tens of millions of Facebook users without their consent, using that data to profile US voters and target them with personalized political ads to help elect President Donald Trump in 2016. The scandal led to Facebook being hit with a $5 billion fine from the Federal Trade Commission (FTC), one of the largest privacy settlements in US history.
The Cambridge Analytica scandal exposed broader problems with Facebook's developer platform, which enabled enormous data access with only limited oversight. Internal documents published by whistleblower Frances Haugen in 2021 revealed that Meta's leadership often prioritized growth and engagement over privacy and safety concerns.
Meta has also faced scrutiny over its use of contractors: in 2019, Bloomberg reported that Facebook paid outside contractors to transcribe users' audio clips without users knowing how the recordings had been obtained. (Facebook said at the time that the recordings came only from users who had opted in to transcription services, and added that it had "paused" the practice.)
Facebook has tried to rehabilitate its image for years: it renamed itself Meta in October 2021, with the name change framed as a forward-looking shift in focus toward the "metaverse" rather than a response to controversies over misinformation, privacy, and platform safety. But Meta's legacy of data handling casts a long shadow. While using human reviewers to improve large language models (LLMs) is common industry practice at this point, the latest reporting on Meta's use of contractors, and the information those contractors say they are able to see, raises new questions about how data is handled by the parent company of some of the world's most popular social networks.
In a statement, a Meta spokesperson said that the company has "strict guidelines that govern access to personal data for all employees and contractors."
"While we work with contractors to improve the quality of training data, we intentionally limit what personal information they see, and we have processes and guardrails in place governing how they can handle any such information they may encounter," the spokesperson said.
"For projects focused on AI personalization, contractors are permitted in the course of their work to access certain personal data in accordance with our publicly available privacy policies and AI terms. Regardless of the project, any unauthorized sharing or misuse of personal information is a violation of our data policies, and we will take appropriate action," the spokesperson added.