The U.S. Federal Trade Commission (FTC) announced a final rule prohibiting the sale, purchase, and use of fake reviews and testimonials, including AI-generated ones.
The FTC aims to combat deceptive practices in online marketplaces and social media. The rule, set to take effect 60 days after publication in the Federal Register, targets several key areas:
- AI-generated fake reviews: The FTC prohibits businesses from creating, selling, or knowingly disseminating reviews by non-existent individuals or those without actual product experience.
- Incentivized reviews: Companies are barred from offering compensation or incentives for reviews expressing specific sentiments, whether positive or negative.
- Insider reviews: Undisclosed testimonials from company officers, managers, employees, or agents are prohibited.
- Review suppression: The rule bans unfounded legal threats, intimidation, or false accusations to prevent or remove negative reviews.
- Fake social media indicators: Selling or buying artificial followers, views, or other engagement metrics is prohibited when the buyer knows or should know they’re fake.
“Fake reviews not only waste people’s time and money but also pollute the marketplace and divert business away from honest competitors,” FTC Chair Lina M. Khan said in a press release.
The Commission emphasized that case-by-case enforcement without civil penalty authority may not sufficiently deter deceptive practices.
The final rule, approved unanimously by the Commission, follows public comments and an informal hearing held in February 2024.
Merchants have long relied on fake and paid reviews, particularly on Amazon, which revealed it detected more than 200 million suspected fake reviews in 2020.
Yelp reported more than 950 suspicious groups, posts, or individuals engaging in deceptive review practices online in 2021.
Regarding the misuse of AI, YouTuber David Millette recently filed a class action lawsuit against OpenAI, alleging unauthorized use of video content for AI training.