
Online reviews: Filter the fraud, but don't tell us how

Reviews are compromised when filter policies are transparent

Date:
January 18, 2024
Source:
Rensselaer Polytechnic Institute
Summary:
When you try a new restaurant or book a hotel, do you consider the online reviews? Do you submit online reviews yourself? Do you pay attention if they are filtered and moderated? Does that impact your own online review submissions?
FULL STORY


A research team comprising T. Ravichandran, Ph.D., professor, and Jason Kuruzovich, Ph.D., associate professor, both in Rensselaer Polytechnic Institute's Lally School of Management, along with Lianlian Jiang, Ph.D., assistant professor in the Bauer College of Business at the University of Houston, examined these questions in recently published research. In a world where businesses thrive or die by online reviews, it is important to consider a platform's review moderation policies, the transparency of those policies, and how both affect the reviews that are submitted.

"In 2010, Yelp debuted a video to help users understand how its review filter works and why it was necessary," said Jiang. "Then, Yelp added a section to display filtered reviews. Previously, Yelp did not disclose information about its review filter. This change presented the perfect opportunity to examine the effect of policy transparency on submitted reviews."

Ravichandran and his team used a difference-in-differences (DID) approach to compare reviews of more than 1,000 restaurants on Yelp with reviews of the same restaurants on TripAdvisor, which did not change its practices and was not transparent about its review filter. They found that the number of reviews submitted to Yelp decreased, and that the reviews that were submitted were more negative and shorter than those on TripAdvisor. Moreover, the more positive a review, the shorter it tended to be.
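The study's exact regression specification is not described in this article, but as a minimal, purely illustrative sketch, a difference-in-differences estimate of this kind is typically obtained by regressing an outcome on a treatment indicator, a post-period indicator, and their interaction. Everything below is hypothetical (the file name and the columns review_count, yelp, post, and restaurant_id), not the authors' code or data:

    # Illustrative difference-in-differences (DID) sketch in Python.
    # Hypothetical panel: one row per restaurant-platform-month.
    #   review_count  - reviews a restaurant received that month on that platform
    #   yelp          - 1 if the row is from Yelp (treated), 0 if TripAdvisor (control)
    #   post          - 1 for months after Yelp disclosed its review filter, else 0
    #   restaurant_id - identifier used to cluster standard errors
    import pandas as pd
    import statsmodels.formula.api as smf

    df = pd.read_csv("restaurant_reviews_panel.csv")  # hypothetical data file

    # The coefficient on yelp:post is the DID estimate: how review volume on Yelp
    # changed after the transparency change, relative to TripAdvisor.
    model = smf.ols("review_count ~ yelp + post + yelp:post", data=df).fit(
        cov_type="cluster", cov_kwds={"groups": df["restaurant_id"]}
    )
    print(model.summary())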

"Platforms are pressured to have content guidelines and take measures to prevent fraud and ensure that reviews are legitimate and helpful," said Ravichandran. "However, most platforms are not transparent about their policies, leading consumers to suspect that reviews are manipulated to increase profit under the guise of filtering fraudulent content."

Platforms use sophisticated software to flag and filter reviews. Once a review is flagged, it is filtered out and not displayed, and it is not factored into the overall rating for a business.
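As a rough illustration of that behavior only (not Yelp's or any platform's actual system; the flagging decision here is simply a stored boolean), the rating logic amounts to averaging over unflagged reviews and hiding the rest:

    # Toy model of the filtering behavior described above: flagged reviews are
    # neither displayed nor counted toward the business's overall rating.
    from dataclasses import dataclass

    @dataclass
    class Review:
        stars: int      # 1-5 rating
        text: str
        flagged: bool   # set by the platform's (undisclosed) filter

    def displayed_reviews(reviews: list[Review]) -> list[Review]:
        """Only unflagged reviews are shown to users."""
        return [r for r in reviews if not r.flagged]

    def overall_rating(reviews: list[Review]) -> float | None:
        """The overall rating averages unflagged reviews only."""
        visible = displayed_reviews(reviews)
        if not visible:
            return None
        return sum(r.stars for r in visible) / len(visible)

    reviews = [
        Review(5, "Best pizza ever!!!", flagged=True),      # filtered out
        Review(4, "Solid food, slow service.", flagged=False),
        Review(2, "Overpriced for what you get.", flagged=False),
    ]
    print(overall_rating(reviews))  # 3.0 -- the flagged 5-star review is ignored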

"Whether or not to be transparent about review filters is a critical decision for platforms with many considerations," said Kuruzovich.

Users may put less time and effort into their reviews if they suspect that their reviews have a significant chance of being filtered, or they may do the opposite to make their reviews less likely to be filtered. Since most fake reviews are overly positive, users may assume that positive reviews are the most likely to be filtered and act accordingly. However, with a transparent policy, those who submit fake reviews may be incentivized to change their ways.

"Review moderation transparency comes at a cost for platforms," said Ravichandran. "Users reduce their contribution investment, or the amount of time and effort that they put into their reviews. This, in turn, affects the quality and characteristics of reviews. Although transparency helps to position a platform as unbiased toward advertisers, the resultant decrease in the number of reviews submitted impacts the platform's usefulness to consumers."

"This research informs businesses on best practices and consumer behavior in the digital world," said Chanaka Edirisinghe, Ph.D., acting dean of the Lally School of Management. "Online reviews pose great opportunity for firms, but also raise complex questions. Platforms must earn the trust of users without sacrificing engagement."


Story Source:

Materials provided by Rensselaer Polytechnic Institute. Original written by Katie Malatino. Note: Content may be edited for style and length.


Journal Reference:

  1. Lianlian (Dorothy) Jiang, T. Ravichandran, and Jason Kuruzovich. Review Moderation Transparency and Online Reviews: Evidence from a Natural Experiment. MIS Quarterly, 2023. DOI: 10.25300/MISQ/2023/16216

Cite This Page:

Rensselaer Polytechnic Institute. "Online reviews: Filter the fraud, but don't tell us how." ScienceDaily. ScienceDaily, 18 January 2024. <www.sciencedaily.com/releases/2024/01/240118122118.htm>.
