Press "Enter" to skip to content

The Case For and Against RateMyProfessors.com

RateMyProfessors.com (RMP) has become the leading source students use when researching instructors. But students and professors often have different opinions about the usefulness of the site.

Originally founded in 1999 as TeacherRatings.com by software engineer John Swapceinski, RMP is now the largest online professor rating website in the nation. Although the website has been sold several times, it is currently owned by Cheddar Inc., the live-streaming financial news network. According to a study done by the Professional Leadership Institute, RMP has over 20 million ratings, 1.8 million professors, and an estimated 9 million monthly visitors.

Despite its fame and popularity, some debate exists around whether RMP is a credible resource or not. Many students find RMP to be invaluable, but some professors say that its ratings are biased and untrustworthy.

In a random survey conducted by The Telescope, which received 37 responses from Palomar students, a majority (68%) said that they use RMP before enrolling in classes. When asked how important professors' RMP scores were to them, 73% classified website ratings as "important" or "very important," and 76% said that RMP scores were highly influential in their class choice.

In fact, many stated that RMP is their only resource for choosing classes other than their friends’ opinions.

"I use it for every class," said Palomar College communications student Jasmine Vuong.

Vuong, who is finishing her first year at Palomar, said, "I think [RMP] is really important because teachers are a really important part of your education." She said that RMP is her number-one resource when searching for classes and that she trusts other students when they share their experiences on the site.

Chloe Chemtob, a computer animation major, said she finds RMP to be helpful, but said, “When people [on RMP] give their personalized opinions on how to pass the class, I don’t find that to be super accurate.”

She said every student has their own study style and that putting that information in a review was not personally helpful for her. Yet, Chemtob said that she still writes reviews for both good and bad professors.

Julianne Littlefield, a humanities student, said, "It helps me a lot, otherwise I don't really know what to do. There's usually three or four different options for professors… and I don't know which one to choose." She also relies on friends and their recommendations, but word of mouth has its limitations.

Even though most students find RMP to be helpful, some professors don’t see it in the same light.

"RMP data is flawed, inaccurate, biased, and does not represent the truth," wrote Texas State University professors Alexander Katrompas and Vangelis Metsis in a 2021 study presented at the 2nd International Conference on Computing and Data Science (CDS).

"We found indications that gender is a factor and it is clear that gender bias exists on Rate My Professors. In addition, this bias appears to be strongest for women in STEM," Katrompas and Metsis wrote in their study.

A current Palomar College instructor who wished to remain anonymous said, “Based on prevailing research and 25 years of teaching experience, I see few redeeming features of RMP. It’s hardly more than a thinly disguised popularity contest that contributes to our culture of anti-intellectualism and lowering of standards in higher education.”

“RMP has too many flaws to be considered a serious source of accurate information… because of its manifold flaws, inaccurate reviews, unlimited reviews, impostor reviews, voluntary response bias, gender bias, STEM bias, anti-rationalism, anti-intellectualism, etc., it misleads students about their own higher education – which can even affect their career prospects,” the instructor said.

However, not all professors are in agreement.

"I tell my students that you should use it," said Palomar College biology professor Beth Pearson. "If I'm going to find an honest opinion, then, that's probably the place I'm going to find it," she said. But Pearson also admitted that RMP is not perfect.

“I don’t think RateMyProfessors is completely accurate or truthful, but like any rating, like Yelp… you know how to read through the comments,” she said.

Pearson also addressed the issue of anti-intellectualism. She said, “I tell my students, yeah, read through the comments. But don’t choose the easy people, because you’re not going to leave Palomar a better student.”

Pearson said she encourages her students to choose professors that are passionate about their subject. “How will you know that other than word of mouth, right? What a powerful tool,” said Pearson.

But if RMP is in fact flawed, as Katrompas and Metsis's study indicated, is there another way for students to research and evaluate professors and classes?

Pearson said she believes there could be. She serves on the Committee for Tenure and Evaluations, where the idea of implementing a different kind of evaluation system has come up.

When Pearson taught at CSUF, every class was reviewed at the end of every semester.

"I would love a public rating system that was done every class, every semester and available for students to look at," said Pearson. "I think it gives more strength to the student voice."

Students recognize that RMP can be useful, but it also has its limitations. Until better resources are available, choosing college classes is still like sailing through murky waters.
