Think of Bruinwalk.com, UCLA students’ preferred platform for reviewing classes and instructors, as Yelp, and the class you’re thinking of taking as an overpriced California burrito. When enrollment passes arrive and you’re getting hungry, you need to find a restaurant. The places with rave reviews have inconveniently long waits for a table, but you don’t want to drop below two stars and risk food poisoning.
In other words, you’re searching for the sensible choice, the 3.5-star hidden gem where the food comes hot and the service is friendly.
The problem with Bruinwalk, though, is you rarely see moderate reviews.
Crowdsourced review forums like Bruinwalk, a platform operated by the Daily Bruin, aim to provide students with helpful information. However, the opinions they feature are often so extreme that they fail to paint a comprehensive picture of a course, and thus do not accurately inform students. This makes sense: few students would devote time to writing a review unless they had a particularly exceptional experience.
In order to give students more balanced information when it comes to course evaluations, UCLA should consider making the results of its end-of-quarter internal course and instructor evaluations viewable to students. Doing so would not only help students make more informed decisions about the courses they enroll in, but also give professors a less polarized rating system than Bruinwalk offers.
At the end of each quarter, students are highly encouraged to fill out course evaluations through the UCLA Evaluation of Instruction Program on MyUCLA, rating both the class and instructor in numerical and free-response form.
These course evaluations are designed to help instructors improve their teaching style and build fairly standardized performance records for review by their respective departments. However, students do not have access to these results, even though such numerical information would be helpful to students when they are selecting courses.
Instead, students are left to rely on Bruinwalk, which often promotes extreme reviews based on a professor’s personality – either deifying or demonizing what is, more often than not, an average academic experience.
“A lot of times, students evaluate professors based on criteria that (aren’t) useful or fair and you miss out on a middle ground,” said Lauren Dembowitz, a UCLA teaching associate.
Students feel the same way.
“I’ve shied away from certain classes before because they didn’t have ratings online,” said Kelsey Conway, a second-year undeclared student who regularly uses Bruinwalk.
And this isn’t the only way relying on Bruinwalk can skew enrollment. Many of the website’s reviews evaluate personality instead of quality of instruction, promoting subjective accounts as fact and unnecessarily prejudicing students against instructors or classes that might otherwise suit them.
“I get a number of students who say they wanted to take my class due to reviews they have seen on Bruinwalk,” said Steven Levy, a philosophy lecturer at UCLA. “In that case they are taking a class based on personality, not course material, and they are often ill-prepared for the course.”
As such, UCLA should give students access to the numerical ratings for the course and the instructor reviews. Expanding the number of reviews students have access to would help create a balanced review system and highlight the full range of options available to students.
“The more information that is available to a student, the more informed choice that student can make,” Levy said.
And there certainly is a lot of information out there. The EIP’s course evaluations have about a 50 percent response rate, said Marc Levis-Fitzgerald, director of UCLA’s Center for Educational Assessment. This would provide considerably more data than Bruinwalk does, giving students more varied and, presumably, more accurate reviews to draw upon when making enrollment decisions.
Granted, these internal course evaluations are meant for professional development, and opening the results up to students could present privacy concerns, as publishing these results makes a professor’s work portfolio public. But Bruinwalk’s ubiquitous usage has already compromised this privacy.
“Given that there are public forums online where anything can be said, I don’t see making the official information public as being terribly intrusive to my privacy,” Levy said.
Levis-Fitzgerald and Adrienne Lavine, the faculty director of the Office of Instructional Development, also raised concerns about how opening up the results of the survey could compromise the original purpose: to provide instructors with an opportunity to improve on their weaknesses as perceived by students.
This increased transparency, however, could actually inspire more professional growth in faculty. Dembowitz agrees that making reviews accessible to students could be beneficial.
“Something like this might even encourage faculty teaching assistants to feel more accountable,” Dembowitz said.
There is a way to accommodate professors and their development while making sure students are equipped with the information necessary to find the best fit among courses and instructors. Opening up the results of course evaluations would require discussions among faculty and administrators to maintain a comfortable learning environment for instructors.
And these discussions wouldn’t be difficult to facilitate. UCLA can’t afford to ignore the need for standardized, and publicized, course evaluations given the extreme nature and popularity of Bruinwalk reviews.
Ignoring it would not only leave students without the balanced information they need to successfully curate an undergraduate experience, but also abandon instructors to merciless internet trolls.