Uncovering racial bias
One of the major debates in the sharing economy in 2016 centred on an academic paper from Harvard Business School documenting racial discrimination on platforms like Airbnb. The researchers studied 6,400 listings in five major US cities using fake profiles with stereotypically racial names, and found that guests with African-American-sounding names were 16% less likely to be accepted for a booking. People of colour also earned less money as hosts.
This sparked the #airbnbwhileblack hashtag, under which users highlighted examples of potential racial bias on the Airbnb platform. Users described how hosts would decline their booking requests for no apparent reason, echoing the study's findings.
The study found that guests with African-American sounding names were 16% less likely to be accepted for a booking
Airbnb's response was a non-discrimination policy with detailed, specific guidance for hosts on how to avoid discriminating based on race, gender, religion and more, but it was widely criticised as not ambitious enough.
Many types of bias
As more and more of these platforms are built, the same issues we face in broader society are showing up in the communities that form around them. Nor is it only racial bias: issues related to gender, sexual orientation, religion and more have surfaced as well.
A growing partnership between reputation venture Deemly and home-sharing disruptor Innclusive gives insight into possible solutions
This is naturally not just a problem for Airbnb. Both Uber and Lyft have been criticised for discriminating against minorities and women. One study revealed that drivers took longer to pick up such passengers, took longer to drop them off, or never picked them up at all. Similar issues have been raised by the LGBT community. So how do we solve these problems in the sharing economy?
Data can prevent discrimination
It seems 2017 is set to continue this discussion, as new research from the University of Michigan's Ross School of Business suggests that more information about guests is key to eliminating bias. Specifically, the researchers tested adding a single host review to each guest profile, which evened out the bias.
It wasn't just positive reviews that swayed hosts to accept bookings. Fictitious guests with negative reviews were accepted at the same rate, with no statistically significant difference between the two name groups. The researchers concluded that Airbnb should incentivise hosts to review new guests, and give guests a more structured way to signal their credibility.
The researchers tested adding a single host review to each profile which evened out the bias
Deemly CEO Sara Green Brodersen believes that using technology and data tools to track and inform reputations in the sharing economy offers a significantly more accurate, standardised and robust way to reduce discrimination. Such reputational scores let guests and hosts quickly and reliably gauge past reliability across the sharing landscape, removing much of the subjectivity and interpretation inherent in many of today's review systems.
Tackling discrimination from the start
When Airbnb competitor Innclusive launched in late 2016, it took on the mammoth task of designing discrimination-free technology and processes. It opted to remove the temptation for bias by changing when hosts see certain biographical information. It also built its organisation and business practices around using available data and monitoring to predict and act on observed biases on its platform.
While both of these sharing companies are still young, they are tackling the issues of bias and discrimination head-on
Innclusive's Head of Strategy, Kevin Simmons, believes companies have a responsibility to lead by example; in the sharing economy, that means insisting that everyone is able to benefit fairly. While he supports the idea that greater data availability can increase comfort levels between participants, he warns that it is not a cure-all: companies must ensure their technology and practices do not enable biased actions even in a data-rich environment.
While both of these sharing companies are still young, they are tackling the issues of bias and discrimination head-on, and showing that data and bias-free business design offer some of the best solutions for combating prejudice and discrimination.
Special thanks to Kevin Simmons, Director of Strategy & Business Development at Innclusive, for contributing to this piece. Innclusive is the leading home-sharing platform for minority groups; founded in 2016, it already has hosts in more than 130 countries.