Communication and purchases on online platforms seem relatively anonymous. Online media appear less biased and more progressive: they don't see your weight, height, or skin color, and they don't know your political or social views.

Right? Unfortunately, research suggests otherwise. Statista reports that social networks, whose primary purpose is the exchange and distribution of content, are the leading carriers of hate messages: the number of xenophobic and discriminatory posts grew from 14,310 in 2018 to 17,555 in 2019.

Research now documents racial and ethnic discrimination in many areas of the Internet, from labor markets to loan applications and housing. Two features make this possible. First, markers of race or ethnicity, most often photographs but also subtler cues such as names, can trigger deliberate or unconscious discrimination. Second, platforms give sellers and buyers greater freedom of action in choosing whom they deal with.

After several studies of Airbnb, Uber, and other platforms, researchers found that design flaws, such as displaying a photo and name that may not play in a user's favor, directly affect the spread of discrimination.

What can be done today to reduce discrimination on online platforms?

Forbes writes that digital platform creators need to understand how their design choices and algorithms can lead to discrimination in the marketplace, and that managers can actively investigate and address the problem.

Even within a single industry, platforms often differ in design, leading to different levels of discrimination. For example, HomeAway's vacation rental search results page displays only photos of the property itself; host photos are deferred to the final booking page or not used at all. In contrast, Airbnb has historically included host photos on its search results page.

Here are some tips for reducing discrimination on an online platform.

1) Raise awareness of potential discrimination on platforms

Platforms need to understand how their design choices and algorithms affect the degree of discrimination in the marketplace. This is a natural task for a UI/UX design company, whose specialists should work alongside lawyers and data managers.

Especially for large organizations, it can be helpful to have a team dedicated to monitoring new projects solely for the risk of discrimination.

2) Measure discrimination on the platform

A good way to prevent discrimination on your platform is to measure its current level and to study the racial and gender composition of your audience.

Regularly auditing which users are at risk of discrimination and assessing each group's success on the platform is an essential step toward identifying and resolving problems.
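As a minimal sketch of such an audit, the snippet below computes per-group acceptance rates from a set of hypothetical booking records (the field names and data are illustrative, not from any real platform); a large gap between groups is a signal worth deeper investigation:

```python
from collections import defaultdict

# Hypothetical booking records: each has the guest's self-reported
# demographic group and whether the booking request was accepted.
bookings = [
    {"group": "A", "accepted": True},
    {"group": "A", "accepted": True},
    {"group": "A", "accepted": False},
    {"group": "B", "accepted": True},
    {"group": "B", "accepted": False},
    {"group": "B", "accepted": False},
]

def acceptance_rates(records):
    """Return the acceptance rate for each demographic group."""
    totals = defaultdict(int)
    accepted = defaultdict(int)
    for r in records:
        totals[r["group"]] += 1
        if r["accepted"]:
            accepted[r["group"]] += 1
    return {g: accepted[g] / totals[g] for g in totals}

rates = acceptance_rates(bookings)
# A persistent gap between groups flags a potential problem; a real
# audit would also control for confounders such as listing quality.
```

A raw gap like this is only a starting point; a production audit would add statistical controls, but even this simple report makes disparities visible to the team.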

3) Hide sensitive data

Hiding potentially sensitive user information, such as race and gender, until the transaction has been agreed upon is a practical move. Several platforms, including Amazon and eBay, are already doing this.
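One minimal way to sketch this idea: mask sensitive profile fields in the view shown to a counterparty until both sides have agreed on the transaction. The field names (`name`, `photo_url`, `rating`) and the `public_view` helper are hypothetical, chosen only for illustration:

```python
# Assumption: fields a platform might choose to mask pre-agreement.
SENSITIVE_FIELDS = {"name", "photo_url"}

def public_view(profile: dict, deal_agreed: bool) -> dict:
    """Return the profile as shown to a counterparty.

    Sensitive fields are replaced with "hidden" until the deal is agreed,
    so they cannot influence the decision to transact.
    """
    if deal_agreed:
        return dict(profile)
    return {
        key: ("hidden" if key in SENSITIVE_FIELDS else value)
        for key, value in profile.items()
    }

profile = {
    "name": "J. Doe",
    "photo_url": "https://example.com/p.jpg",
    "rating": 4.8,
}
before = public_view(profile, deal_agreed=False)  # name and photo masked
after = public_view(profile, deal_agreed=True)    # full profile revealed
```

The design point is that non-sensitive signals such as ratings stay visible, so counterparties can still judge trustworthiness without seeing cues that invite bias.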


Most website design and development services companies recognize the importance of eliminating discrimination and bullying on social media and platforms. A clear company policy in this area builds loyalty among all participants and users of online services.

Author’s bio: Anastasiia Lastovetska is a technology writer at MLSDev, a software development company that builds web & mobile app solutions from scratch. She researches the area of technology to create great content about app development, UX/UI design, tech & business consulting.