How algorithms on dating apps are contributing to racism in our love lives
At a time when racial inequality dominates the headlines and the Black Lives Matter movement gains momentum, there is a renewed focus on the role that ethnicity filters and algorithms on dating apps play in reinforcing unconscious bias and racial profiling. What part are your dating 'preferences' playing in this?
“It’s really horrible,” declares writer and fat acceptance advocate Stephanie Yeboah about her experience as a plus-size black woman on dating apps. “White men in particular tend to reinforce stereotypes about black women,” she explains. “They say things like, ‘I’ve never been with a girl with dark skin before’, or, ‘I’ve heard you guys are really aggressive and hypersexual’. It makes me feel very othered.”
As someone who has taken on the word ‘fat’ and owned it by turning it into a factual, descriptive term rather than an instant negative, Stephanie is a breath of fresh air. She’s even written a book called Fattily Ever After. But it’s clear within minutes of chatting to her about the dating world that, unsurprisingly, a lot of it stinks.
“People find insidious ways of saying that they just want to date a white person, adding messages like ‘No Blacks, No Asians, No Middle Easterns’ to their profiles, the implication being that they want someone with blonde hair and blue eyes,” she says.
The proliferation of racial bias (both overt and unconscious) that Stephanie describes is not new. An infamous 2014 study by OKCupid found that black women and Asian men were likely to be rated lower than other ethnic groups on the site.
A blog post about the study (which has now been deleted) looked at the interactions of 25 million people between 2009 and 2014. Users’ ‘preferences’ on the site reflected racial bias from the real world.
But at a time when public discourse is centred on racial inequality and solidarity with the Black Lives Matter movement there is an overarching feeling that enough is enough. Racial profiling on dating apps is being recognised as part of the problem and is finally being clamped down on.
Grindr recently announced that it will be removing its ethnicity filter in the next update of the app, after years of receiving criticism for allowing racism to run rife on the platform.
In 2018 the dating and hook-up app, which is popular with gay, bisexual, trans and queer people, launched a campaign to make the platform ‘Kindr’, acknowledging its toxic elements. It took that a step further in 2020 with changes to filters in an effort to address ongoing problematic behaviour. There are now calls for other apps like Hinge to follow suit.
Many dating platforms are also keen to demonstrate that they are cognisant of the cultural and social zeitgeist. Adapting a platform’s functionality, such as removing problematic filters, is just one way of reading the room. Other platforms are showing they ‘get it’ by adding new features. “OkCupid have initiated a BLM hashtag so that people can add it to their profile and Bumble has also added a BLM filter,” says Stephanie about some of the recent changes to the spaces she’s been using.
Whether this is a short-term performative move or a concerted effort to bring lasting change remains to be seen. Stephanie sees it as a positive that could develop into something more long-term: “If they can keep it up so that it’s a more permanent thing beyond this time when people are posting black squares on timelines then that would be a good thing.”
The fact that these changes are happening acknowledges that a problem exists. Yet, tackling racial prejudice on dating apps is not a straightforward endeavour. It’s complicated. Human beings have long made romantic choices based on someone’s looks, socio-economic background, status, education, religious or ethnic group. But this has been deeply affected and challenged by social, cultural and technological change.
“In big cities there is a lot more interaction between ethnic groups, so a lot of the racial endogamy that existed before doesn’t necessarily work any more,” says Viren Swami, a Professor of Social Psychology at Anglia Ruskin University and the author of Attraction Explained: The Science Of How We Form Relationships.
Yet a look at the dating market shows that it is still very much catering to people who want to state a ‘type’ or ‘preference’ or remain within a certain group, even if on the face of it that group isn’t defined by race. There is literally an app for everything, from sites like J-Date and Muzmatch, which cater to religious groups, to platforms for the rich and influential such as The League or Luxy, where professional success, education, net worth and number of Instagram followers mean something.
Unpacking what the implications of filters on dating apps really mean is like peeling back the layers of an onion where each layer reveals something new. The layer between ‘type’ and ‘preference’ resides dangerously close to ‘bias’ and ‘prejudice’ - much of which goes unnoticed even by the source.
Dr Pragya Agarwal, a behavioural scientist and author of SWAY: Unravelling Unconscious Bias, explained to Glamour that we all carry biases or prejudices we may not be aware of, and that these affect how we interact with others. Internalised stereotypes shape how we perceive people who do not fit within a certain stereotype or ‘ideal’.
Recent images showing white women attending BLM demonstrations holding signs with sexualised messages about black male bodies went viral - but not for the reasons they may have expected. Stating a preference in this way is misguided and is unwittingly contributing to the problem. It objectifies and fetishises black men into one homogenous group and others them in the process. “Some people think they’re being allies. With imagery like this, call it out. Until people understand why it’s problematic it’s not going to change,” says Prof Swami.
Existing biases whether conscious or unconscious are also revealing themselves through algorithms. Think about your dating app algorithm as a recipe that involves collecting ingredients (information) to make (process) the perfect bread (match) except the result of what comes out of the oven isn’t always necessarily nutritious or satiating (long lasting).
Dating apps give the impression that the technology they’re using and the data they’re collecting amount to a magic recipe: feed in enough specific choices and the algorithm will predict a successful match.
This is the proprietary formula that so many dating platforms are secretive and protective about. “Algorithms are trying to put people together based on simple or surface information. But human beings aren’t a match score,” says Prof Swami. “Humans are complex, relationships are messy, people come with baggage from previous relationships or from their parents or carers. An algorithm can’t predict that in advance.”
The flawed reality of algorithms is something that online daters appear to be wise to. I carried out a very unscientific piece of research asking my social media followers to tell me if they’d encountered prejudice or bias on dating apps (I didn’t specify racism). One of the respondents, a South Asian woman in her 30s based in Delhi, expressed her discomfort at elitism and colourism online. “Some of it is set up so casually that most don’t even question the bias,” she explained. “Here in India caste and complexion are options for preferences and there are apps that only cater to alumni from tier I and II universities. My family wanted me to join Elite Matrimony. Their argument was it was convenient because the men on there would be highly educated and ‘prefer’ educated women. I have also found it odd how dating apps like Promatch, Aisle and TrulyMadly to a degree rely on LinkedIn profiles in their algorithms.”
Another, a white woman based in London in her 20s, outlined her scepticism about the efficacy of the technology. “I truly believe that the filtering of partners is a hindrance. The way these apps work is through an algorithm based on who you’ve liked and who you’ve disliked, what your bio says and what theirs says, where you went to school etc. Call me a romantic but can an algorithm really lead you to your ‘perfect match’? The point is, the perfect match doesn’t exist but these apps lead you to believe it does. This can only result in feeling unfulfilled,” she wrote in an Instagram DM.
So is there hard evidence that algorithms on dating apps reinforce or even create bias? In 2019 a game called MonsterMatch (created by the tech company Mozilla) lifted the lid on the problem. The game simulates a dating app and teaches users how algorithms size them up through ‘collaborative filtering’.
Many people will be familiar with this from getting a book or film recommendation based on what they’ve just consumed. Applied to dating, this type of filtering can end up separating you from plenty of people you would otherwise match well with. The game demonstrates that algorithms learn from users’ ‘preferences’ and serve those back to them, exacerbating bias in the process.
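To make that mechanism concrete, here is a minimal sketch in Python of user-based collaborative filtering on a swipe matrix. It is not code from any real app: the swipe data, the cosine-similarity scoring and the profile numbers are all invented purely to show how earlier users’ rejections get recycled into a new user’s queue.

```python
# A hypothetical sketch of collaborative filtering on a swipe-based dating app.
# 1 = right-swipe, -1 = left-swipe, 0 = not yet seen. All data is invented.
import numpy as np

# Rows = existing users, columns = profiles. Many early users reject profiles 3 and 4.
swipes = np.array([
    [ 1,  1, -1, -1, -1],
    [ 1,  0,  1, -1, -1],
    [ 0,  1, -1, -1,  0],
    [ 1,  1,  0, -1, -1],
])

def recommend(new_user_swipes, swipes):
    """Rank unseen profiles for a new user by similarity-weighted votes of existing users."""
    sims = []
    for row in swipes:
        # Compare only over profiles both the new user and this user have swiped on.
        mask = (row != 0) & (new_user_swipes != 0)
        if mask.sum() == 0:
            sims.append(0.0)
            continue
        a, b = new_user_swipes[mask], row[mask]
        sims.append(float(a @ b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-9))
    sims = np.array(sims)

    # Predicted score per profile = existing users' swipes, weighted by how similar they are.
    scores = sims @ swipes
    unseen = new_user_swipes == 0
    ranking = np.argsort(-scores)
    return [int(p) for p in ranking if unseen[p]]

# A new user who has only right-swiped profile 0 and stated no preference of any kind.
new_user = np.array([1, 0, 0, 0, 0])
print(recommend(new_user, swipes))
# Profiles 3 and 4 land at the bottom of the queue purely because earlier,
# "similar" users rejected them - the new user's own taste never said so.
```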
But there are also instances where online daters have received biased results even when they’ve not stated a preference. In 2016, Buzzfeed famously reported that users of the Coffee Meets Bagel app were served images of people from their own race even when they’d stated ‘no preference’ for ethnicity. The app’s creators argued that the algorithm wasn’t racist. They said that, in the absence of a stated preference, the algorithm falls back on empirical (observational) data showing that people are more likely to match with someone of their own ethnicity. Glamour reached out to Coffee Meets Bagel to ask if it still uses this method of creating matches and will update this piece upon receiving a response.
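The logic of that defence is easy to sketch. The snippet below is a hypothetical illustration, not Coffee Meets Bagel’s actual code, of how a ‘no preference’ setting that falls back on observed match rates can quietly behave like a soft preference for sameness; the rates and field names are invented.

```python
# Hypothetical illustration of a "no preference" fallback. All figures are invented.
OBSERVED_MATCH_RATE = {
    "same_ethnicity": 0.30,       # invented historical rate of mutual likes
    "different_ethnicity": 0.22,
}

def score_candidate(user, candidate):
    """Return an expected-match score used to order a user's suggestions."""
    # An explicit preference, if stated, is respected outright.
    if user.get("ethnicity_preference"):
        return 1.0 if candidate["ethnicity"] in user["ethnicity_preference"] else 0.0
    # With 'no preference', the fallback is the population-level statistic, so
    # candidates who share the user's ethnicity are ranked higher by default.
    same = candidate["ethnicity"] == user["ethnicity"]
    return OBSERVED_MATCH_RATE["same_ethnicity" if same else "different_ethnicity"]

user = {"ethnicity": "A", "ethnicity_preference": None}
print(score_candidate(user, {"ethnicity": "A"}))  # 0.3 - ranked higher
print(score_candidate(user, {"ethnicity": "B"}))  # 0.22 - ranked lower
```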
Yet the question around the role of algorithms and racist behaviour on apps remains. Is it up to the tech companies or the users themselves to address it? Prof Viren Swami believes it should be a mix of both. He says the language we are exposed to can be insidious: “If a user is being explicitly racist, apps have a responsibility to address that. Otherwise it normalises the perception that this language is ok.”
Stephanie Yeboah feels that while race filters are extreme, there is still a place for other, more nuanced filters. “Coming from experience, for plus-sized women the majority of the time we get rejected is because of how we look. It’s horrible to be unmatched because of body type. I do think it’s worth exploring ways of creating filters that would help women like me. Perhaps a feminist-centric space like Bumble could trial new features that would still keep women safe.”
Technology has a transformative power to connect people and be a source of learning that expands horizons. But it is also often a reflection of some of the worst parts of human behaviour. “Apps can play a role within society,” says Prof Swami, “but in order for there to be real progress racism needs to also be tackled outside these spaces. We tend to treat technology as a vacuum. As long as racism exists in society it will exist online.”