Racial dating statistics

Despite having indicated no ethnicity preference, she noticed that all the men she was being sent appeared to be Arab or Muslim (she based this on contextual clues in their profiles, such as their names and photos). This frustrated her: she had hoped and expected to see many different types of men, but she was being served only potential matches who appeared to be of the same ethnicity.


A friend (who wishes to remain anonymous because she doesn't want her family knowing she online dates) noticed something strange recently after she had been using the dating app Coffee Meets Bagel for a while: It kept sending her a certain type of guy.

Which is to say, it kept suggesting men who appeared to be Arab or Muslim.

And Coffee Meets Bagel has indeed analyzed the bizarre and somewhat disheartening data on what kinds of ethnicity preferences its users have.

In a blog post examining the myth that Jewish men have a “thing” for Asian women, the company looked at what the preferences of each race were (at the time, the app’s user base was 29% Asian and 55% white).

“The algorithm is NOT saying that ‘we secretly know you're more racist than you actually are…’ What it's saying is, ‘I don't have enough information about you, so I'm going to use empirical data to maximize your connection rate until I have enough information about you and can use that to maximize connection rate for you.’” In this case, the empirical data is that the algorithm knows people are more likely to match with their own ethnicity.
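The cold-start behavior Kang describes can be sketched as a simple fallback: until a user has enough interaction history, score candidates using a population-level prior (which, per the company's data, favors same-ethnicity matches); once there is enough history, score them using the user's own observed like-rates. This is a minimal illustrative sketch — all names, fields, and numbers are assumptions, not Coffee Meets Bagel's actual code:

```python
# Sketch of the cold-start fallback described above.
# Every name and threshold here is an assumption for illustration.

MIN_HISTORY = 30  # assumed: interactions needed before personalizing

def empirical_match_rate(user_ethnicity, candidate_ethnicity):
    """Assumed population-level prior: people match with their own
    ethnicity at a higher rate (numbers are purely illustrative)."""
    return 0.30 if user_ethnicity == candidate_ethnicity else 0.15

def personal_like_rate(history, candidate_ethnicity):
    """Observed like-rate for candidates of a given ethnicity."""
    shown = [h for h in history if h["ethnicity"] == candidate_ethnicity]
    if not shown:
        return 0.0
    return sum(h["liked"] for h in shown) / len(shown)

def score(user, candidate):
    # Cold start: with too little data about this particular user,
    # fall back on the empirical prior; otherwise personalize.
    if len(user["history"]) < MIN_HISTORY:
        return empirical_match_rate(user["ethnicity"], candidate["ethnicity"])
    return personal_like_rate(user["history"], candidate["ethnicity"])
```

The point of the sketch is the disconnect it makes visible: a brand-new user with "no preference" is scored by the prior, so same-ethnicity candidates win by default.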

Perhaps the fundamental problem here is a disconnect between what daters think selecting "no preference" will mean ("I am open to dating all different types of people") and what the app's algorithm understands it to mean ("I care so little about ethnicity that I won't think it's weird if I'm shown only one group").

“For example, many users who say they have ‘no preference’ in ethnicity actually have a very clear preference in ethnicity when we look at Bagels they like – and the preference is often their own ethnicity.”
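Kang's observation — that stated "no preference" often diverges from behavior — boils down to computing a user's like-rate per ethnicity from their history: if a "no preference" user's rates are lopsided, they have a revealed preference. A short sketch under an assumed data layout (the real system's data model is not public):

```python
# Sketch: detect a revealed ethnicity preference from like history.
# The (ethnicity, liked) pair layout is an assumption for illustration.
from collections import defaultdict

def revealed_preferences(likes):
    """likes: iterable of (candidate_ethnicity, liked_bool) pairs.
    Returns the like-rate per ethnicity shown; lopsided rates from a
    'no preference' user indicate a revealed preference."""
    shown = defaultdict(int)
    liked = defaultdict(int)
    for ethnicity, was_liked in likes:
        shown[ethnicity] += 1
        liked[ethnicity] += was_liked
    return {e: liked[e] / shown[e] for e in shown}
```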

I asked Kang, one of the app’s co-founders, if this seemed sort of like the app telling you, “we secretly know you’re more racist than you think.” “I think you are misunderstanding the algorithm,” she replied.

A similar analysis of women’s preferences showed that, among white women who preferred only one race, 100% preferred white men.

The app’s goal is to use what it has learned about people’s behavior to make the best possible match suggestions.

So how does the algorithm find the rest of these dudes? Anecdotally, other friends and colleagues who have used the app all had a similar experience: white and Asian women who had no preference were shown mostly Asian men; Latino men were shown only Latina women.
