Nextdoor's recent anti-racism statement suggests that the company is aware of the racism on its platform.
According to the app itself, Nextdoor is a place where communities "come together to greet newcomers, exchange recommendations, and read the latest local news." It is a hyperlocal closed social media platform, which, as BuzzFeed states, "is a great place to complain about a neighbor’s barking dog."
In March this year, the philosopher Simone Webb wrote about Nextdoor for Logically, saying that the community is "rooted in a sense of exclusion, and fear, of outsiders," adding that "this fear breeds a lack of compassion, smothering the kindness that the app claims to cultivate."
Others have also commented on the hostility and racism of the Nextdoor community.
On 20 April 2021, Vice published an article titled "Nextdoor knows its users are racist," citing the app's recent anti-racism statement. The statement in question says that content related to All Lives Matter and Blue Lives Matter is sometimes prohibited, while content related to White Lives Matter is always prohibited.
The fact that Nextdoor had to release such a statement does indeed suggest that the company is aware of racism on its platform (though it is of course also true that other social media platforms have released statements on racism and the Black Lives Matter movement).
According to an article by Mashable, "cases of racial profiling on the service — the classic curtain-twitching posts about 'sketchy' Black men — were showcased as early as 2015."
In 2020, the Verge referred to the app's "Karen" problem, citing the experiences of a woman who had moved into a predominantly white neighborhood, as well as the racism built into the service itself. For instance, the article mentions instances in which users of color have been reported for using a fake name when they had in fact used their real names.
In 2020, Nextdoor deleted posts about racial injustice and the Black Lives Matter movement, and its CEO admitted responsibility. As NPR reported, "Nextdoor CEO Sarah Friar said the company should have moved more quickly to protect posts related to Black Lives Matter by providing clearer guidance," adding that Friar said it "was really our fault" that moderators were deleting the posts.