

Bing searches can return illegal child abuse imagery

The research found that terms like “porn kids,” “porn CP” (a known abbreviation for “child pornography”) and “nude family kids” all surfaced illegal child exploitation imagery. And even people not seeking this kind of disgusting imagery could be led to it by Bing. When researchers searched for “Omegle Kids,” referring to a video chat app popular with teens, Bing’s auto-complete suggestions included “Omegle Kids Girls 13,” which revealed extensive child pornography when searched. And if a user clicked on those images, Bing showed them more illegal child abuse imagery through its Similar Images feature. Another search, for “Omegle for 12 years old,” prompted Bing to suggest searching for “Kids On Omegle Showing,” which pulled in more criminal content.

Bing’s Similar Images feature can suggest additional illegal child abuse imagery

The evidence shows a massive failure on Microsoft’s part to adequately police its Bing search engine and to prevent its suggested searches and images from assisting pedophiles. Similar searches on Google did not produce as clearly illegal imagery or as much concerning content as Bing did. Internet companies like Microsoft must invest more in combating this kind of abuse through both scalable technology solutions and human moderators. Bing has previously been found to suggest racist search terms, conspiracy theories and nude imagery in a report by How-To Geek’s Chris Hoffman, yet it still hasn’t sanitized its results. There’s no excuse for a company like Microsoft, which earned $8.8 billion in profit last quarter, to be underfunding safety measures.

TechCrunch received an anonymous tip about the disturbing problem on Bing after my reports last month on WhatsApp child exploitation image trading group chats, the third-party Google Play apps that made those groups easy to find, and how those apps ran Google and Facebook’s ad networks to earn money for themselves and the platforms. In the wake of those reports, WhatsApp banned more of these groups and their members, Google kicked the WhatsApp group discovery apps off Google Play, and both Google and Facebook blocked the apps from running their ads, with the latter agreeing to refund advertisers.

Following up on the anonymous tip, TechCrunch commissioned AntiToxin to investigate the Bing problem; the firm conducted its research from December 30th, 2018 to January 7th, 2019 with proper legal oversight. Searches were conducted on the desktop version of Bing with “Safe Search” turned off. AntiToxin was founded last year to build technologies that protect networks against bullying, predators and other forms of abuse.
