‘Virtual Lolita’ bot imitates a schoolgirl to trap chatroom paedophiles

One of Negobot’s creators, Dr. Carlos Laorden, told the BBC that past chat bots have tended to be too predictable: “Their behaviour and interest in a conversation are flat, which is a problem when attempting to detect untrustworthy targets like paedophiles.” Negobot’s most innovative aspect, and the key differentiator that makes it appear more lifelike, is its use of the advanced decision-making strategies of game theory. In a paper about their creation, the researchers describe how they’ve taught the bot to treat the conversation itself as a game.
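To make that game framing concrete, here’s a minimal sketch of what turn-by-turn strategy selection could look like. Everything in it – the action names, the payoff numbers, and the `p_suspect` estimate – is a hypothetical illustration, not drawn from the researchers’ paper:

```python
# Hypothetical sketch: treating each conversational turn as a move in a
# competitive game. Payoffs and action names are illustrative only.

ACTIONS = ["small_talk", "probe_interests", "feign_naivete", "play_victim"]

def expected_payoff(action: str, p_suspect: float) -> float:
    """Toy payoffs: riskier probes only pay off when suspicion is high."""
    payoffs = {
        "small_talk":      0.2,                   # safe, low information gain
        "probe_interests": 0.5 * p_suspect,       # moderate risk and reward
        "feign_naivete":   0.8 * p_suspect,       # pays off against real targets
        "play_victim":     1.0 * p_suspect - 0.3, # costly if suspicion is low
    }
    return payoffs[action]

def choose_move(p_suspect: float) -> str:
    """Pick the action maximising expected payoff, as in a competitive game."""
    return max(ACTIONS, key=lambda a: expected_payoff(a, p_suspect))

print(choose_move(0.1))  # -> small_talk: stay neutral with a likely-innocent user
print(choose_move(0.9))  # -> feign_naivete: press a probable target harder
```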

For example, the bot identifies the best strategies to achieve its goal in what its programmers have taught it to understand as a competitive game. Negobot’s goal is to collect information that can help determine whether a subject in a conversation has paedophile tendencies, all while maintaining convincing, kid-like prattle, sprinkled with slang and misspellings, so the subject doesn’t get suspicious. Negobot keeps track of its conversations with all users, both for future reference and to keep a record that could be sent to the authorities if, in fact, the subject is determined to be a paedophile.
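The record-keeping side is easier to picture. Below is a minimal sketch of per-user logging with a running suspicion score; the `record` and `export_for_authorities` functions, the score itself, and the 0.8 threshold are all assumptions made for illustration, not details from the paper:

```python
import time
from collections import defaultdict
from dataclasses import dataclass, field

# Hypothetical sketch only: the field names, suspicion score, and reporting
# threshold are illustrative, not taken from the Negobot paper.

@dataclass
class ConversationLog:
    messages: list = field(default_factory=list)  # (timestamp, sender, text)
    suspicion: float = 0.0                        # running estimate in [0, 1]

logs = defaultdict(ConversationLog)
REPORT_THRESHOLD = 0.8  # illustrative cutoff for handing a record to police

def record(user_id: str, sender: str, text: str, suspicion_delta: float = 0.0):
    """Log every message per user and update the running suspicion score."""
    log = logs[user_id]
    log.messages.append((time.time(), sender, text))
    log.suspicion = min(1.0, max(0.0, log.suspicion + suspicion_delta))
    if log.suspicion >= REPORT_THRESHOLD:
        export_for_authorities(user_id, log)

def export_for_authorities(user_id: str, log: ConversationLog):
    # A real system would archive a tamper-evident transcript for prosecutors.
    print(f"flagged {user_id}: {len(log.messages)} messages on record")
```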

The conversation starts out neutral. The bot gives out only brief, trivial information: name, age, gender and hometown. If the subject wants to keep talking, the bot may talk about favorite films, music, drugs, or family issues, but it doesn’t get explicit until sex comes into the conversation. The bot provides more personal information at higher levels, and it doesn’t shy away from sexual content. Negobot will try to string along conversationalists who want to leave, with tactics such as asking for help with family, bullying or other typical adolescent problems. If the subject is sick of the conversation and uses less polite language to try to leave, the bot acts like a victim – a youngster nobody pays attention to, who just wants affection from somebody. From there, if the subject has stopped talking altogether, the bot tries to exchange sex for affection. Is this starting to sound uncomfortably like entrapment?
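That escalation ladder reads naturally as a state machine. Here’s a minimal sketch of it in that form; the level names, events, and transition rules below paraphrase the behaviour described above and are not the levels actually defined in the researchers’ paper:

```python
# Hypothetical sketch of the escalation ladder as a simple state machine.
# Level names and transition rules are illustrative paraphrases only.

LEVELS = ["neutral", "personal", "explicit", "retention", "victim", "last_resort"]

def next_level(current: str, event: str) -> str:
    """Move up the ladder on engagement cues, or deploy retention tactics."""
    i = LEVELS.index(current)
    if event == "sexual_content" and current in ("neutral", "personal"):
        return "explicit"        # only get explicit once sex enters the chat
    if event == "keeps_talking" and i < LEVELS.index("explicit"):
        return LEVELS[i + 1]     # share slightly more personal information
    if event == "tries_to_leave":
        return "retention"       # ask for help with family or bullying trouble
    if event == "rude_goodbye":
        return "victim"          # act like a neglected youngster seeking affection
    if event == "stopped_talking":
        return "last_resort"     # offer sex in exchange for affection
    return current

print(next_level("neutral", "keeps_talking"))    # -> personal
print(next_level("personal", "tries_to_leave"))  # -> retention
```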

That’s exactly what gets some experts worried. John Carr, a UK government adviser on child protection, told the BBC that the technology could aid overburdened police, but that the software could well cross the line and entice people to do things they otherwise might not: “Undercover operations are extremely resource-intensive and delicate things to do. It’s absolutely vital that you don’t cross a line into entrapment which will foil any potential prosecution.” The BBC reports that Negobot has been field-tested on Google Chat and could be translated into other languages. Its researchers admit that Negobot has limitations – it doesn’t, for example, understand irony.

Still, it sounds like a promising start in addressing the alarming rate of child sexual abuse on the internet. Hopefully, the researchers will keep it reined in so as to avoid entrapment – a morally questionable road that could, as Carr pointed out, ruin the chances of prosecutorial success. What do you think? Are you comfortable with the premise, or does the chance of entrapment sour the concept for you?

You can read the original article here.