AI Dungeon fans blast Latitude for developer’s new sex filters

AI Dungeon is an open-ended, procedurally generated video game in which players build a classic text adventure-style story by writing creative or clever prompts. In a recent update, AI Dungeon developer Latitude implemented a new system that stops the game from generating sexual content involving minors, and the community is outraged at the implementation.

The system (which the developer is currently testing) was designed to detect “explicit content involving descriptions or depictions of minors.” In that scenario, AI Dungeon will tell the player “Uh oh, this took a weird turn…” and force them to try other prompts. According to a statement from Latitude, the system cast a far wider net than anticipated, sometimes blocking the procedural generation of stories involving children or anything related to certain phrases like “five years old.”

On Tuesday, Latitude published a lengthy blog post explaining the update in an effort to ease the community’s biggest concerns.

Yesterday, we released a test system to prevent the generation of certain sexual content that violates our policies, specifically content that may involve depictions or descriptions of minors (for which we have zero tolerance), on the AI Dungeon platform. We did not communicate this test to the Community in advance, which created an environment where users and other members of our larger community, including platform moderators, were caught off guard. Because of this, some misinformation has spread across Discord, Reddit, and other parts of the AI Dungeon community. As a result, it became difficult to hold the conversations we want to have about what kind of content is permitted on AI Dungeon.

The developer said that the test had unintended consequences, writing: “While this test has largely only prevented the AI from generating sexual content involving minors, because of technical limitations it has sometimes prevented the generation of content that it wasn’t intended to.”

Fans have been reacting to these changes, both the intended function and the unintended side effects, on Latitude’s social media. Some of these posts are memes meant as a general dunk on the developer and little else, while other posts appear to reflect genuine anger.

A snapshot of the AI Dungeon subreddit as of April 28th, showing the community reaction

Users have been sharing examples of their stories coming to an abrupt end when the test system appears to detect questionable content… even when there clearly isn’t any. AI Dungeon allows for unlimited roleplay and storytelling possibilities, and Latitude supports other NSFW material, including sex, violence, and profanity. Some players are alarmed that their private fiction with adult themes could be subject to moderation and read by another person on the development team. Latitude said that its system will flag potentially rule-breaking stories, which can then be further reviewed by a staff member.

“Latitude reviews content flagged by the model for the purposes of improving the model, to enforce our policies, and to comply with law,” the developer said. In response to the question “Is Latitude reading my unpublished adventures?” the developer wrote, “We built an automated system that detects inappropriate content.” We’ve reached out to Latitude for comment and clarification.

This has users worried about their security and privacy, especially if they’ve submitted vulnerable or personal information into the AI Dungeon system. For now, the test system remains in place.

Jobber Wiki author Frank Long contributed to this report.