Kickstarter Introduces New Policy for AI Projects
Kickstarter has recently announced a new policy that will affect projects utilizing AI technology, including video games.
Starting from August 29th, creators will be required to disclose whether their projects involve the use of AI, according to a statement on the official Kickstarter website.
If a project fails to properly disclose its AI usage during submission, it may face suspension.
Kickstarter states that creators who intentionally misrepresent their projects or attempt to bypass its guidelines will be restricted from submitting new projects to the platform.
To be eligible for Kickstarter, projects must provide relevant details on their project page, including how they plan to use AI content and which elements are original work versus those created with AI outputs.
If the project itself is an AI tool, technology, or piece of software, creators must disclose information about the databases and data sources they intend to use, and clarify how those sources handle consent and credit for the data they draw on.
It’s important to note that Kickstarter’s new policy does not ban the use of AI in projects. Instead, it aims to ensure human creative input, proper crediting, and permissions for any artist’s work referenced in the project.
Kickstarter believes that transparent disclosure and specificity regarding AI usage build trust and contribute to project success.
This policy announcement by Kickstarter follows Valve’s recent attempts to clarify its stance on AI. Valve reportedly banned a Steam project that incorporated AI-generated artwork.
In response, Valve stated that it does not aim to discourage the use of AI in games on Steam; its actions were based on existing copyright laws and policies rather than personal opinion.
Valve is still learning how AI fits into game development and how to review games that incorporate AI for distribution on Steam. Its priority remains publishing as many titles as possible.
Introducing AI can sometimes complicate the process of proving sufficient rights to use AI-generated assets, including images, text, and music.