Surely AI can analyze artistic patterns, generate music, paint images, and even craft poetry. It can recognize trends, mimic styles, and predict what might appeal to audiences. But the million-dollar question is whether AI can truly imitate the indescribable human touch. Art has always been a uniquely human endeavor, driven by emotion, intuition, and personal experience.
Bob Mankoff, the former cartoon editor of The New Yorker and a cartoonist for 40 years, expresses a conflicted view of AI in the creative field, saying, “I both like it and hate it at the same time.”
Even though he finds AI very helpful for generating many different ideas on anything he’s seen, he questions its utility, stating, “I would not cede to it the ultimate judgment” of what is considered funny, a good song, or a good novel.
Mankoff believes AI is, at its best, a tool. However, he asserts that creators must understand that they, as humans, remain the ultimate arbiters and must not surrender that role to AI.
He warns: “If all of a sudden we cede to algorithms completely… we lose a certain kind of agency in our lives.”
Major tech companies like OpenAI, Meta, and Anthropic are already under increasing scrutiny for their use of copyrighted creative works such as books, music, and visual art to train their generative AI models without obtaining proper authorization. This practice has sparked a series of legal challenges and prompted calls for greater transparency and regulation in the AI industry.
OpenAI and Microsoft were sued by The New York Times for allegedly using millions of its articles without consent, impacting the newspaper’s subscription and advertising revenues, while Anthropic faced a lawsuit from music publishers, including Universal Music Group, for allegedly using copyrighted song lyrics to train its AI chatbot, Claude.
During the case, an expert witness for Anthropic was accused of citing a fabricated source generated by AI, highlighting concerns about the reliability of AI-generated evidence in legal proceedings. Meta was implicated in a lawsuit where internal emails revealed the company had downloaded extensive data from unauthorized sources to train its AI models, with allegations that CEO Mark Zuckerberg personally authorized the use of such data.
Generative AI models require vast datasets to function effectively. Often, these datasets include copyrighted materials scraped from the internet without explicit permission from the rights holders. Companies argue that this constitutes “fair use,” a legal doctrine that allows limited use of copyrighted material without permission under certain conditions. However, many content creators and legal experts dispute this claim, especially when the AI outputs closely resemble the original works.
Mankoff’s perspective underscores the legal and ethical consensus that human authorship remains the cornerstone of creative ownership, even as AI technologies become increasingly sophisticated.
Start-ups that help creative industries sell content to AI companies are gaining investor support, even amid heavy scrutiny.
This shift is more about survival than fairness. The creative industries have long been at odds with AI companies over copyright concerns, which is why Mankoff encourages aspiring creators, emphasizing that they “can find a way to be great in [their] own way” rather than ceding to the AI algorithm.
In response, startups are emerging to ensure creative professionals are compensated for their work. Several marketplaces, including Pip Labs, Vermillio, ProRata, and Human Native, are being developed to enable artists, musicians, and writers to license their content for AI training. This shift, highlighted in the Financial Times, aims to ensure creators profit from the use of their work in AI development, addressing concerns of exploitation. The AI licensing market is predicted to expand significantly, from $10 billion in 2025 to $67.5 billion by 2030. Consequently, investors view these structured content marketplaces as crucial for fostering ethical AI innovation.
As Vermillio’s mission statement puts it: “you own it. we protect it.”
This AI rights management platform is designed to protect creators’ intellectual property (IP) and likeness in the age of generative AI. Its flagship product, TraceID, offers tools to monitor, license, and monetize creative content, ensuring that artists and rights holders maintain control over their work.
Vermillio was founded in 2021 by entrepreneur Dan Neely to help celebrities protect their likenesses from misuse. The company bills itself as “the first generative AI platform” built specifically to protect creative content.
Vermillio safeguards clients by creating a comprehensive “holistic likeness model” from their content. Their software then searches prominent platforms for potential matches. In the era of generative AI, the startup aspires to become the internet’s equivalent of a “blue checkmark,” according to Neely.
In a recent development, the startup secured $16 million in Series A funding, led by Sony Music and DNS Capital, marking the first time Sony Music has invested in an AI music protection service. Neely stated that “the licensing of content that doesn’t exist on the open internet is going to be a big business.”
As Dennis Kooker, president of global digital business at Sony Music Entertainment, shared: “Dan Neely and the team at Vermillio share our vision that prioritizing proper consent, clear attribution and appropriate compensation for professional creators is foundational to unlocking monetization opportunities in this space.”
This year the startup also introduced its 4C Verification program, built on four essential principles: Consent, Credit, Compensation, and Control. The framework ensures that creators determine how and where their content can be used, receive proper attribution for their work, and are fairly compensated when their creations contribute to AI training models.
Most importantly, it allows them to maintain full authority over their creative assets, giving them the ability to monitor and manage their intellectual property in an increasingly digital landscape.