In May, OpenAI said it was developing a tool to let creators specify how they want their works to be included in — or excluded from — its AI training data. But seven months later, the feature has yet to see the light of day.
Called Media Manager, the tool would “identify copyrighted text, images, audio, and video content,” OpenAI said at the time, and reflect creators’ preferences “across multiple sources.” It was intended to placate some of the company’s fiercest critics, and potentially shield OpenAI from IP-related legal challenges.
But people familiar with the matter tell TechCrunch that the tool was rarely seen as an important launch internally. “I don’t think it was a priority,” one former OpenAI employee said. “To be honest, I don’t remember anyone working on it.”
A person who coordinates work with the company told TechCrunch in December that they had discussed the tool with OpenAI in the past, but that there had been no recent updates. (These people declined to be publicly identified discussing confidential business matters.)
And a member of OpenAI’s legal team who was working on Media Manager, Fred von Lohmann, moved to a part-time advisory role in October. OpenAI PR confirmed von Lohmann’s move to TechCrunch via email.
OpenAI has yet to provide an update on Media Manager’s progress, and the company missed its self-imposed deadline for the tool to be available “by 2025.” (To be clear, “by 2025” could be read to include the year 2025, but TechCrunch interpreted OpenAI’s language to mean leading up to January 1, 2025.)
AI models like OpenAI’s learn patterns in sets of data to make predictions – for example, that a person who bites into a burger will leave a bite mark. This allows models to learn how the world works, to a degree, by observing it. ChatGPT can write convincing emails and essays, while Sora, OpenAI’s video generator, can create relatively realistic footage.
The ability to draw on examples of writing, film, and more to generate new works makes AI incredibly powerful. But it is also regurgitative. When prompted in certain ways, the models – most of which are trained on countless web pages, videos, and images – produce near-copies of that data, which, although “publicly available,” was not meant to be used this way.
For example, Sora can generate videos featuring the TikTok logo and popular video game characters. The New York Times has gotten ChatGPT to quote its articles verbatim (OpenAI attributed the behavior to a “hack”).
This has understandably upset creators whose works have been swept up in AI training without their permission. Many have lawyered up.
OpenAI is fighting lawsuits filed by artists, writers, YouTubers, computer scientists, and media organizations, all of whom claim the company trained on their works illegally. Plaintiffs include authors Sarah Silverman and Ta-Nehisi Coates, visual artists, and media conglomerates such as The New York Times and Radio-Canada, to name a few.
OpenAI has struck licensing deals with select partners, but not all creators see the terms as attractive.
OpenAI offers creators several ad hoc ways to “opt out” of its AI training. In September, the company launched a submission form allowing artists to flag their work for removal from its future training sets. And OpenAI has long let webmasters block its web-crawling bots from scraping data across their domains.
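The webmaster opt-out mentioned above works through the standard robots.txt mechanism. A minimal sketch, assuming a site owner wants to block all crawling (GPTBot is the user agent OpenAI has published for its training crawler; the file must be served at the site’s root):

```
# robots.txt – block OpenAI's training crawler from the entire site
User-agent: GPTBot
Disallow: /
```

Note that this only instructs OpenAI’s own crawler: copies of a work hosted on other sites, or data collected before the rule was added, are unaffected.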
But creators have criticized these methods as haphazard and inadequate. There are no dedicated opt-out mechanisms for written works, videos, or audio recordings. And the opt-out form for images requires submitting a copy of each image to be removed along with a description, an onerous process.
Media Manager was pitched as a complete revamp – and expansion – of OpenAI’s opt-out solutions today.
In the May announcement, OpenAI said Media Manager would use “cutting-edge machine learning” to enable creators and content owners to “tell (OpenAI) what they own.” OpenAI, which said it was collaborating with regulators as it developed the tool, said it hoped Media Manager would “set a standard across the AI industry.”
OpenAI has not mentioned Media Manager since then.
A spokesperson told TechCrunch in August that the tool was “still in development,” but the company did not respond to a follow-up request for comment in December.
OpenAI has given no indication of when Media Manager might launch – or even which features it might launch with.
Assuming Media Manager does arrive at some point, experts aren’t convinced it will allay creators’ concerns — or do much to settle the legal questions surrounding AI and IP.
Adrian Cyhan, an IP attorney at Stubbs Alderton & Markiles, noted that Media Manager as described is an ambitious undertaking. Even platforms as large as YouTube and TikTok struggle with content identification at scale. Could OpenAI really do better?
“Ensuring compliance with legally required protections for creators, and with any compensation they may be owed, is difficult,” Cyhan told TechCrunch, “especially given the rapidly evolving nature of the law and the potential for it to vary across national and local jurisdictions.”
Ed Newton-Rex, founder of Fairly Trained, a nonprofit that certifies AI companies that respect creators’ rights, believes Media Manager would unfairly shift the burden of controlling AI training onto creators; by not using it, they could be seen as tacitly consenting to their works being used. “Most creators will never even hear about it, let alone use it,” he told TechCrunch. “But it will nevertheless be used to defend the exploitation of creative work against the interests of creators.”
Mike Borella, co-chair of MBHB’s AI practice group, pointed out that opt-out systems don’t always account for transformations that might be made to a work, such as an image that has been downsampled. The systems also may not address the common scenario of third-party platforms hosting copies of creators’ content, added Joshua Weigensberg, an IP and media attorney at Pryor Cashman.
“Creators and copyright owners don’t control, and often don’t even know, where their works appear online,” Weigensberg said. “Even if a creator tells every AI platform that they’re opting out of training, those companies may well continue to train on copies of their works available on other websites and services.”
Media Manager may not be especially beneficial for OpenAI either, at least from a legal standpoint. Evan Everist, a partner at Dorsey & Whitney who specializes in copyright law, said that while OpenAI could use the tool to show a judge that it is mitigating its training on IP-protected content, Media Manager likely wouldn’t shield the company from damages if it were found to have infringed.
“Copyright owners are under no obligation to preemptively tell others not to infringe their works before the infringement occurs,” Everist said. “The basics of copyright law still apply – that is, don’t take and copy other people’s material without permission. This feature may be more about PR and positioning OpenAI as an ethical user of content.”
In the absence of Media Manager, OpenAI has implemented filters – albeit imperfect ones – to prevent its models from regurgitating training examples. And in the lawsuits it is fighting, the company continues to claim fair use protections, asserting that its models create transformative, not plagiaristic, works.
OpenAI could well prevail in its copyright disputes.
The courts may decide that the company’s AI serves a “transformative purpose,” following the precedent set roughly a decade ago in the publishing industry’s case against Google. In that case, a court held that Google’s copying of millions of books for Google Books, a sort of digital archive, was permissible.
OpenAI has said publicly that it would be “impossible” to train competitive AI models without using copyrighted materials – authorized or not. “Limiting training data to public domain books and drawings created more than a century ago could yield an interesting experiment, but would not provide AI systems that meet the needs of today’s citizens,” the company wrote in a January submission to the UK’s House of Lords.
Should the courts eventually declare OpenAI the victor, Media Manager wouldn’t serve much of a legal purpose. OpenAI appears willing to make that bet — or to rethink its opt-out strategy.