Anthropic Issues Copyright Takedown Requests To Remove 8,000+ Copies of Claude Code Source Code 69
Anthropic is using copyright takedown notices to try to contain an accidental leak of the underlying instructions for its Claude Code AI agent. According to the Wall Street Journal, "Anthropic representatives had used a copyright takedown request to force the removal of more than 8,000 copies and adaptations of the raw Claude Code instructions ... that developers had shared on programming platform GitHub." From the report: Programmers combing through the source code so far have marveled on social media at some of Anthropic's tricks for getting its Claude AI models to operate as Claude Code. One feature asks the models to go back periodically through tasks and consolidate their memories -- a process it calls dreaming. Another appears to instruct Claude Code in some cases to go "undercover" and not reveal that it is an AI when publishing code to platforms like GitHub. Others found tags in the code that appeared to point at future product releases. The code even included a Tamagotchi-style pet called "Buddy" that users could interact with.
After Anthropic requested that GitHub remove copies of its proprietary code, another programmer used other AI tools to rewrite the Claude Code functionality in other programming languages. Writing on GitHub, the programmer said the effort was aimed at keeping the information available without risking a takedown. That new version has itself become popular on the programming platform.
Stolen is one thing (Score:4, Insightful)
Accidentally released is another.
If your process results in making your code public... too late? You published. Learn something from that and update your process.
Re: (Score:2)
If your process results in making your code public... too late? You published.
"Code is law".
Re: (Score:2)
More "information wants to be free".
Once code is published, anybody can learn from it and be inspired to create their own version of it. Trying to police that is impossible.
Re: (Score:3)
Not necessarily impossible...but almost always inadvisable. They can be sure that all their actual competitors already have copies before they get the takedown issued.
In this case I don't think a takedown will even limit the damage...it might well exacerbate it.
Re: (Score:2)
That's a good license question. If the "binaries" (the minimized version) were released under some free-to-use license, the bundled source is still not open source, but they also can't revoke the free-to-use permissions granted by the license for the things they officially release.
Re: (Score:2)
but they also can't revoke the free-to-use permissions granted by the license for the things they officially release.
If you mean sources, yes. If you mean binaries, no. e.g. the GPL permits you to cease distribution of the binary as a remedy for not wanting to provide sources.
Re: (Score:2)
You are thinking about how the GPL interacts with binaries and source. But the point here is that you got a bundle under a license that probably says something like "You can use the provided stuff to develop programs" (and a lot of fine print), which now applies not only to the "binaries" (we're talking about minified JS, but let's treat it like binaries) but also to the source map. It surely is no open source license, but it is still a license granting usage rights (otherwise nobody could legally use claude code).
Re: (Score:2)
You are thinking about how GPL interacts with binaries and source.
I am.
It surely is no open source license, but it is still a license granting usage rights (otherwise nobody could legally use claude code).
But that's absolutely the question here! That's exactly what I'm talking about! The question absolutely is already "can anyone legally use Claude-produced code?" And I don't have a strong opinion because I'm not an IP lawyer, and though I have strong opinions on how it should work, they are not particularly relevant to that argument.
But I do have some thoughts, even though I am not an IP lawyer, on what the argument hinges on, and what the GPL has to say is absolutely relevant here if using Claude does
Re: (Score:2)
The legal problems you're talking about are about training, not about the output. You need fair use to train your LLM with unlicensed text, but you don't need fair use laws to use the outputs.
Re: (Score:2)
The legal problems you're talking about are about training, not about the output. You need fair use to train your LLM with unlicensed text, but you don't need fair use laws to use the outputs.
That is a question which fundamentally has not been answered yet. The legislators and courts will collectively have the final say.
hohoho (Score:5, Insightful)
After Anthropic requested that GitHub remove copies of its proprietary code, another programmer used other AI tools to rewrite the Claude Code functionality in other programming languages. Writing on GitHub, the programmer said the effort was aimed at keeping the information available without risking a takedown. That new version has itself become popular on the programming platform.
Talk about a money shot. If Anthropic argues that this use doesn't wash away restrictions, then they're also arguing that their software is illegal. Shades of copyleft.
Re:hohoho (Score:5, Interesting)
https://github.com/Outcomefocu... [github.com]
I want to see Anthropic choke on this so bad.
Courts still haven't really ruled on AI-generated code in any big countries yet, as far as I can tell. Courts could view AI code the same as AI-generated images: non-copyrightable. Generated images can still be subject to trademark if you try to commercialize them, but code not so much. If code ever gets ruled non-copyrightable, any generated code is open game once it gets leaked. Courts could also rule it is subject to the copyright of the original training data holders.
Both of these outcomes would be equally devastating to the entire industry in entirely different ways. I'm kinda ready to see it all burn.
Re:hohoho (Score:4, Interesting)
Yes. I too want to see the GenAI industry burn to the ground. I am making popcorn as I get ready to watch Anthropic being hoisted on its own petard...
Re: (Score:2)
After Anthropic requested that GitHub remove copies of its proprietary code, another programmer used other AI tools to rewrite the Claude Code functionality in other programming languages. Writing on GitHub, the programmer said the effort was aimed at keeping the information available without risking a takedown. That new version has itself become popular on the programming platform.
Talk about a money shot. If Anthropic argues that this use doesn't wash away restrictions, then they're also arguing that their software is illegal. Shades of copyleft.
No, they're arguing there's ways to use their software to commit an illegal act, which is true of literally anything.
I can't imagine anyone making the argument that using AI tools to rewrite code in another language removes the copyright.
Re: (Score:2)
I'll take 'what is Clean-room design' for $50, Alex
Re: (Score:2)
*BUZZ*
Clean-room design requires that the new model be built from a description of the original by someone with no exposure to the original. You can't look at the code and rewrite it and call it a clean room recreation. Typically it requires separate teams: one to examine the original and document what it does (but NOT HOW), and a second team to build a new model from those specifications.
https://en.wikipedia.org/wiki/... [wikipedia.org]
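The two-team split described above can be sketched in a few lines of Python. Everything here is illustrative (the function names and the toy `add` spec are made up, not any real tool); the one rule that matters is visible in the code: the implementer's only input is the spec, never the original.

```python
# Hypothetical sketch of the clean-room split: Team A (the examiner)
# sees the original and documents WHAT it does, never HOW; Team B
# (the implementer) sees only that spec. All names are illustrative.

def examine(original_source: str) -> str:
    """Team A: describe observable behavior only, not implementation."""
    # A real spec would be a careful functional description; this toy
    # one just states the contract of the (pretend-proprietary) code.
    return "add(a, b) must return the integer sum of a and b"

def implement(spec: str) -> str:
    """Team B: write fresh code from the spec alone."""
    # Team B's only input is `spec`; the original never crosses this line.
    return "def add(a, b):\n    return a + b"

original = "def add(x, y): return x + y  # pretend this is proprietary"
spec = examine(original)
new_code = implement(spec)   # produced without ever seeing `original`

ns = {}
exec(new_code, ns)
print(ns["add"](2, 3))  # 5
```

The legal value comes entirely from the separation, which is why the documentation trail (who saw what, and when) matters as much as the code itself.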
Re:hohoho (Score:4, Interesting)
You say "two separate teams," but Claude pronounces it as "two separate processes with no IPC except the specs files."
Re: hohoho (Score:3)
One AI to read the code and make the spec, then several other AIs to create a new AI.
Soon you'll see Daleks
Re: (Score:2)
*BUZZ*
Clean-room design requires that the new model be built from a description of the original by someone with no exposure to the original.
Look at the Rust repo. They are 100% clear that they did exactly that. Each team consisted of only one agent (an advantage, since an agent can work much faster than a human team, but it still clearly fills the role of a complete software team). One team (agent) saw the code and wrote a spec. A different team (second agent) saw the spec and wrote Rust code to match it.
Re: (Score:2)
You realize that's extremely easy to do with AI, right?
If I can run software, I can run function/system tracing and introspection on it. Simply that. The difference between this and looking at the code is almost negligible, except in degree of difficulty.
And with AI, I can automate the entire process.
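The grandparent's point is easy to demonstrate in miniature with Python's built-in tracing hook (a sketch only; `mystery` and `helper` are made-up stand-ins for opaque code): you can recover a function's call structure at runtime without ever reading its source.

```python
# Minimal sketch of runtime introspection: observe which functions a
# black-box call actually executes, without reading any source code.
import sys

calls = []

def tracer(frame, event, arg):
    # The global trace function receives a 'call' event each time a
    # new Python frame is entered; record the function's name.
    if event == "call":
        calls.append(frame.f_code.co_name)
    return tracer

def helper(n):
    return n * 2

def mystery(n):          # stand-in for the opaque code under study
    return helper(n) + 1

sys.settrace(tracer)
result = mystery(5)
sys.settrace(None)

print(result)   # 11
print(calls)    # ['mystery', 'helper'] -- the observed call structure
```

Scale that idea up to syscall tracing and state diffing, hand the transcripts to an agent, and the "scope of difficulty" gap really does shrink fast.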
Re: (Score:2)
I can't imagine anyone making the argument that using AI tools to rewrite code in another language removes the copyright.
I can't imagine anyone not understanding that their right to exist is based in part upon that belief, because otherwise they have been willfully aiding and abetting mass copyright infringement.
Re: (Score:2)
No, they're arguing there's ways to use their software to commit an illegal act, which is true of literally anything.
I can't imagine anyone making the argument that using AI tools to rewrite code in another language removes the copyright.
It had better remove copyright, since all AI coding agents are huge engines for regurgitating code harvested from github and stack overflow without any attribution or respect for the original licenses. If this use doesn't elide copyright, then the LLMs themselves have no right to exist.
Re: (Score:2)
Literally everyone is making that argument right now, like ChartDev:
https://www.theregister.com/20... [theregister.com]
Re: (Score:2)
That argument has already been made [lwn.net].
Re: (Score:3)
Re: (Score:2)
Interesting idea, but most likely both AI systems were trained on the original code in question. I can only imagine the millions being spent to make a horrible law ensuring Anthropic's code base stays copyright protected.
Re: (Score:2)
Interesting idea, but most likely both AI systems were trained on the original code in question. I can only imagine the millions being spent to make a horrible law ensuring Anthropic's code base stays copyright protected.
Anthropic may well have trained their agents on some of their customers' code. I guarantee you that they have never trained it on their own core source code.
John Gilmore (Score:4, Informative)
I went to a party at his very nice Victorian house in San Francisco once, when I used to hang out with Sun nerds.
Re: (Score:2)
"The Net interprets censorship as damage and routes around it".
This utter nonsense was debunked decades ago.
They should ask the MPAA.... (Score:5, Insightful)
... how well taking down DeCSS [wikipedia.org] worked out.
Stupid (Score:2)
Re:Stupid (Score:5, Insightful)
This is actually a smart move if they envision ever trying to go after other companies for using their code. "If it wasn't for public use, why didn't you even try to get the distributor to take it down?"
Re: (Score:2)
Correct. Even if they can win the case for copyright, they would lose things like triple damages for wilful infringement. All a defendant would have to do is point to a different repo with no copyright identification and say that they copied from that without realizing it was the Anthropic code. As it is, there will be no such repo, except on dark web sites so that claim will be much harder to make.
Re: (Score:2)
Re: (Score:2)
I'm sure there are techniques which will be readily implemented in at least one of a half dozen different agent platforms - assuming there's any merit in it. It's too stupidly easy to build things now to really keep anything "unique" private. People will figure it out and do as they will to get things they want to use.
I personally have an agent framework that's a combination of capabilities of different agent platforms that does things the way I want to. I haven't shared it, though judging by the quality an
Hahaha, Streisand Effect, hahahahaha (Score:4, Insightful)
Apparently these noobs have never heard of it.
Oh the Irony .. (Score:5, Insightful)
Re: (Score:2)
Oh the Irony. a company whose business is built on other peoples works is suing for copyright infringement.
Poetic justice.
Grab Your Popcorn (Score:2)
Re: (Score:2)
Tee shirts printed with Claude source code when?
Too late. (Score:2)
Claude has already seen it.
April Foos! (Score:2)
Anybody else around here long enough to remember the good ol' days when Slashdot would post (exclusively?) fake articles on April Fool's Day every year?
Re:April Foos! (Score:5, Funny)
Re: (Score:2)
Remember? We've been actively trying to forget.
We didn't know how good we had it. Now, we wish the absurdities were just April Fool's Day nonsense...
Re: (Score:2)
Remember? We've been actively trying to forget.
We didn't know how good we had it. Now, we wish the absurdities were just April Fool's Day nonsense...
I noticed this across *all* the news sites I read on April 1st. No "joke" news on any of them, that I noticed. I can only conclude that this is because the real news in 2026 is so batshit insane. Poe's Law and all that.
Re: (Score:2)
Thanks for the insight. I saw a couple of very obvious April's fools messages before going to bed and then ignored the day. Now I understand the somewhat depressing reason why.
Re: (Score:2)
Yes, it was miserable nonsense.
Re: (Score:1)
This affects every future NDA... (Score:2)
They also harmed the community (Score:2)
And if you forked the very public, legitimate project at https://github.com/anthropics/... [github.com], they also DMCA'ed you and removed that fork, despite the fact that it is perfectly legitimate and considered normal on GitHub to fork public repos like that, both as an act of social support and to help find and report bugs.
Boo hoo (Score:1)
Re: (Score:2)
What does this have to do with Sam Altman? This is about Anthropic.
Anthropic is spiraling (Score:2)
Anthropic seems to be spiraling of late, doing a lot of things which are shooting themselves in the foot.
- This "accidental" code release (I'm not convinced it was an accident and not a fancy PR stunt)
- The complete nerfing-to-useless of Claude Max plans (less usage, heavily throttled to the point where even getting close to quota has been impossible, and waiting 30m+ for a simple prompt response often takes longer than doing it myself)
- Consistent API outages for the past several weeks during US business ho
They don't get it, do they? (Score:2)
Once something's out, it's out. People have made their own offline copies, outside of your dirty hands' reach. You can't take it down by issuing a takedown notice. It doesn't work like that.
Sure... (Score:2)
Sure...certainly, I'll totally delete the source code....
Fucking clown shoes, y'all.
Think I'll be shorting Anthropic....
claude rewrite this codebase in Rust ... (Score:2)
and change codebase layout and structure, prompts and assets just enough to make it look sufficiently different.
AI can help with this (Score:2)
Are they suddenly against massive theft? (Score:1)
"Hey! Massive copyright infringement and intellectual property theft is only allowed when WE do it, not anyone else!"