That’s an interesting point, and leads to a reasonable argument that if an AI is trained on a given open source codebase, developers should have free access to use that AI to improve said codebase. I wonder whether future license models might include such clauses.
Oh good. They can show us how it’s done by patching open-source projects, for example. Right? That way we’ll see that they’re not full of shit.
Where are the patches? They have trained on millions of open-source projects after all. It should be easy. Show us.