I wonder if it's a one-time fork, or if Musk wants to continue to derive benefit from Wikipedia authors and editors. Is it possible it's actually a real-time AI filtering each request? If so, that would burn up a massive amount of AI tokens.
It would also open up some great methods to tease out exactly how his filters work. Assuming it's real-time, a Wikipedia page could be created with test content containing specific words or phrases, then the Grok version checked to see whether it alters them. A full map could be built of exactly which rules it's using.
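The probing idea above can be sketched as a simple diff: given the original Wikipedia text and the mirror's version of the same page, check which planted probe phrases survived. This is a minimal illustration with made-up phrases and page text; a real probe would fetch both pages and likely use fuzzier matching than exact substring checks.

```python
# Minimal sketch of the filter-mapping probe described above.
# All page texts and probe phrases below are hypothetical examples;
# in practice you would fetch the live Wikipedia article and the
# mirror's rendering of the same article.

def map_filtered_phrases(source_text: str, mirror_text: str,
                         probes: list[str]) -> dict[str, str]:
    """Classify each probe phrase as kept or altered/removed in the mirror."""
    results = {}
    for phrase in probes:
        if phrase not in source_text:
            # The probe never made it into the source page (e.g. reverted).
            results[phrase] = "absent from source"
        elif phrase in mirror_text:
            results[phrase] = "kept"
        else:
            results[phrase] = "altered/removed"
    return results


# Hypothetical example: the mirror softened one phrase but kept another.
source = ("The policy was widely criticized. "
          "Independent audits confirmed the findings.")
mirror = ("The policy drew mixed reactions. "
          "Independent audits confirmed the findings.")
probes = ["widely criticized", "Independent audits"]

print(map_filtered_phrases(source, mirror, probes))
```

Repeating this over many planted phrases and topics would build up the "full map" of filter rules the comment describes, though exact substring matching would miss paraphrased rewrites.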
It can’t be a one-time fork. Whenever his propaganda changes, he’s going to want to give different instructions to his AI and regenerate the entire encyclopedia, like he already did.
It’s a different use case from an encyclopedia that is based on facts and the truth.