I think there’s a vast difference between “I say I can take in images as input for prompts, with limitations” and “I’m using the wrong tool for a completely absurd use case,” as your microscope analogy implies.
An LLM is the wrong tool for image analysis, even if the providers say it’s possible. Possibility doesn’t mean effectiveness, or even usefulness. Like a microscope and onions.