• danA
    4 months ago

    I’m pretty sure Google uses their TPU chips

    The Coral ones? They don’t have nearly enough memory to handle LLMs - they only have 8MB of on-chip SRAM and only support small TensorFlow Lite models.
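    A rough back-of-the-envelope sketch of why this is true (the ~8 MiB figure is the Edge TPU’s on-chip parameter cache; the function name and the 4-bit quantization assumption are just illustrative):

    ```python
    # Coral Edge TPU has roughly 8 MiB of on-chip SRAM for model parameters;
    # anything larger spills to host memory and runs far slower.
    EDGE_TPU_SRAM_BYTES = 8 * 1024 * 1024

    def fits_in_edgetpu_cache(model_size_bytes: int) -> bool:
        """Rough check: can the model's weights stay entirely on-chip?"""
        return model_size_bytes <= EDGE_TPU_SRAM_BYTES

    # Even a 7B-parameter LLM quantized to 4 bits is ~3.5 GB of weights -
    # orders of magnitude beyond the cache.
    llm_weights = 7_000_000_000 * 4 // 8  # bytes at 4 bits per parameter

    print(fits_in_edgetpu_cache(llm_weights))      # False
    print(fits_in_edgetpu_cache(5 * 1024 * 1024))  # True: a small TFLite model
    ```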

    Google might have some custom-made non-public chips though - a lot of the big tech companies are working on that.

    instead of a regular GPU

    I wouldn’t call them regular GPUs… AI workloads often run on products like the Nvidia H100, which is specifically designed for AI - it doesn’t even have any video output ports.