• Richard@lemmy.world
    21 hours ago

    Untrue. There are small models today that produce better output than previous “flagship” models like GPT-2. Also, far more could be achieved with far less energy by developing novel, specialised hardware (neuromorphic computing).