• Daxtron2@startrek.website · 5 months ago

    Then you haven’t been paying attention. There have been huge strides in small open language models that can run inference locally on a phone with low enough power consumption to be practical.
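
    For a sense of how little is involved, here’s a minimal sketch using the llama-cpp-python bindings, which is one common way to run small quantized models on-device. The model filename is just an example; any small GGUF model (Phi-3-mini, Gemma 2B, etc.) would slot in the same way:

    ```python
    from llama_cpp import Llama

    # Load a small quantized model from a local GGUF file.
    # (Example filename; download whichever small model you prefer.)
    llm = Llama(model_path="phi-3-mini-4k-instruct-q4.gguf", n_ctx=2048)

    # Run a single completion entirely on-device, no network needed.
    out = llm("Explain in one sentence why small local models matter:", max_tokens=64)
    print(out["choices"][0]["text"])
    ```

    The same library is what several Android/iOS local-LLM apps wrap under the hood, which is why a mid-range phone can handle a 2–4B parameter model at quantized precision.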