☆ Yσɠƚԋσʂ ☆@lemmy.ml to Technology@lemmy.ml · 10 months ago
OpenAI could be on the brink of bankruptcy in under 12 months, with projections of $5 billion in losses
www.windowscentral.com
cross-posted to: technology@lemmygrad.ml, technology@hexbear.net
☆ Yσɠƚԋσʂ ☆@lemmy.ml (OP) · 10 months ago
I find you can just run local models for that. For example, I've been using gpt4all with the Phind model, and it works reasonably well.
https://github.com/nomic-ai/gpt4all
https://huggingface.co/Phind/Phind-CodeLlama-34B-v2
driving_crooner · 10 months ago
How much computing power do they need? My PC is pretty old :/
☆ Yσɠƚԋσʂ ☆@lemmy.ml (OP) · 10 months ago
If you have a GPU, it should work, but it will likely take longer to generate answers than an online service.