Lee Duna@lemmy.nz to Technology@lemmy.world · English · 10 months ago
Want a more private ChatGPT alternative that runs offline? Check out Jan (bgr.com)
Local LLMs can beat GPT 3.5 now.

🇸🇵🇪🇨🇺🇱🇦🇹🇪🇷@lemmy.world · 10 months ago
I think a good 13B model running on 12GB of VRAM can do pretty well. But I’d be hard pressed to believe anything under 33B would beat 3.5.

miss_brainfart@lemmy.ml · 10 months ago (edited)
Asking as someone who doesn’t know anything about any of this: does more B mean better?

alphafalcon@feddit.de · 10 months ago
B stands for Billion (Parameters), IIRC.

june@lemmy.world · 10 months ago
3.5 fuckin sucks though. That’s a pretty low bar to set, imo.
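As a rough sanity check on the "13B model on 12GB of VRAM" claim above, here is a back-of-the-envelope VRAM estimate. This is a hypothetical sketch, not from the thread: it assumes 4-bit quantization (a common setup for local models) and a ballpark ~20% overhead for the KV cache and activations; the function name and overhead factor are illustrative assumptions.

```python
def model_vram_gb(params_billion: float, bits_per_param: float,
                  overhead: float = 1.2) -> float:
    """Rough VRAM estimate in GB: parameter storage plus ~20% overhead
    (KV cache, activations). Purely a back-of-the-envelope figure."""
    bytes_per_param = bits_per_param / 8
    return params_billion * 1e9 * bytes_per_param * overhead / 1e9

# A 13B model quantized to 4 bits per parameter:
print(round(model_vram_gb(13, 4), 1))   # ~7.8 GB -> fits in 12 GB of VRAM

# The same 13B model at fp16 (16 bits per parameter):
print(round(model_vram_gb(13, 16), 1))  # ~31.2 GB -> does not fit
```

This is why quantization is what makes 13B-class models practical on a single 12GB consumer GPU, while the 33B-class models mentioned above generally need heavier quantization, more VRAM, or CPU offloading.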