SurpriZe@lemm.ee to Asklemmy@lemmy.ml · 2 months ago
What model do you use in your GPT4all?
I've just found out about GPT4All. Curious which model is best to run on my RTX 3080 + Ryzen 5 3600.
geneva_convenience@lemmy.ml · 2 months ago
Llama 3.1 8B; the other versions are too big to run on a GPU like that.
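A quick back-of-the-envelope check makes the recommendation above concrete: GPT4All ships models as quantized GGUF files (typically 4-bit), so the weights of an 8B-parameter model take roughly 4 GB, which fits in the RTX 3080's 10 GB of VRAM, while a 70B model at the same quantization needs roughly 35 GB. This is a rough sketch that counts weights only, ignoring KV-cache and runtime overhead:

```python
def quantized_size_gb(params: float, bits: int = 4) -> float:
    """Approximate weight size in GB for a model quantized to `bits` per parameter."""
    return params * bits / 8 / 1e9

RTX_3080_VRAM_GB = 10  # standard RTX 3080 has 10 GB of VRAM

for params, name in [(8e9, "Llama 3.1 8B"), (70e9, "Llama 3.1 70B")]:
    size = quantized_size_gb(params)
    verdict = "fits" if size < RTX_3080_VRAM_GB else "too big"
    print(f"{name}: ~{size:.0f} GB at 4-bit -> {verdict}")
```

By this estimate only the 8B variant leaves headroom for context and overhead on a 10 GB card, which matches the advice in the comment.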