☆ Yσɠƚԋσʂ ☆@lemmygrad.ml (OP)
    18 days ago

    I’ve found Qwen is overall similar; their smaller model, which you can run locally, tends to produce somewhat better output in my experience. Another recent open source model that’s good at coding is GLM: https://z.ai/blog/glm-4.5

    6 GB of VRAM is unfortunately somewhat low; you can run smaller models, but the quality of the output is not amazing.
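
    For what it’s worth, here’s a minimal sketch of what running a small quantized model on a card like that might look like, assuming llama-cpp-python and a GGUF quant you’ve already downloaded (the model filename and layer count below are placeholders, not specific recommendations):

    ```python
    # Sketch: run a small quantized model on a ~6 GB GPU by offloading only
    # part of the layers to VRAM and keeping the rest in system RAM.
    from llama_cpp import Llama

    llm = Llama(
        model_path="qwen2.5-coder-7b-instruct-q4_k_m.gguf",  # placeholder GGUF file
        n_ctx=4096,       # context window; larger contexts need more memory
        n_gpu_layers=20,  # placeholder: tune down if you hit out-of-memory errors
    )

    out = llm.create_chat_completion(
        messages=[{"role": "user", "content": "Write a Python function that reverses a string."}],
        max_tokens=256,
    )
    print(out["choices"][0]["message"]["content"])
    ```

    The main knob is `n_gpu_layers`: lower it until the model fits in VRAM, at the cost of slower generation for the layers left on the CPU.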