There’s a pretty significant difference between good AI models and great ones. I sometimes think about how I could integrate local models directly into the Micro.blog apps, but I wouldn’t want it to be worse than using (for example) OpenAI. How many of my users really have a Mac with 24+ GB of RAM?

Dave Winer

i haven’t been able to discern differences. i use three: chatgpt, claude.ai, and gemini (google’s system, I got a free year with my phone). they all seem about the same, the features are different, but the quality of the AI isn’t discernible, at least to me.

Manton Reece

@dave Yeah, I’m thinking more like if you download a small open source model to your computer and run it yourself, it will be worse than ChatGPT, Claude, etc. So this limits the practical use of embedding a model directly in an app.
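To make that concrete, here’s a rough sketch of what embedding a small local model could look like, using llama-cpp-python as one example (the model file, settings, and prompt are just placeholders, not what the Micro.blog apps actually ship):

```python
# Rough sketch: running a small quantized model locally with llama-cpp-python.
# The model path is a placeholder; any small GGUF model would work the same way.
from llama_cpp import Llama

llm = Llama(
    model_path="./models/small-instruct-model.Q4_K_M.gguf",  # placeholder file
    n_ctx=2048,        # modest context window to keep memory use down
    n_gpu_layers=-1,   # offload to the GPU / Apple Silicon if available
)

response = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Summarize this post in one sentence: ..."}],
    max_tokens=100,
)

print(response["choices"][0]["message"]["content"])
```

No API key and no network round trip, which is appealing, but a model small enough to run in a few GB of RAM is exactly where the quality gap shows up.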

jack

definitely not me haha!

Have you looked at the Gemini models? Might be slightly quicker/cheaper to use 2.0 Flash (or similar) for the alt text stuff!
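Something like this is roughly what I mean, a sketch using the google-generativeai Python package (the API key, model name, and prompt are illustrative only, not whatever Micro.blog uses today):

```python
# Rough sketch: asking Gemini 2.0 Flash for alt text via google-generativeai.
# API key, model name, and prompt are placeholders.
import google.generativeai as genai
from PIL import Image

genai.configure(api_key="YOUR_GEMINI_API_KEY")  # placeholder key
model = genai.GenerativeModel("gemini-2.0-flash")

photo = Image.open("photo.jpg")  # the uploaded image to describe
response = model.generate_content(
    ["Write a short, descriptive alt text for this image.", photo]
)

print(response.text)
```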

Pedro Corá

What? Are there computers with less than 32 GB of RAM?! 🤪

Manton Reece

@j4ck I haven’t looked closely at Gemini. Overall I’m happy with OpenAI, but I love to experiment with different things.
