Nick Vrana’s Post

Nick Vrana

I help build simple experiences for complex businesses.

Microsoft released a new paper today discussing Phi-3, their latest small language model, and I think many people are sleeping on the opportunities about to open up with locally run, generalist models. A small model, closer to the user, with the benefit of context from ALL of their data across many services, is going to outperform both specialist models and large (cloud-hosted) generalist models in many use cases.

There are glimpses of it in the research and leaks from Apple: MM1 is a model that can read and navigate through apps so you don't have to, and there are never-ending rumors of a local language model to make a future Siri lower-latency and smarter. Google is shipping micro and mini Gemini models on devices and giving developers the tools to build LoRAs to shape their behavior. It will come to desktops too, with Microsoft pushing chip manufacturers to put AI accelerators in all next-generation hardware.

The cloud isn't going anywhere, but things suddenly look much brighter for local-first software.

Carlos Anaya

Independent AI/ML researcher and operator; experienced cloud engineer. Experienced in financial, infrastructure, cloud, start-up, MSP, and federal business environments.

4mo

This has already been happening for a few months now. It's just that laptop and phone vendors aren't including it in their software suites yet. But you can set up LM Studio or Ollama, download the LLMs, MLMs, or SLMs of your choice, and run them anywhere for your own use on your device.
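For anyone who wants to try this, here is a minimal sketch in Python of querying a locally running Ollama server over its default HTTP API. It assumes you have already pulled a model (e.g. "ollama pull phi3") and that the Ollama daemon is listening on its default port 11434; the model tag and the prompt are placeholders, not anything from the post.

# Minimal sketch: query a locally running Ollama server over its HTTP API.
# Assumes "ollama pull phi3" has been run and the Ollama daemon is listening
# on its default port (11434). The model tag "phi3" is an assumption; swap in
# whatever model you actually pulled.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"

def generate(prompt: str, model: str = "phi3") -> str:
    """Send a prompt to the local model and return the full response text."""
    payload = json.dumps({
        "model": model,
        "prompt": prompt,
        "stream": False,  # ask for one complete JSON reply instead of a stream
    }).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    # Everything here runs on-device: no cloud round trip, no API key.
    print(generate("Explain in one sentence why small local models matter."))

The point of the sketch is that once the model is local, inference is just a localhost call: any script or app on the machine can use it without network access or credentials.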

