Raahul Seshadri’s Post

Director, AI & Tech at WebEngage | I invent deep-tech differentiators

How many language models are there? Here’s an awesome visualisation and some fun facts. 👇

> It’s not just large language models (LLMs) that matter. Yes, they generalise well and can be prompt-engineered to do many things.

> However, LLMs tend to be heavy, in both resources and cost. It’s like using a flame-thrower to cook marshmallows.

> Small language models (SLMs) are equally important. Their smaller size means they won’t perform as well as LLMs, and they won’t hold as much “compressed knowledge” within them.

> However, smaller size also means they can be fine-tuned with far fewer resources, sometimes even on a home desktop, as long as you have a decent GPU.

> SLMs also take far fewer resources to run. I’ve run some on my laptop, for example.

> Y Combinator already has a section in its Request for Startups for specialised models that are resource-conscious and fast.

> LLMs can be leveraged to generate training data for smaller models. The SLM only needs to be big enough to model the “higher-order representation space” well.

The era of running far more intelligent models on your laptop is just around the corner. #generativeai #softwareengineering #machinelearning
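The distillation idea in that last point can be sketched in a few lines. This is a minimal, hypothetical Python sketch: `teacher_llm` is a stand-in for a call to any large model API (an assumption, not a real library), and the output is a list of (input, target) pairs you could feed to a supervised fine-tuning run for a small model.

```python
# Minimal sketch: use a large "teacher" model to label raw prompts,
# producing supervised pairs for fine-tuning a smaller model.

def teacher_llm(prompt: str) -> str:
    """Hypothetical stand-in: in practice, call a large model API here."""
    return f"answer to: {prompt}"

def build_distillation_dataset(raw_prompts):
    """Label each raw prompt with the teacher to get (input, target) pairs."""
    return [(p, teacher_llm(p)) for p in raw_prompts]

prompts = ["summarise this ticket", "classify this email"]
dataset = build_distillation_dataset(prompts)
# Each pair is now ready for supervised fine-tuning of an SLM.
```

The small model never needs the teacher's full "compressed knowledge"; it only has to fit the narrower task distribution the generated pairs cover.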
