Langfuse (YC W23)

Software Development

Open Source LLM Engineering Platform

About us

Langfuse is the open source LLM Engineering Platform. It provides observability, metrics, evals, prompt management, and a playground to debug and improve LLM apps. Langfuse is open: it works with any model and any framework, allows for complex nesting, and has open APIs to build downstream use cases. Demo: langfuse.com/docs Docs: langfuse.com/docs Github: github.com/langfuse/langfuse

Website
https://langfuse.com
Industry
Software Development
Company size
2-10 employees
Headquarters
San Francisco
Type
Privately Held
Founded
2022
Specialties
Langfuse, Large Language Models, Observability, Prompt Management, Evaluations, Testing, and Open Source

Locations

Employees at Langfuse (YC W23)

Updates

  • Langfuse (YC W23) reposted this

    View profile for Tom Yeh

    CS Prof | AI by Hand ✍️ | CU Boulder

    RAG & Langfuse: Tracing by hand ✍️ I am excited about the webinar tomorrow on RAG evaluation. One of the topics will be Langfuse. Langfuse (YC W23) is arguably the most popular open source tool for collecting logs and traces from RAG applications (5.4K stars, 10K active teams), recently reaching 600K weekly SDK installs, and is now backed by a $4 million seed round. It is definitely growing fast and making an impact! Why are RAG traces important? Because you need traces to calculate evaluation metrics, find bugs, and improve the performance of your RAG applications. Thanks Marc Klingen, co-founder of Langfuse, for speaking with me to help me learn about their awesome tool and hear his vision for Langfuse. Hope to see you at the webinar, where I will share more content like this! #rag #evaluation #aibyhand

  • View organization page for Langfuse (YC W23)

    3,511 followers

    🪢 Multimodality landing in Langfuse. 🖼️ Images are now supported across Langfuse with Video, Audio and PDF support coming soon. 👇 More information in the comments

    View profile for Marlies Mayerhofer

    Founding Engineer Langfuse | CS Imperial

    Introducing Multi-Modal Support in Langfuse (YC W23) Tracing 💡 Comprehensive Analysis: Log and visualize text and image data as part of traces and observations for a richer, more detailed understanding of system behavior 🔍 Improved Context: Provides enhanced context for debugging and human-in-the-loop annotation We’re exploring building support across modalities — We’d love your feedback!

  • Langfuse (YC W23) reposted this

    View organization page for Milvus

    4,001 followers

    📚 Discover how to use the LlamaIndex Langfuse (YC W23) integration with our simple demo! This guide demonstrates how to store documents and queries with Milvus and uses Langfuse to measure the retrieval quality. ✨ Langfuse offers a simple integration for automatically capturing traces and metrics generated in LlamaIndex applications. 🔗 Get started: https://lnkd.in/gbpDhu-D #AI #Milvus #LangFuse #Tech

  • Langfuse (YC W23) reposted this

    View profile for Emre Gucer

    Co-Founder and CEO at Fume (YC W24)

    Fume built a new feature for the Langfuse (YC W23) integration in the LiteLLM (YC W23) repository! We use both of these awesome tools and are really happy with them. We needed a feature recently and realized both are open source. So... why not let Fume (YC W24) build it for us? So many engineering teams are forced to switch context for minor features requested by product teams. From now on, Fume can automate those small improvements. Go to fumedev.com and get started in under 30 minutes! 🚅🌪️🪢

  • Langfuse (YC W23) reposted this

    🪢 We're running Langfuse (YC W23) on Porter, and they're a company we admire (especially for their amazing technical support). 🏃 Stoked that Porter users can now deploy Langfuse to their cluster as an add-on. Likely one of the easiest ways to deploy & scale Langfuse. 🙏 Thank you Justin Rhee & team; it's already been so fun working together, and we're excited about what we can do in the future!

    View organization page for Porter

    2,522 followers

    📊 New Add-on: Langfuse (YC W23), which provides product analytics for LLM apps, is now available as an add-on! 🤑 Spot Instances: When selecting node groups to deploy onto AWS, spot instances are now an option (including GPU instance types). 💬 Slack Notifications Filtering: Notifications via Slack can now be filtered on an application level. 🔐 Enhanced Security: All Porter-managed infrastructure can now utilize an advanced service mesh. 🖥 Customizable GPU Resources: Users can now tailor their GKE clusters with flexible GPU options. More details in the changelog here 👉 https://lnkd.in/gXtwuchG

  • View organization page for Langfuse (YC W23)

    3,511 followers

    ⚡️ OpenAI Structured Outputs now supported in Langfuse (YC W23)

    ❓ What are structured outputs in LLMs?
    🧮 Structured outputs ensure that LLM responses consistently stick to a JSON schema. This means you can rely on model outputs always following the same format (e.g. a number between 0 and 10, or one out of four categories). Structured outputs are a boon to the reliability of LLM apps, minimizing LLMs' notorious potential for going off the rails.

    Using structured outputs, developers can build apps that:
    • reliably extract structured data
    • fetch data & answer questions via function calling
    • create reliable multi-step workflows for LLMs to take actions

    💡 In Langfuse you can easily track your OpenAI structured (and unstructured) outputs to improve your LLM application.
    👇 Have a look at this cookbook to see how to trace structured outputs & supplied JSON schemas in Langfuse (in comments)
    😱 Shoutout to Hassieb Pakzad for tweaking our OpenAI SDKs today
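    The guarantee described above can be sketched in plain Python. This is an illustrative validator, not the OpenAI API itself: the schema (an integer score from 0 to 10 and one of four hypothetical categories) mirrors the examples in the post, and with structured outputs the model is constrained to emit JSON of exactly this shape, so a check like this would never fail.

```python
import json

# Hypothetical schema mirroring the post's examples: a score between 0 and 10
# and one out of four categories. The category names are made up for the sketch.
ALLOWED_CATEGORIES = {"bug", "feature", "question", "other"}

def validate_response(raw: str) -> dict:
    """Parse a model reply and enforce the schema described above."""
    data = json.loads(raw)
    if not (isinstance(data.get("score"), int) and 0 <= data["score"] <= 10):
        raise ValueError("score must be an integer between 0 and 10")
    if data.get("category") not in ALLOWED_CATEGORIES:
        raise ValueError("category must be one of the four allowed values")
    return data

# A schema-conforming structured response parses cleanly:
print(validate_response('{"score": 7, "category": "feature"}'))

# An unconstrained reply that drifts off-schema is caught:
try:
    validate_response('{"score": 42, "category": "rant"}')
except ValueError as err:
    print("rejected:", err)
```

    With native structured outputs, this kind of defensive validation moves from your application code into the model API itself.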

  • View organization page for Langfuse (YC W23)

    3,511 followers

    🤖🔍 AI Agent Observability! Everyone is talking about AI agents. You can easily set them up with no-code builders like FlowiseAI (YC S23), Dify, or Langflow, or use application frameworks like LangGraph (LangChain) or Llama Agents (LlamaIndex).

    ❓ But what is AI Agent Observability? It makes your agents production-ready by tracking and analyzing their performance and interactions. Langfuse (YC W23) provides real-time insights into metrics like latency, cost, and error rates, allowing developers to debug, optimize, and enhance their systems.

    💡 Why it matters:
    • Debugging & edge cases: Observability helps trace agent steps and test edge cases, which is crucial for resolving issues.
    • Balancing accuracy & costs: Monitoring model usage helps strike a balance between accuracy and costs.
    • Understanding user interactions: Capturing user interaction data helps refine applications for better outputs.

    👇 Link to article in comments if you want to learn more about AI Agent Observability

  • View organization page for Langfuse (YC W23)

    3,511 followers

    🚀 Exciting News! We are proud to share that Langfuse (YC W23) has been recognized as one of Germany's most promising AI startups by WirtschaftsWoche, one of Germany's leading business news publishers! 🌟 ⚡️ Standing out among 687 AI startups in Germany is an honor, as is being included among amazing companies like Tacto, Langdock, Parloa and Helsing. 🪢 A huge shoutout to our dedicated team, incredible users, and our open source community. 🔗 The full article is linked in the comments

Similar pages

Funding