Langfuse in the Enterprise
Langfuse addresses key challenges when deploying LLM-based applications within an enterprise. As an open-source project, Langfuse can be self-hosted, which makes it well suited to address data security and privacy concerns. This document answers common questions about using Langfuse in an enterprise setting.
Langfuse is licensed as an open-core project. Its core product (tracing, observability, evals, prompt management, and API/SDK endpoints) is MIT-licensed and freely available, including for commercial use without limitation. Some Langfuse features on the periphery of the core product are not available in the open-source version and cannot be used out of the box. As of today, these are either a) quality-of-life features (such as the LLM-as-a-judge service and prompt playground) or b) security and compliance features (e.g. SSO enforcement, data retention).
Please refer to our feature overview and the Enterprise Edition FAQ here. To discuss an enterprise license (self-hosted or cloud) for your team, reach out to enterprise@langfuse.com.
Select Resources
- Overview: Slide Deck Langfuse (October 2024)
- Why Langfuse?
- Langfuse has twice been included in the Thoughtworks Tech Radar
- Enterprise Edition (EE) FAQ
- Langfuse can be procured through the AWS Marketplace.
Talk to us
Reach out via email (enterprise@langfuse.com) or talk to us to discuss your specific needs and requirements.
Introduction to Langfuse
10 minute demo of all Langfuse features
Langfuse features along the development lifecycle
Resources
- Product
- Scope of LLM Engineering Platform
- Positioning of Langfuse
- Technical documentation with detailed information on all features and integrations
- 10 minute demo of all Langfuse features
- Interactive demo of the Langfuse platform
- Changelog
- Roadmap
- Traces as the core of the LLMOps workflow (webinar + slides)
- Langfuse has been included in the Thoughtworks Tech Radar
- Langfuse Cloud
- Cloud Pricing
- Security and Compliance - includes information on GDPR, SOC2 and ISO27001 compliance
- Self-hosting
- Deployment Guide
- Open Source Licensing
- Comparison of Open Source, Pro, and Enterprise version
- Langfuse Self-Hosting: EE Terms and Conditions
Resellers and managed Langfuse deployments
We partner with Shakudo, who operates self-hosted Langfuse Enterprise deployments on behalf of customers in their VPC. We are happy to introduce prospects to our contact at Shakudo for more information on managed self-hosting.
FAQ
We collect the most common questions and answers here. If you have questions that are not answered, please reach out to us: enterprise@langfuse.com
What deployment options are available for Langfuse?
- Managed Cloud (cloud.langfuse.com), see Pricing and Security page for details.
- Self-hosted on your own infrastructure. Contact us if you are interested in additional support.
What is the difference between Langfuse Cloud and the open-source version?
The Langfuse team provides Langfuse Cloud as a managed solution to simplify the initial setup of Langfuse and to minimize the operational overhead of maintaining high availability in production. You can choose to self-host Langfuse on your own infrastructure.
Some features are not available in the open-source version. Please refer to the overview here.
How do authentication and RBAC work in Langfuse?
Langfuse offers a set of prebuilt roles that apply at the organization and project level to restrict access (RBAC documentation).
If needed, environments (production, staging, development) can be separated into different projects in Langfuse to restrict access to production/sensitive data while making it easier to share development environments with the team and other stakeholders.
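As a sketch of this pattern, environment separation can be enforced on the application side by giving each environment its own Langfuse project and selecting the matching API key pair at startup. The helper, environment variable, and key values below are hypothetical placeholders, not real Langfuse credentials:

```python
import os

# Hypothetical mapping: one Langfuse project (and thus one key pair) per
# environment, so access to production data stays restricted to the
# production project. Key values are placeholders.
PROJECT_KEYS = {
    "production": {"public_key": "pk-lf-prod-xxx", "secret_key": "sk-lf-prod-xxx"},
    "staging": {"public_key": "pk-lf-stg-xxx", "secret_key": "sk-lf-stg-xxx"},
    "development": {"public_key": "pk-lf-dev-xxx", "secret_key": "sk-lf-dev-xxx"},
}


def keys_for_environment(env=None):
    """Return the Langfuse key pair for the given environment.

    Falls back to the (hypothetical) APP_ENV variable, then to
    'development', so local runs never touch production data.
    """
    env = env or os.environ.get("APP_ENV") or "development"
    if env not in PROJECT_KEYS:
        raise ValueError(f"Unknown environment: {env}")
    return PROJECT_KEYS[env]
```

The returned key pair would then be passed to the Langfuse SDK client when it is initialized.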
SSO with Langfuse is simple. Currently Google, GitHub, Azure AD, Okta, Auth0, and AWS Cognito are supported. We can easily add additional providers based on your requirements. As an enterprise customer, you can also enforce SSO for your organization.
What is the easiest way to try Langfuse?
The Hobby Plan on Langfuse Cloud includes enough resources to try Langfuse for free in a non-production environment, no credit card required.
Alternatively, you can quickly spin up Langfuse on your own machine using `docker compose up` (docs).
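A minimal local setup might look like the following, assuming Docker and Git are installed; refer to the linked self-hosting docs for the authoritative steps:

```shell
# Clone the Langfuse repository and start all services locally
git clone https://github.com/langfuse/langfuse.git
cd langfuse

# Starts the Langfuse server and its dependencies;
# the UI is then available at http://localhost:3000
docker compose up
```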
If you require security and compliance features to run a POC, please reach out to us at enterprise@langfuse.com.
Common Enterprise LLM Platform Architecture
Langfuse aims to address the challenges of debugging, monitoring, and continuously improving LLM-based applications. It focuses on observability, evaluation, and prompt management.
Langfuse is often deployed alongside a central LLM Gateway that provides schema translation, rate limiting, and PII redaction. The LLM Gateway can be an internal service or an open-source project like LiteLLM. If you use LiteLLM, you can leverage the native integration (docs).
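As an illustration, the LiteLLM proxy can forward traces to Langfuse through a success callback in its configuration file. The snippet below is a sketch of that setup; verify the exact keys and environment variable names against the linked LiteLLM integration docs:

```yaml
# LiteLLM proxy config.yaml (sketch): log successful LLM calls to Langfuse
litellm_settings:
  success_callback: ["langfuse"]

# Langfuse credentials are read from environment variables, e.g.:
#   LANGFUSE_PUBLIC_KEY, LANGFUSE_SECRET_KEY
#   LANGFUSE_HOST (for self-hosted deployments)
```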