
Dify - Observability & Metrics for your LLM apps

Dify (GitHub) is an open-source LLM app development platform that is natively integrated with Langfuse. With the native integration, you can use Dify to quickly create complex LLM applications and then use Langfuse to monitor and improve them.

(Video: Demo of the Dify integration)

Setup

  1. Create a project in Langfuse and get the API credentials from the project settings.
  2. In Dify, navigate to the Monitoring settings of your Dify app.
  3. Add Langfuse via the Third-party LLMOps provider menu.
  4. Invoke the Dify application via the UI or API to start capturing traces and metrics in Langfuse (see the example request after this list).
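
Once the Langfuse credentials are saved in Dify, every request to the app is traced. As a minimal sketch, assuming a Dify chat app and its app-specific API key, a call to Dify's chat-messages endpoint could look like the following; DIFY_BASE_URL and DIFY_API_KEY are placeholders for your own deployment.

```python
# Sketch: invoke a Dify chat app via its HTTP API so the request is traced in Langfuse.
# Endpoint and payload follow Dify's chat-messages API; adjust for completion apps or workflows.
import os
import requests

DIFY_BASE_URL = os.environ.get("DIFY_BASE_URL", "https://api.dify.ai/v1")
DIFY_API_KEY = os.environ["DIFY_API_KEY"]  # app-specific API key from Dify

response = requests.post(
    f"{DIFY_BASE_URL}/chat-messages",
    headers={
        "Authorization": f"Bearer {DIFY_API_KEY}",
        "Content-Type": "application/json",
    },
    json={
        "inputs": {},
        "query": "What can you do?",
        "response_mode": "blocking",  # or "streaming"
        "user": "user-123",           # forwarded to Langfuse as userId
        "conversation_id": "",        # reuse an id to group turns into one session
    },
    timeout=60,
)
response.raise_for_status()
print(response.json()["answer"])
```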

Mapping of Dify to Langfuse

The integration automatically maps the following fields from Dify to Langfuse:

Dify                                  Langfuse
user                                  userId
message                               trace.id
conversation_id                       sessionId
                                      trace.name
type of application, type of model    tags
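
Because of this mapping, traces created by Dify can be filtered in Langfuse by the user and conversation they belong to. The sketch below lists such traces via Langfuse's public traces endpoint, assuming basic auth with the project's public/secret key pair; the host, user ID, and session ID are placeholders.

```python
# Sketch: fetch the traces Dify created in Langfuse, filtered by the mapped fields
# (userId from Dify's `user`, sessionId from Dify's `conversation_id`).
import os
import requests

LANGFUSE_HOST = os.environ.get("LANGFUSE_HOST", "https://cloud.langfuse.com")
auth = (os.environ["LANGFUSE_PUBLIC_KEY"], os.environ["LANGFUSE_SECRET_KEY"])

resp = requests.get(
    f"{LANGFUSE_HOST}/api/public/traces",
    params={"userId": "user-123", "sessionId": "<dify-conversation-id>"},
    auth=auth,
    timeout=30,
)
resp.raise_for_status()
for trace in resp.json()["data"]:
    print(trace["id"], trace["name"], trace.get("tags"))
```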

About Dify

Source: Dify Readme

Dify is an open-source LLM app development platform. Its intuitive interface combines AI workflow, RAG pipeline, agent capabilities, model management, observability features and more, letting you quickly go from prototype to production. Here's a list of the core features:

  1. Workflow: Build and test powerful AI workflows on a visual canvas, leveraging all the following features and beyond.

    (Video: Introducing Dify Workflow)
  2. Comprehensive model support: Seamless integration with hundreds of proprietary / open-source LLMs from dozens of inference providers and self-hosted solutions, covering GPT, Mistral, Llama3, and any OpenAI API-compatible models. A full list of supported model providers can be found here.

  3. Prompt IDE: Intuitive interface for crafting prompts, comparing model performance, and adding additional features such as text-to-speech to a chat-based app.

  4. RAG Pipeline: Extensive RAG capabilities that cover everything from document ingestion to retrieval, with out-of-box support for text extraction from PDFs, PPTs, and other common document formats.

  5. Agent capabilities: You can define agents based on LLM Function Calling or ReAct, and add pre-built or custom tools for the agent. Dify provides 50+ built-in tools for AI agents, such as Google Search, DALL·E, Stable Diffusion and WolframAlpha.

  6. LLMOps: Monitor and analyze application logs and performance over time. You can continuously improve prompts, datasets, and models based on production data and annotations.

  7. Backend-as-a-Service: All of Dify's offerings come with corresponding APIs, so you can effortlessly integrate Dify into your own business logic.
