n8n-nodes-openai-langfuse
This project is proudly developed and maintained by Wistron DXLab.
⚡ Update: See the new n8n-nodes-ai-agent-langfuse project, an upgraded version with Agent integration and enhanced structured tracing support.
npm package: https://www.npmjs.com/package/n8n-nodes-openai-langfuse
Features
- Support for OpenAI-compatible chat models (e.g., gpt-4.1-mini, gpt-4o)
- Automatic Langfuse tracing for every request and response
- Custom metadata injection: sessionId, userId, and structured JSON
n8n is a fair-code licensed workflow automation platform.
- Installation
- Credentials
- Operations
- Compatibility
- Usage
- Resources
- Version history
Installation
Follow the installation guide in the official n8n documentation for community nodes.
Community Nodes (Recommended)
For n8n v0.187+, install directly from the UI:
- Go to Settings → Community Nodes
- Click Install
- Enter n8n-nodes-openai-langfuse in the Enter npm package name field
- Agree to the risks of using community nodes
- Select Install
Docker Installation (Recommended for Production)
A preconfigured Docker setup is available in the docker/ directory:
- Clone the repository and navigate to the docker/ directory

  ```bash
  git clone https://github.com/rorubyy/n8n-nodes-openai-langfuse.git
  cd n8n-nodes-openai-langfuse/docker
  ```

- Build the Docker image

  ```bash
  docker build -t n8n-openai-langfuse .
  ```

- Run the container

  ```bash
  docker run -it -p 5678:5678 n8n-openai-langfuse
  ```
You can now access n8n at http://localhost:5678
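If you want workflows and credentials to survive container restarts, mount a host directory as the n8n data folder. A minimal sketch, assuming the image keeps n8n's data in /home/node/.n8n (the default for images built on the official n8n image):

```bash
# Run the container with a persistent data directory
# Assumes n8n data lives in /home/node/.n8n inside the image
docker run -it \
  -p 5678:5678 \
  -v ~/.n8n:/home/node/.n8n \
  n8n-openai-langfuse
```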
Manual Installation
For a standard installation without Docker:
```bash
# Go to your n8n installation directory
cd ~/.n8n

# Install the node
npm install n8n-nodes-openai-langfuse

# Restart n8n to apply the node
n8n start
```
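To confirm the package is visible to n8n after installation, you can list it from the same directory (a quick check, assuming you installed into ~/.n8n as shown above):

```bash
# Verify the community node package is installed
cd ~/.n8n
npm list n8n-nodes-openai-langfuse
```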
Credentials
This credential is used to:
- Authenticate your OpenAI-compatible LLM endpoint
- Enable Langfuse tracing by sending structured request/response logs to your Langfuse instance
OpenAI Settings
| Field Name | Description | Example |
|---|---|---|
| OpenAI API Key | Your API key for accessing the OpenAI-compatible endpoint | sk-abc123... |
| OpenAI Organization ID | (Optional) Your OpenAI organization ID, if required | org-xyz789 |
| OpenAI Base URL | Full URL to your OpenAI-compatible endpoint | https://api.openai.com/v1 (default) |
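Before saving the credential, it can help to confirm that the API key and Base URL actually accept requests. A quick check with curl, using placeholder values for the key and model:

```bash
# Send a minimal chat completion request to the configured Base URL
# Replace the key, URL, and model with your own values
curl https://api.openai.com/v1/chat/completions \
  -H "Authorization: Bearer sk-abc123..." \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gpt-4o",
    "messages": [{"role": "user", "content": "ping"}]
  }'
```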
Langfuse Settings
| Field Name | Description | Example |
|---|---|---|
| Langfuse Base URL | The base URL of your Langfuse instance | https://cloud.langfuse.com or self-hosted URL |
| Langfuse Public Key * | Langfuse public key used for tracing authentication | pk-xxx |
| Langfuse Secret Key * | Langfuse secret key used for tracing authentication | sk-xxx |
🔑 How to find your Langfuse keys:
Log in to your Langfuse dashboard, then go to:
Settings → Projects → [Your Project] to retrieve publicKey and secretKey.
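You can also verify the keys outside of n8n: the Langfuse public API accepts HTTP basic auth with the public key as username and the secret key as password. A sketch, assuming the standard /api/public/traces endpoint on your instance:

```bash
# List recent traces to confirm the keys are valid
# Replace the base URL and keys with your own values
curl -u pk-xxx:sk-xxx \
  "https://cloud.langfuse.com/api/public/traces?limit=5"
```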
Credential UI Preview
Once filled out, your credential should look like this:

✅ After saving the credential, you're ready to use the node and see traces in your Langfuse dashboard.
Operations
This node lets you inject Langfuse-compatible metadata into your OpenAI requests.
You can trace every run with context such as sessionId, userId, and any custom metadata.
Supported Fields
| Field | Type | Description |
|---|---|---|
| sessionId | string | Logical session ID to group related runs |
| userId | string | ID representing the end user making the request |
| metadata | object | Custom JSON object with additional context (e.g., workflowId, env) |
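Because sessionId and userId are attached to every trace, related runs can later be filtered by these fields in the Langfuse UI, or via its public API. A hedged example, assuming your Langfuse version exposes userId and sessionId query parameters on /api/public/traces (placeholder IDs below):

```bash
# Fetch traces for a given user and session
curl -u pk-xxx:sk-xxx \
  "https://cloud.langfuse.com/api/public/traces?userId=test&sessionId=demo-session-1"
```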
🧪 Example Setup
| Input Field | Example Value |
|---|---|
| Session ID | {{$json.sessionId}} |
| User ID | test |
| Custom Metadata (JSON) | {"project": "test-project", "env": "dev", "workflow": "main-flow"} |
Visual Example
- Node Configuration UI: A sample configuration of the Langfuse Chat Node inside an n8n workflow.

- Workflow Setup: A typical workflow using this node.

- Langfuse Trace Output: How traces appear inside the Langfuse dashboard.

Compatibility
- Requires n8n version 1.0.0 or later
- Compatible with:
  - OpenAI official API (https://api.openai.com)
  - Any OpenAI-compatible LLM (e.g., via LiteLLM, LocalAI, Azure OpenAI)
  - Langfuse Cloud and self-hosted instances
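To use a non-OpenAI backend, only the OpenAI Base URL in the credential needs to change; the endpoint simply has to serve the OpenAI-style /v1/chat/completions route. For example, against a local OpenAI-compatible proxy such as LiteLLM (hypothetical port, key, and model name):

```bash
# Same request shape as the official API, pointed at a local proxy
# In this case, set the credential's OpenAI Base URL to http://localhost:4000/v1
curl http://localhost:4000/v1/chat/completions \
  -H "Authorization: Bearer sk-local-proxy-key" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gpt-4o",
    "messages": [{"role": "user", "content": "hello"}]
  }'
```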
Resources
Version History
- v1.0 – Initial release with OpenAI + Langfuse integration

