Flowise AI: an open-source visual LLM workflow orchestration tool for building chatbots.
Its drag-and-drop UI lets you quickly assemble a chatbot with RAG (Retrieval-Augmented Generation) and tool calls.
What is Flowise?
Core positioning
Flowise is an open-source, visual, low-code/no-code platform that helps users build AI agents, chatbots, and other large language model (LLM)-based workflows by dragging and dropping nodes.
Project highlights and features
- Visual workflow building: a full drag-and-drop interface connects different modules (prompt templates, models, vector databases, tool interfaces, etc.) so you can assemble an AI workflow with ease
- Built on LangChain: uses LangChain (specifically LangChain.js) to execute flows behind the scenes, making complex logic visual and modular
- Multiple deployment methods: install via NPM, run with Docker, or build it yourself; supports on-premises and cloud deployment (AWS, GCP, Render, Railway, etc.)
- Wide range of applications: suitable for chat assistants, document Q&A, RAG (Retrieval-Augmented Generation) pipelines, multi-agent systems, and more
- Enterprise-friendly:
  - Multi-agent coordination: supports multi-agent collaboration, parallel tasks, and complex flow structures (loops, routing, hierarchies, etc.)
  - Human-in-the-loop (HITL): supports flows where a human reviews agent decisions and approves or rejects tool calls
  - Observability and tracing: provides full execution tracing and integrates with monitoring tools such as Prometheus and OpenTelemetry
  - Flow validation: automatically checks flow configuration to reduce configuration errors
Project architecture and installation
Repository structure (monorepo)
The Flowise repository contains multiple packages; the main structure is as follows:
- server: Node.js backend that provides the API endpoints
- ui: front-end interface built with React
- components: third-party integrations and node/function modules
- api-documentation: auto-generated Swagger API documentation
Installation and Usage Guide
Two common methods:
1. Install and run locally:
npm install -g flowise
npx flowise start
Then visit http://localhost:3000
2. Using Docker:
- With Docker Compose: copy .env.example to .env, adjust the configuration, then run docker compose up -d; open http://localhost:3000 in your browser
- With a single Docker image:
docker build -t flowise .
docker run -d -p 3000:3000 flowise
then likewise visit http://localhost:3000
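Once an instance is running locally, a chatflow can also be invoked programmatically over Flowise's REST Prediction API (POST /api/v1/prediction/&lt;chatflow-id&gt; with a JSON body containing the question). Below is a minimal Python sketch using only the standard library; the chatflow ID shown is a hypothetical placeholder, and you should copy the real one from the Flowise UI:

```python
import json
import urllib.request

BASE_URL = "http://localhost:3000"


def build_prediction_request(base_url: str, chatflow_id: str, question: str):
    """Build the URL and JSON payload for Flowise's Prediction API."""
    url = f"{base_url}/api/v1/prediction/{chatflow_id}"
    payload = json.dumps({"question": question}).encode("utf-8")
    return url, payload


def ask(base_url: str, chatflow_id: str, question: str) -> dict:
    """POST a question to a chatflow and return the parsed JSON response."""
    url, payload = build_prediction_request(base_url, chatflow_id, question)
    req = urllib.request.Request(
        url, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read().decode("utf-8"))


# Example (requires a running instance and a real chatflow ID):
#   answer = ask(BASE_URL, "my-flow-id", "What is Flowise?")
```

If your instance has API keys enabled, you would also need to send an Authorization header; the sketch above assumes an unauthenticated local setup.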
Developer mode
For development and debugging, run in sequence:
pnpm install
pnpm build
pnpm start
Development mode supports hot reloading via pnpm dev, which picks up changes to the front end (and back end) in real time.
Ecosystem and extensions
- Flowise SDK – Python: a Python SDK (the flowise package) lets you call Flowise chatflows through the API, supporting both streaming and non-streaming responses
- Rich integrations: includes a GitHub document loader that can load public or private repo content, with configurable recursion, concurrency, filtering, and more
- Community feedback: a developer on Reddit described it as an interesting open-source project, a "drag & drop UI to build your customized LLM flow using LangchainJS"
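In streaming mode, the Prediction API emits incremental events rather than one final JSON object. As an illustration only (not the SDK's actual internals), here is a sketch of parsing such an SSE-style stream with the standard library; the "data:" framing follows the server-sent events convention, but the exact event schema shown is an assumption, and the sample events are made up:

```python
import json


def parse_sse_data(lines):
    """Extract JSON payloads from 'data:' lines of an SSE stream.

    Assumes each payload is a JSON object like {"event": "token",
    "data": "..."} -- check your Flowise version's streaming docs,
    as this schema is an assumption for illustration.
    """
    events = []
    for line in lines:
        line = line.strip()
        if line.startswith("data:"):
            raw = line[len("data:"):].strip()
            if raw and raw != "[DONE]":
                events.append(json.loads(raw))
    return events


def collect_tokens(events):
    """Concatenate the text of token events into the final answer."""
    return "".join(e["data"] for e in events if e.get("event") == "token")


# Made-up sample stream for illustration:
sample = [
    'data: {"event": "token", "data": "Flowise "}',
    "",
    'data: {"event": "token", "data": "is visual."}',
    "",
    "data: [DONE]",
]
print(collect_tokens(parse_sse_data(sample)))  # -> Flowise is visual.
```

The Python SDK hides this plumbing behind a generator interface, so in practice you would iterate over its streamed chunks rather than parse the wire format yourself.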
Summary
| Aspect | Description |
|---|---|
| Essence | An open-source low/no-code platform for visually building AI workflows. |
| How you build | Drag and drop nodes to wire up LLMs, RAG, tools, and other components. |
| Underlying technology | Based on LangChain (specifically the JavaScript version). |
| Deployment options | Local installation, Docker containers, on-premises or cloud deployment. |
| Enterprise features | Multi-agent collaboration, human-in-the-loop review, flow monitoring and validation, etc. |
| Extensions | API, Python SDK, a wide range of integrations, document loaders, etc. |
Repository: https://github.com/FlowiseAI/Flowise
YouTube: