.vscode f62f71a81a build: initial support for poetry build tool (#4513) 10 months ago
configs 9d5a89eab6 feat: add log date timezone (#4623) 10 months ago
constants 6ccde0452a feat: Added hindi translation i18n (#5240) 10 months ago
controllers c923684edd chore: extract retrival method literal values into enum (#5060) 10 months ago
core c923684edd chore: extract retrival method literal values into enum (#5060) 10 months ago
docker 5f0ce5811a feat: add `flask upgrade-db` command for running db upgrade with redis lock (#5333) 10 months ago
events d160d1ed02 feat: support opensearch approximate k-NN (#5322) 10 months ago
extensions 3cc6093e4b feat: introduce pydantic-settings for config definition and validation (#5202) 10 months ago
fields 43c19007e0 fix: workspace member's last_active should be last_active_time, but not last_login_time (#4906) 10 months ago
libs 7305713b97 fix: allow special characters in email (#5327) 10 months ago
migrations 3cc6093e4b feat: introduce pydantic-settings for config definition and validation (#5202) 10 months ago
models c923684edd chore: extract retrival method literal values into enum (#5060) 10 months ago
schedule 6c4e6bf1d6 Feat/dify rag (#2528) 1 year ago
services c923684edd chore: extract retrival method literal values into enum (#5060) 10 months ago
tasks ba5f8afaa8 Feat/firecrawl data source (#5232) 10 months ago
templates 3d92784bd4 fix: email template style (#1914) 1 year ago
tests 3cc6093e4b feat: introduce pydantic-settings for config definition and validation (#5202) 10 months ago
.dockerignore 220f7c81e9 build: fix .dockerignore file (#800) 1 year ago
.env.example 147a39b984 feat: support tencent cos storage (#5297) 10 months ago
Dockerfile 55fc46c707 improvement: speed up dependency installation in docker image rebuilds by mounting cache layer (#3218) 1 year ago
README.md bdf3ea4369 docs(api/README): Remove unnecessary `=` (#5380) 10 months ago
app.py 9d5a89eab6 feat: add log date timezone (#4623) 10 months ago
commands.py d160d1ed02 feat: support opensearch approximate k-NN (#5322) 10 months ago
config.py 3cc6093e4b feat: introduce pydantic-settings for config definition and validation (#5202) 10 months ago
poetry.lock bb33ffc332 feat: initial support for Milvus 2.4.x (#3795) 10 months ago
poetry.toml f62f71a81a build: initial support for poetry build tool (#4513) 10 months ago
pyproject.toml bb33ffc332 feat: initial support for Milvus 2.4.x (#3795) 10 months ago
requirements-dev.txt 23498883d4 chore: skip explicit installing jinja2 as testing dependency (#4845) 10 months ago
requirements.txt bb33ffc332 feat: initial support for Milvus 2.4.x (#3795) 10 months ago

README.md

Dify Backend API

Usage

  1. Start the docker-compose stack

The backend requires some middleware, including PostgreSQL, Redis, and Weaviate, which can be started together using docker-compose.

   cd ../docker
   docker-compose -f docker-compose.middleware.yaml -p dify up -d
   cd ../api
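
You can confirm the middleware containers are up before continuing; containers started with the project name above are prefixed with dify, so for example:

   docker ps --filter "name=dify"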
  2. Copy .env.example to .env
  3. Generate a SECRET_KEY in the .env file.
   sed -i "/^SECRET_KEY=/c\SECRET_KEY=$(openssl rand -base64 42)" .env
   secret_key=$(openssl rand -base64 42)
   sed -i '' "/^SECRET_KEY=/c\\
   SECRET_KEY=${secret_key}" .env
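
You can verify that the key was written, for example by printing the line back:

   grep '^SECRET_KEY=' .env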
  4. Create environment.

The Dify API service uses Poetry to manage dependencies. You can execute poetry shell to activate the environment.

Instructions for using pip can be found below.

  5. Install dependencies
   poetry env use 3.10
   poetry install

If contributors have forgotten to update the dependencies in pyproject.toml, you can run the following shell commands instead.

   poetry shell                                         # activate the current environment
   poetry add $(cat requirements.txt)                   # install production dependencies and update pyproject.toml
   poetry add $(cat requirements-dev.txt) --group dev   # install development dependencies and update pyproject.toml
  6. Run migrate

Before the first launch, migrate the database to the latest version.

   poetry run python -m flask db upgrade
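
To check which migration revision the database is now at, Flask-Migrate also provides a current command that can be run the same way:

   poetry run python -m flask db current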
  7. Start backend
   poetry run python -m flask run --host 0.0.0.0 --port=5001 --debug
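
Once the server is listening, a quick request can confirm that the API responds; the console setup endpoint referenced later in this README is one option:

   curl http://localhost:5001/console/api/setup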
  8. Start the Dify web service.
  9. Set up your application by visiting http://localhost:3000...
  10. If you need to debug local async processing, please start the worker service.
   poetry run python -m celery -A app.celery worker -P gevent -c 1 --loglevel INFO -Q dataset,generation,mail

The started Celery app handles the async tasks, e.g. dataset importing and document indexing.
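
If you want to check that the worker attached to the expected queues, Celery's inspect command can list them:

   poetry run python -m celery -A app.celery inspect active_queues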

Testing

  1. Install dependencies for both the backend and the test environment
   poetry install --with dev
  2. Run the tests locally with mocked system environment variables in the tool.pytest_env section of pyproject.toml
   cd ../
   poetry run -C api bash dev/pytest/pytest_all_tests.sh
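
During development you can also invoke pytest directly to run only part of the suite; the path below is just an example and assumes the default layout under api/tests:

   poetry run -C api python -m pytest tests/unit_tests   # example path; adjust to the tests you need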

Usage with pip

[!NOTE]
In the next version, we will deprecate pip as the primary package management tool for the Dify API service. Currently, Poetry and pip coexist.

  1. Start the docker-compose stack

The backend requires some middleware, including PostgreSQL, Redis, and Weaviate, which can be started together using docker-compose.

   cd ../docker
   docker-compose -f docker-compose.middleware.yaml -p dify up -d
   cd ../api
  2. Copy .env.example to .env
  3. Generate a SECRET_KEY in the .env file.
   sed -i "/^SECRET_KEY=/c\SECRET_KEY=$(openssl rand -base64 42)" .env
  4. Create environment.

If you use Anaconda, create a new environment and activate it:

   conda create --name dify python=3.10
   conda activate dify
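
If you do not use Anaconda, a plain virtual environment should also work (a minimal sketch using the built-in venv module):

   python3.10 -m venv venv          # any Python 3.10 interpreter will do
   source venv/bin/activate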
  5. Install dependencies
   pip install -r requirements.txt
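
If you also plan to run the tests with pip, the development dependencies in requirements-dev.txt can be installed the same way:

   pip install -r requirements-dev.txt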
  6. Run migrate

Before the first launch, migrate the database to the latest version.

   flask db upgrade
  7. Start backend:
   flask run --host 0.0.0.0 --port=5001 --debug
  8. Set up your application by visiting http://localhost:5001/console/api/setup or other APIs...
  9. If you need to debug local async processing, please start the worker service.
   celery -A app.celery worker -P gevent -c 1 --loglevel INFO -Q dataset,generation,mail

The started Celery app handles the async tasks, e.g. dataset importing and document indexing.