Dify Backend API

Usage

Important

In the v0.6.12 release, we deprecated pip as the package management tool for the Dify API Backend service and replaced it with poetry.
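
If Poetry is not installed yet, one common way to install it is the official installer (see the Poetry documentation for other supported methods):

    # install Poetry using the official install script
    curl -sSL https://install.python-poetry.org | python3 -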

  1. Start the docker-compose stack

    The backend requires some middleware, including PostgreSQL, Redis, and Weaviate, which can be started together using docker-compose.

    cd ../docker
    cp middleware.env.example middleware.env
    docker compose -f docker-compose.middleware.yaml -p dify up -d
    cd ../api
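
    To sanity-check that the middleware containers are up, you can list them by project name (depending on your Docker Compose version, you may need to pass the compose file again with -f):

    docker compose -p dify ps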
    
  2. Copy .env.example to .env
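
    For example, from the api directory:

    cp .env.example .env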

  3. Generate a SECRET_KEY in the .env file.

    On Linux:

    sed -i "/^SECRET_KEY=/c\SECRET_KEY=$(openssl rand -base64 42)" .env

    On macOS:

    secret_key=$(openssl rand -base64 42)
    sed -i '' "/^SECRET_KEY=/c\\
    SECRET_KEY=${secret_key}" .env
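
    To confirm the value was written, you can print it back as a quick sanity check:

    grep '^SECRET_KEY=' .env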
    
  4. Create the environment.

    The Dify API service uses Poetry to manage dependencies. You can execute poetry shell to activate the environment.
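
    For example:

    poetry shell    # activate the project's virtual environment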

  5. Install dependencies

    poetry env use 3.10
    poetry install
    

    In case a contributor forgot to update pyproject.toml with new dependencies, you can run the following commands instead.

    poetry shell                                          # activate the current environment
    poetry add $(cat requirements.txt)                    # install production dependencies and update pyproject.toml
    poetry add $(cat requirements-dev.txt) --group dev    # install development dependencies and update pyproject.toml
    
  6. Run the database migrations

    Before the first launch, migrate the database to the latest version.

    poetry run python -m flask db upgrade
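
    To see which revision the database is on afterwards, Flask-Migrate also provides a current command:

    poetry run python -m flask db current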
    
  7. Start backend

    poetry run python -m flask run --host 0.0.0.0 --port=5001 --debug
    
  8. Start the Dify web service.

  9. Set up your application by visiting http://localhost:3000...

  10. If you need to debug local async processing, please start the worker service.

    poetry run python -m celery -A app.celery worker -P gevent -c 1 --loglevel INFO -Q dataset,generation,mail,ops_trace,app_deletion

    The started celery app handles the async tasks, e.g. dataset importing and document indexing.

Testing

  1. Install dependencies for both the backend and the test environment

    poetry install --with dev
    
  2. Run the tests locally with the mocked system environment variables defined in the tool.pytest_env section of pyproject.toml

    cd ../
    poetry run -C api bash dev/pytest/pytest_all_tests.sh
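
    To run only a subset of tests while iterating, you can point pytest at a single directory (assuming, as in the current layout, the unit tests live under api/tests/unit_tests):

    cd api
    poetry run python -m pytest tests/unit_tests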