Dify Backend API

Usage

Important

In the v0.6.12 release, we deprecated pip as the package management tool for the Dify API backend service and replaced it with Poetry.

  1. Start the docker-compose stack

    The backend requires some middleware, including PostgreSQL, Redis, and Weaviate, which can be started together using docker-compose.

    cd ../docker
    cp middleware.env.example middleware.env
    docker compose -f docker-compose.middleware.yaml -p dify up -d
    cd ../api
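
    To confirm the middleware containers are up, you can list them; the name filter below is an optional check and assumes the default "dify-" container name prefix that comes from the -p dify project name.

    docker ps --filter "name=dify"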
    
  2. Copy .env.example to .env
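
    For example, from the api directory:

    cp .env.example .env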

  3. Generate a SECRET_KEY in the .env file.

    On Linux:

    sed -i "/^SECRET_KEY=/c\SECRET_KEY=$(openssl rand -base64 42)" .env

    On macOS (BSD sed requires an explicit suffix argument for -i, hence the empty string):

    secret_key=$(openssl rand -base64 42)
    sed -i '' "/^SECRET_KEY=/c\\
    SECRET_KEY=${secret_key}" .env
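
    To confirm the key was written, you can inspect the line afterwards (an optional check, not part of the original steps):

    grep "^SECRET_KEY=" .env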
    
  4. Create the environment.

    The Dify API service uses Poetry to manage dependencies. You can execute poetry shell to activate the environment.

  5. Install dependencies

    poetry env use 3.10
    poetry install
    

    If a contributor has added dependencies to requirements.txt without updating pyproject.toml, you can run the following instead.

    poetry shell                                          # activate the current environment
    poetry add $(cat requirements.txt)                    # install production dependencies and update pyproject.toml
    poetry add $(cat requirements-dev.txt) --group dev    # install development dependencies and update pyproject.toml
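
    To check which interpreter and virtualenv Poetry ended up using, poetry env info prints the environment details (an optional sanity check):

    poetry env info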
    
  6. Run database migrations

    Before the first launch, migrate the database to the latest version.

    poetry run python -m flask db upgrade
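
    If you later change the models and need a new migration script, the standard Flask-Migrate autogenerate command can be used; this is general Flask-Migrate usage rather than a documented Dify step, so review the generated script before applying it:

    poetry run python -m flask db migrate -m "describe your change"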
    
  7. Start backend

    poetry run python -m flask run --host 0.0.0.0 --port=5001 --debug
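
    Once it is running, any HTTP response from the port (even a 404) confirms the server is listening; the path below is arbitrary, not a documented endpoint:

    curl -i http://localhost:5001/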
    
  8. Start the Dify web service.

  9. Set up your application by visiting http://localhost:3000...

  10. If you need to debug local async processing, please start the worker service.

    poetry run python -m celery -A app.celery worker -P gevent -c 1 --loglevel INFO -Q dataset,generation,mail,ops_trace,app_deletion

    The Celery worker handles asynchronous tasks, e.g. dataset importing and document indexing.

Testing

  1. Install dependencies for both the backend and the test environment

    poetry install --with dev
    
  2. Run the tests locally, with mocked system environment variables defined in the tool.pytest_env section of pyproject.toml

    cd ../
    poetry run -C api bash dev/pytest/pytest_all_tests.sh
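
    To run only part of the suite, you can point pytest at a specific directory from within api/; the path below is an example and may not match the current test layout:

    cd api
    poetry run pytest tests/unit_tests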