Jibber AI with Docker Memory

To increase the accuracy of results, Jibber AI uses large trained models, which consume more memory than smaller, less accurate models.

In addition, if you increase the maximum text size setting to allow more text to be processed in a single request, that will also require more memory.

If you do not give enough memory to the Jibber AI containers, then you may get errors when making requests.

Typical out-of-memory symptoms for Jibber AI are:

  • A `Remote end closed connection without response` or similar error received by the client

  • A `[1] [WARNING] Worker with pid 7 was terminated due to signal 9` or similar error in the Docker container log
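Signal 9 is SIGKILL, which the Linux out-of-memory killer sends when a container exceeds its memory limit. To confirm, you can check the container's logs and its live memory usage (the container name `jibber-ai` below is a placeholder for your own container name):

```shell
# Show recent log output from the container (look for "signal 9" worker kills)
docker logs --tail 100 jibber-ai

# Show a one-off snapshot of memory usage against the container's limit
docker stats --no-stream jibber-ai
```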

If you see these types of errors, you should consider allocating more memory to your Docker containers.

We recommend allocating 5GB for each jibber-ai container. If you increase the `text.max.length` setting above 100000, you will need to increase the memory further. Refer to the Docker documentation for how to do this on your platform.
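For example, on most platforms the container memory limit can be set with the `-m`/`--memory` flag on `docker run`; the image tag and port mapping below mirror the example later in this document:

```shell
# Allocate 5GB of memory to the container, per the recommendation above
docker run -d -p 4000:8000 -m 5g jibberhub/jibber_extractor_en:1.0
```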

To reduce memory consumption, you can reduce the number of worker processes in the container. By default, 4 worker processes are running.

You can override this value when the container starts to reduce the number of workers.

docker run command

```shell
docker run -d -p 4000:8000 -m 3000m jibberhub/jibber_extractor_en:1.0 gunicorn -b 0.0.0.0:8000 --timeout 600 --workers 1 jibber_app:app
```

In the example above, the number of worker processes (concurrent requests handled) is reduced to 1.

Replace `1.0` with the version of Jibber AI you want to run.

NOTE: This will also reduce the number of concurrent requests that can be handled.

And here's an example for Docker Compose:

docker-compose.yml

```yaml
version: "3"

services:
  jibber-service:
    image: jibberhub/jibber_extractor_en:1.0
    environment:
      - TOKEN=my-license-token-from-jibber-ai
    command: "gunicorn -b 0.0.0.0:8000 --timeout 600 --workers 1 jibber_app:app"

  nginx-service:
    image: nginx:latest
    volumes:
      - ./nginx.conf:/etc/nginx/nginx.conf:ro
    depends_on:
      - jibber-service
    ports:
      - "4000:4000"
```

Replace `1.0` with the version of Jibber AI you want to run.

You also need to replace `my-license-token-from-jibber-ai` with your license key.
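The Compose example mounts an `nginx.conf` that is not shown above. As a rough illustration, it might look like the following, assuming nginx listens on port 4000 and proxies to the `jibber-service` container on port 8000; the timeout value and layout here are illustrative, not prescribed by Jibber AI:

```nginx
events {}

http {
  server {
    listen 4000;

    location / {
      # Forward requests to the Jibber AI container over the Compose network
      proxy_pass http://jibber-service:8000;
      # Match gunicorn's 600-second timeout so long extractions are not cut off
      proxy_read_timeout 600s;
    }
  }
}
```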