Django, PostgreSQL, Docker and Jasper Reports.
I was working on a small project aimed at simplifying the creation and tracking of various types of office documents. The project needed a database of all documents entered, along with their associated due dates. Because these documents relate to both clients and suppliers, the system had to handle both sides of the business.
My idea was to use Django as the backend framework and deploy the application using Docker. It was a fun experience overall. Initially, my project was structured like this:
businesshub/
|_ accounts
|_ anagrafiche
|_ businesshub
|_ core
|_ documents
|_ templates
.env
.env.prod
docker-compose.yml
Dockerfile
entrypoint.sh
manage.py
requirements.txt
My Dockerfile was as follows:
# Use a base image with Python
FROM python:3.11-slim
# Environment variables to prevent bytecode generation and enable unbuffered output
ENV PYTHONDONTWRITEBYTECODE=1
ENV PYTHONUNBUFFERED=1
# Set the working directory inside the container
WORKDIR /code
# Install system dependencies needed for the project
# (the base image already ships Python and pip, so apt's python3/python3-pip are not needed)
RUN apt-get update && \
    apt-get install -y --no-install-recommends \
        netcat-openbsd wget build-essential python3-dev \
        libffi-dev libpango1.0-0 libpangocairo-1.0-0 libcairo2 libjpeg-dev \
        zlib1g-dev libxml2 libxslt1.1 libgdk-pixbuf2.0-0 unzip && \
    rm -rf /var/lib/apt/lists/*
# Upgrade pip and install the Python dependencies from requirements.txt
RUN pip3 install --upgrade pip
COPY requirements.txt .
RUN pip3 install -r requirements.txt
# Copy the project files into the container
COPY . .
# Set executable permissions for the entrypoint script
RUN chmod +x /code/entrypoint.sh
# Command to run when the container starts
ENTRYPOINT ["/bin/sh", "/code/entrypoint.sh"]
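One refinement worth noting: since the Dockerfile runs COPY . ., everything in the project directory ends up in the image, including .env files. A .dockerignore keeps secrets and clutter out of the build context (a minimal sketch; these entries are my suggestions, not from the original project):

```text
.git
__pycache__/
*.pyc
.env
.env.prod
```

Compose still reads env_file: .env from the host, so excluding it here only keeps it out of the baked image.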
My docker-compose.yml looked like this:
services:
  web:
    build: .
    command: /code/entrypoint.sh
    volumes:
      - .:/code
      - static_volume:/code/staticfiles
    ports:
      - '8000:8000'
    env_file: .env
    depends_on:
      - db
    environment:
      - DB_HOST=db

  db:
    image: postgres:15
    volumes:
      - postgres_data:/var/lib/postgresql/data
    env_file: .env
    environment:
      POSTGRES_DB: ${POSTGRES_DB}
      POSTGRES_USER: ${POSTGRES_USER}
      POSTGRES_PASSWORD: ${POSTGRES_PASSWORD}
    ports:
      - '5434:5432'

  nginx:
    image: nginx:alpine
    ports:
      - '80:80'
    volumes:
      - static_volume:/code/staticfiles
      - ./nginx/default.conf:/etc/nginx/conf.d/default.conf
    depends_on:
      - web

volumes:
  postgres_data:
  static_volume:
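The compose file injects the database credentials through .env and sets DB_HOST=db for the web service. A sketch of how businesshub/settings.py might consume those variables (the actual settings file isn't shown here, so the defaults below are my assumptions):

```python
import os

# Read connection details injected by docker-compose (.env + environment:).
# The defaults make this runnable outside the containers as well.
DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.postgresql",
        "NAME": os.environ.get("POSTGRES_DB", "businesshub"),
        "USER": os.environ.get("POSTGRES_USER", "postgres"),
        "PASSWORD": os.environ.get("POSTGRES_PASSWORD", ""),
        "HOST": os.environ.get("DB_HOST", "localhost"),
        "PORT": os.environ.get("DB_PORT", "5432"),
    }
}
```

Note that from the host the database is reachable on port 5434 (the published port), while inside the Compose network the web container talks to db:5432 directly.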
And my entrypoint.sh looked like this:
#!/bin/sh
# Set the Django settings module
export DJANGO_SETTINGS_MODULE=businesshub.settings
# Wait for the database to be ready
echo "⏳ Waiting for the database at $DB_HOST..."
retries=10
while ! nc -z "$DB_HOST" 5432; do
    retries=$((retries-1))
    if [ "$retries" -eq 0 ]; then
        echo "❌ Timeout: Unable to connect to the database!"
        exit 1
    fi
    sleep 1
done
echo "✅ Database is available!"
set -e
# Create migrations if there are any changes to models
echo "
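The wait loop above retries nc once a second until Postgres accepts connections or the retries run out. The same retry-with-timeout pattern can be sketched in Python, should you ever want the check inside a management command instead of shell (wait_for_port is my name for it, not part of the project):

```python
import socket
import time

def wait_for_port(host: str, port: int, retries: int = 10, delay: float = 1.0) -> bool:
    """Retry a TCP connection until it succeeds or retries run out,
    mirroring the `nc -z` loop in entrypoint.sh."""
    for _ in range(retries):
        try:
            # A successful connect means the server is accepting connections.
            with socket.create_connection((host, port), timeout=1):
                return True
        except OSError:
            time.sleep(delay)
    return False
```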