In the previous parts of this series, we explored two ways to interact with OpenAI models using Python. First, we focused on the command line (Part 1), then we extended this by setting up a Streamlit web app interface (Part 2).
Now, in Part 3, we will encapsulate this functionality into a Docker image, enabling easy deployment across various platforms, including Kubernetes clusters.
Why Docker?
Docker simplifies the packaging of applications and their dependencies into a portable container that can run on any system with Docker installed. It ensures consistency across development, staging, and production environments, minimizing compatibility issues.
The end goal of this post is to build and push a Docker image that runs the Streamlit app discussed in the previous parts, using GitLab CI for continuous integration and delivery (CI/CD).
Setting up the Dockerfile
We begin by creating a `Dockerfile` that defines the environment and dependencies required to run our Streamlit app.
```dockerfile
# Use an official Python runtime as a parent image
FROM python:3.9-slim

# Set the working directory in the container
WORKDIR /app

# Copy the current directory contents into the container at /app
COPY . /app

# Install any needed packages specified in requirements.txt
RUN pip install --upgrade pip && \
    pip install -r requirements.txt

# Expose the default Streamlit port
EXPOSE 8501

# Run the Streamlit app
CMD ["streamlit", "run", "app.py", "--server.address=0.0.0.0"]
```
- **Base Image:** We are using the `python:3.9-slim` image for a lightweight yet powerful Python environment.
- **Working Directory:** The `WORKDIR /app` instruction sets the working directory inside the container to `/app`.
- **Copy Files:** The `COPY . /app` instruction copies all project files from the local directory into the container.
- **Package Installation:** We install all required Python packages using `pip install -r requirements.txt`.
- **Expose Port:** The `EXPOSE 8501` statement opens up port 8501, the default for Streamlit, making the app accessible.
- **Run Streamlit:** Finally, the `CMD` instruction ensures that the Streamlit app runs when the container starts.
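For reference, the `requirements.txt` that the Dockerfile installs might look something like the following. This is only a sketch based on the libraries used in the earlier parts of this series; use the exact packages and pinned versions from your own project:

```text
# Hypothetical requirements.txt for the Streamlit app from Parts 1 and 2
openai
streamlit
```

Pinning versions (e.g. `streamlit==1.31.0`) makes image builds reproducible, which matters once builds happen automatically in CI.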
Creating the GitLab CI/CD Pipeline
Now, let’s automate the process of building and pushing the Docker image using GitLab CI.
The following `.gitlab-ci.yml` file sets up a simple CI/CD pipeline for this purpose:
```yaml
image: docker:latest

services:
  - docker:dind

stages:
  - build_and_push

variables:
  DOCKER_DRIVER: overlay2

before_script:
  - docker info
  - echo $CI_REGISTRY_PASSWORD | docker login -u $CI_REGISTRY_USER --password-stdin $CI_REGISTRY

build_and_push:
  stage: build_and_push
  script:
    - docker build -t $CI_REGISTRY_IMAGE:latest .
    - docker push $CI_REGISTRY_IMAGE:latest
  only:
    - main
    - master
```
Key Points:
- **Docker in Docker (`dind`):** This allows us to use Docker within the CI/CD pipeline to build and push the image.
- **Authentication:** The `before_script` section logs in to the GitLab Container Registry using the `CI_REGISTRY_USER` and `CI_REGISTRY_PASSWORD` environment variables provided by GitLab CI.
- **Stages:** We define a single stage, `build_and_push`, which builds the Docker image and pushes it to the GitLab Container Registry.
- **Conditions:** The `only` rule ensures that this job runs only on commits to the `main` or `master` branches.
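As a side note, pushing only a `:latest` tag makes rollbacks harder, since each build overwrites the previous image. A common variation, sketched here using GitLab's predefined `CI_COMMIT_SHORT_SHA` variable, additionally tags each build with the commit SHA:

```yaml
build_and_push:
  stage: build_and_push
  script:
    # Tag the image both as :latest and with the short commit SHA
    - docker build -t $CI_REGISTRY_IMAGE:latest -t $CI_REGISTRY_IMAGE:$CI_COMMIT_SHORT_SHA .
    - docker push $CI_REGISTRY_IMAGE:latest
    - docker push $CI_REGISTRY_IMAGE:$CI_COMMIT_SHORT_SHA
  only:
    - main
    - master
```

With this in place, any previous version can be redeployed simply by referencing its SHA tag.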
Deploying the Docker Image
Once the Docker image is built and pushed to the GitLab registry, it can be deployed to any Docker host or Kubernetes cluster. For example, to deploy the container locally or on a cloud server, you can use the following `docker run` command:
```shell
docker run -d \
  --name openai-model-checker \
  -e VIRTUAL_HOST=ai-models.devops100.net \
  -e LETSENCRYPT_HOST=ai-models.devops100.net \
  -e [email protected] \
  -p 6060:8501 \
  registry.gitlab.com/python-2024/openai/check-openai-models-with-python:latest
```
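After starting the container, it is worth verifying that the app came up before pointing a proxy or DNS at it. A minimal smoke test might look like this (assuming a reasonably recent Streamlit release, which serves a health check at `/_stcore/health`; older versions used `/healthz` instead):

```shell
# Confirm the container is running
docker ps --filter name=openai-model-checker

# Tail the Streamlit logs for startup errors
docker logs --tail 20 openai-model-checker

# Probe Streamlit's health endpoint on the mapped host port
curl -fsS http://localhost:6060/_stcore/health && echo "app is up"
```

Note that `-p 6060:8501` maps Streamlit's default port 8501 inside the container to port 6060 on the host, which is why the probe targets 6060.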
Conclusion
This part wraps up the process of Dockerizing the OpenAI model checker and automating the build using GitLab CI/CD. With this setup, the entire workflow becomes streamlined, ensuring that any updates to the project are automatically built and pushed to the GitLab container registry.
For the complete code and CI pipeline, visit the GitLab repository.
In Part 4, we will deploy this Docker image via docker-compose to easily set up a domain where the Streamlit app container is hosted. This includes the jwilder/nginx-proxy and letsencrypt-nginx-proxy-companion setup for automatic SSL termination, ensuring secure communication via HTTPS.
See a hosting example of this app here: https://ai-models.ops100.cloud/