# Section 13: Automating Continuous Delivery of Python Packages with GitHub Actions
# 118. Introduction to the section
**Goals of this section**
- Understand **continuous delivery (CD)** as an abstract concept, not just a technical recipe.
- Learn **what CD aims to accomplish** and why it’s valuable across all types of software.
**Deep dive into GitHub Actions**
- Gain an **advanced understanding** of GitHub Actions as a CI/CD tool.
- Recognize it among other tools like **Bitbucket Pipelines, GitLab CI, CircleCI, Travis CI, Jenkins, Hudson, Bamboo**.
- Appreciate GitHub Actions for its **flexibility and power**, despite a learning curve in its documentation.
# 119. What and why of "delivery environments"

- Typical environment flow: Dev → QA/Staging → Production
- Pre-release version examples: `0.0.0a0` (alpha), `0.0.0b0` (beta), `0.0.0rc0` / `0.0.0rc1` (release candidates)
**Core Concepts of Continuous Delivery (CD)**
- CD pipelines **reduce risk** by testing before reaching production.
- **Test environments** (like Test PyPI) let you experiment safely, away from end users.
- Common delivery environments include:
  - **Development / Staging** → internal use, ahead of production.
  - **Production** → stable, public-facing version.
---
**Ways to Handle Pre-release Versions**
- Use **separate environments** (e.g., Test PyPI vs. Prod PyPI).
- Or, stay on Prod PyPI but use **version suffixes** like:
  - `rc` → release candidate (e.g., `1.0.0rc1`)
  - `a` → alpha (e.g., `1.0.0a0`)
  - `b` → beta (e.g., `1.0.0b0`)
- These suffixes prevent accidental installs: pip won't select pre-release versions unless you pass `--pre` or pin the exact version.
---
**Risks of Rare Deployment**
- Rare or bulk deployments lead to:
  - **Delivery hell** → too many changes piled up.
  - Hard-to-trace errors when promoting multiple pipelines or features at once.
  - Paralyzing fear of moving from staging to production.
- Example: A company’s staging Airflow pipelines piled up, and production deployment became a nightmare.
# 120. Workflow for Python packages
**Why avoid direct laptop-to-production deploys?**
- Developers **should not** deploy directly from their local machines.
- Tools like **GitHub Actions** provide a **controlled, predictable** pipeline for deployments.
---
**High-level developer workflow**
1. Developer **creates a branch**.
2. Developer **opens a pull request (PR)**, even before it's ready to merge, to trigger automation.
3. Developer **pushes small commits** and iterates until the PR:
   - passes the build (CI checks: lint, tests, build, deploy to Test PyPI), and
   - is approved (if human review is required).
---
**GitHub Actions pipeline sequence**
- GitHub emits events on PR creation or commit pushes.
- GitHub Actions then:
  - Clones the repo.
  - Runs linting, tests, and builds.
  - Optionally deploys to **Test PyPI** (acting as a staging check).
---
**Post-merge (Main branch)**
- Run **build + test again** on Main.
- If successful:
  - Deploy to **Prod PyPI**.
  - Optional: add a **manual approval step** (needs GitHub Enterprise for best integration).
---
**Alternative staging strategies**
- Always publish pre-release versions (e.g., `1.0.0rc1`, `1.0.0a0`) **directly to Prod PyPI**, but with clear suffixes.
- Or, **isolate**: use Test PyPI for staging, only push to Prod PyPI when fully validated.
- Tradeoff: More manual steps = more latency = higher risk of changes piling up.
---
**Why deploy frequently?**
- Avoid **delivery hell** (piling up too many changes at once).
- Small, frequent releases reduce the chance of catastrophic failures.
- Enables **faster feedback loops**, early end-user testing, and adaptability.
---
**Adaptability of this workflow**
- Although shown here for **Python packages to PyPI**, the same pipeline can apply to:
  - REST APIs on AWS.
  - Docker containers.
  - Mobile app store releases.
  - Database migrations.
  - Any software deployment.
---
**Final takeaway**
- The **ideal CD pipeline** is automated, trusted, and fast.
- Tests must be reliable to support auto-deployments.
- Manual gates slow things down but may be needed for high-risk changes.

# 121. Setting up trigger listeners for the GitHub Actions workflow

**Workflow setup in GitHub Actions**
- Opened a new workflow file: `publish.yaml`.
- Gave it a name: `build test and publish`.
- Based on a **copy** of a previous GitHub Actions CI workflow.
---
**Define triggers**
- Listening for **GitHub events** to decide when the workflow should run.
- **Pull request (PR) events**:
  - `opened` → when a new PR is created.
  - `synchronize` → when new commits are pushed to an open PR.
- **Push events**:
  - Only listen to pushes on the `main` branch (e.g., after merging a PR).
- Avoid triggering on unnecessary activities (like adding labels or editing PR descriptions).
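A sketch of these triggers in `publish.yaml` (event names from the GitHub Actions docs; the branch name `main` is as described above):

```yaml
# Trigger sketch for publish.yaml: run on PR activity and on pushes to main
on:
  pull_request:
    types: [opened, synchronize]   # new PRs, and new commits on open PRs
  push:
    branches:
      - main                       # e.g., after a PR is merged
```

Restricting `pull_request` to `opened` and `synchronize` avoids runs on label edits, description changes, and similar noise.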
---
**Supporting multiple workflows**
- Supports both:
  - A **protected main branch with PR merges**.
  - **Trunk-based development**, where trusted teams push commits directly to main.
---
# 122. Pushing to Test PyPI from CI; Contexts and secrets management part 1
**Focus of this section**
- Prepare the GitHub Actions workflow for the **post-merge (merge-to-main)** use case.
- Ignore linting and testing for now → focus solely on **publishing to Test PyPI**.
---
**What was done in the YAML workflow**
- Disabled:
  - The linting step (pre-commit).
  - The install-dependencies step (related to the pre-commit CLI).
- Kept:
  - Basic setup: clone the repository, check out the branch, set up Python 3.8.
---
**Setup for publishing to Test PyPI**
- Added a step to run the `run.sh` script.
- Inside the script, it:
  - Installs dependencies (`build`, `twine`).
  - Builds the Python package (into the `dist/` folder).
  - Publishes it to **Test PyPI**.
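The publish logic in `run.sh` might look roughly like this (a minimal sketch; the function names and `twine` flags are assumptions, not taken verbatim from the course's script):

```shell
#!/bin/bash
# Sketch of run.sh publish helpers (names and flags assumed)

function install() {
    python -m pip install --upgrade build twine
}

function build() {
    rm -rf dist
    python -m build .   # writes the sdist and wheel into dist/
}

function publish:test() {
    twine upload dist/* \
        --repository-url https://test.pypi.org/legacy/ \
        --username __token__ \
        --password "$TEST_PYPI_TOKEN"
}
```

A task-runner script like this is typically invoked as `bash run.sh publish:test`, dispatching to the function of that name.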
---
**Handling missing `.env` file in CI**
- Problem: In CI, you **can’t use local `.env` files**.
- Solution:
  - Use **GitHub Secrets** to securely pass sensitive data (like the Test PyPI token).
  - Add secrets under **GitHub repository settings → Secrets and variables → Actions → Repository secrets**.
- Example secrets:
  - `TEST_PYPI_TOKEN`
  - `PROD_PYPI_TOKEN`
---
**How to access secrets in workflow**
- Use GitHub's **`secrets` context**. Example in YAML:

```yaml
env:
  TEST_PYPI_TOKEN: ${{ secrets.TEST_PYPI_TOKEN }}
```

- Scope options for `env`:
  - Global to the workflow (top-level).
  - Job-level.
  - Step-level (most secure, narrowest scope).
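For example, the narrowest (step-level) scoping might look like this (step and function names are assumed for illustration):

```yaml
steps:
  - name: Publish to Test PyPI
    run: /bin/bash run.sh publish:test
    env:
      # visible only to this step, not to the whole job or workflow
      TEST_PYPI_TOKEN: ${{ secrets.TEST_PYPI_TOKEN }}
```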
---
**Introduction to GitHub Contexts**
- Contexts provide dynamic values **available at runtime**:
  - Examples: commit hash, branch name, event data, secrets.
- Can be accessed using:
  - Dot notation → `${{ secrets.TEST_PYPI_TOKEN }}`
  - Bracket notation → `${{ secrets['TEST_PYPI_TOKEN'] }}`
---
**Debugging / exploring contexts**
- Found a **GitHub Docs trick**: Add a special job to dump all context values as JSON into the log.
- Example:

```yaml
steps:
  - run: echo "${{ toJson(github) }}"
  - run: echo "${{ toJson(secrets) }}"
  - run: echo "${{ toJson(vars) }}"
```
- This helps **inspect what’s available** during workflow runs.
# 123. Contexts and secrets management part 2
**Goal of this section**
- Focus on the **post-merge use case**: deploy the Python package from GitHub Actions **to Test PyPI**.
- Temporarily **skip linting and testing** → just test the publishing flow.
---
**Handling `.env` + secrets in CI**
- Local `.env` won’t exist in CI → solve this using **GitHub Secrets**.
- Stored secrets in GitHub repo:
- `TEST_PYPI_TOKEN` (for Test PyPI).
- `PROD_PYPI_TOKEN` (for Prod PyPI).
---
**Accessing secrets securely**
- Use GitHub **`secrets` context**:
- Best practice: scope environment variables **only to the steps that need them**.
---
**Debugging & exploring context values**
- Added a **debug job** that printed all context objects (e.g., `github`, `secrets`, `vars`) using:

```yaml
run: echo "${{ toJson(secrets) }}"
```

- Saw that secret values were **masked** (shown as `***` in logs).
---
**Triggering the workflow manually**
- Added:

```yaml
on:
  workflow_dispatch:
```
- Enabled **Run workflow** button in GitHub Actions UI for manual runs.
---
**Troubleshooting build errors**
- Issue: `ModuleNotFoundError` for `build` → fixed by committing `pyproject.toml`.
- Issue: package name mismatch → fixed by aligning names in `pyproject.toml`.
- Issue: **Version already exists** on Test PyPI → fixed by bumping the version.
---
**Successful deployment result**
- Pushed version `0.0.4` → workflow ran successfully.
- Confirmed live on the [Test PyPI package page](https://test.pypi.org/project/packaging-demo/0.0.4).
# 124. A few minor improvements to the pre-merge pipeline
### **Workflow Improvements**
- Split the long bash script into **separate GitHub Actions steps**:
  - `Install Python dependencies`
  - `Build Python package`
  - `Publish to Test PyPI`
- Gave each step a **clear name**, improving GitHub Actions UI readability.
- Passed `-x` to bash in CI (e.g., `/bin/bash -x run.sh ...`) to **show executed shell commands** in the logs.
- Kept `-x` out of the script's shebang (`#!/bin/bash`) to avoid leaking secrets when running the scripts locally.
---
### **Handling `.env` loading in CI vs local**
- Created a `try_load_env()` function in Bash that:
  - Gracefully skips loading if no `.env` file exists (instead of failing silently).
  - Logs `"No .env file found"` and returns exit code 1.
- Ensures CI secrets are loaded securely from **GitHub Secrets** (e.g., `TEST_PYPI_TOKEN`), not from `.env`.
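A minimal sketch of such a helper (the course's exact `run.sh` implementation may differ):

```shell
#!/bin/bash
# Sketch of try_load_env: load .env locally, skip gracefully in CI

function try_load_env() {
    if [ ! -f ".env" ]; then
        echo "No .env file found"
        return 1
    fi
    set -a          # export every variable assigned while sourcing
    source .env
    set +a
}

# In CI there is no .env, so fall through and rely on GitHub Secrets
try_load_env || true
```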
# 125. Enabling linting in the pipeline
### **Goal of this section**
- Add **Linting** to the **pre-merge GitHub Actions workflow** as a basic code quality check.
---
### **Created `lint:ci` Command in `run.sh`**
- Defined a function:
```bash
# Run all pre-commit hooks; skip no-commit-to-branch since CI checks out main directly
function lint:ci() {
    SKIP=no-commit-to-branch pre-commit run --all-files
}
```
- **Why this design?**
- Keeps GitHub Actions YAML file **clean and shallow**.
- Centralizes CI-specific logic inside shell scripts.
- Improves testability and local reproducibility.
---
### **Key Advantages of This Approach**
- Clean separation of concerns:
  - GitHub Actions YAML = workflow logic.
  - Shell script = execution logic.
- Easier to test CI scripts locally.
- Easier to modify or extend linting behavior later.
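The resulting workflow step stays shallow, e.g. (step name assumed):

```yaml
# The YAML only names the task; the logic lives in run.sh
- name: Lint
  run: /bin/bash run.sh lint:ci
```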
# 126. Writing the post-merge pipeline
### **Goals of this Video**
- Begin implementation of the **post-merge pipeline**.
- Ensure the same workflow handles both **pre-merge checks** and **post-merge deploys**.
- Introduce **conditional deployment** logic using GitHub Actions `if` statements.
---
### **Implementation Summary**
- Reused the same `publish.yaml` workflow file.
- Added two new **conditional steps**:
```yaml
- name: Publish to Test PyPI
  if: github.event_name == 'push' && github.ref == 'refs/heads/main'
  run: /bin/bash -x run.sh publish:test
  env:
    TEST_PYPI_TOKEN: ${{ secrets.TEST_PYPI_TOKEN }}

- name: Publish to Prod PyPI
  if: github.event_name == 'push' && github.ref == 'refs/heads/main'
  run: /bin/bash -x run.sh publish:prod
  env:
    PROD_PYPI_TOKEN: ${{ secrets.PROD_PYPI_TOKEN }}
```
---
### **Post-Merge Workflow Summary**
- **Triggered on**: Push to `main` branch.
- **Steps include**:
  - Install dependencies
  - Run lint checks
  - Build package
  - Conditionally publish to **Test PyPI**
  - Conditionally publish to **Prod PyPI**
---
### **Use of GitHub Contexts**
- Used:
  - `github.event_name` → to detect whether the event is a `push`
  - `github.ref` → to check whether it's on the `main` branch
- Verified expected values using the **dump-contexts job** and GitHub Actions documentation.
---
### **Identified Problem**
- **Version bumping is not enforced pre-merge**.
- If the version (e.g., `0.0.4`) is already used on Test/Prod PyPI:
  - The post-merge step will **fail** to publish.
  - But pre-merge checks will **still pass**, causing problems **after merging**.
# 127. Adding git tags and avoiding duplicate releases
### **Problem Identified**
- **Version collisions can break the post-merge deployment**.
- Example: version `0.0.4` already exists on Test PyPI. If reused in a new PR:
  - Pre-merge passes (no version check).
  - Post-merge fails on deploy (duplicate version).
---
### **Goal**
- Catch version duplication **before merging** to main.
- Ensure version bumps are made **during development**, not after merge.
- **Track releases** by tagging commits with their version.
---
### **Solution Strategy**
1. **Tag commits** with the version after successful deploys.
2. **Fail the pre-merge build** if a tag with the same version already exists.
3. Use `git tag` as a proxy to check version uniqueness.
---
### **Tagging Logic**
- In the **pre-merge workflow**:
  - Attempt `git tag $version`.
  - If the tag already exists → the pipeline fails.
- In the **post-merge workflow**:
  - Tag the commit: `git tag $version`
  - Push the tag: `git push origin --tags`
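The duplicate-release check can be illustrated end-to-end with a throwaway repo (the version value is made up for the demo; a real pipeline would read it from `pyproject.toml`):

```shell
#!/bin/bash
set -e
# Demo: detect a duplicate release by checking existing git tags

repo=$(mktemp -d)
cd "$repo"
git init -q
git -c user.email=ci@example.com -c user.name=ci commit -q --allow-empty -m "init"
git tag 0.0.4            # pretend 0.0.4 was already released and tagged

version="0.0.4"          # in practice, read from pyproject.toml
if git tag --list | grep -qx "$version"; then
    result="duplicate"   # the pre-merge build should fail here: bump the version
else
    result="ok"
fi
echo "$result"
```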
---
### **GitHub Actions Enhancements**
- Updated the `checkout` step to fetch all tags and full history:

```yaml
- uses: actions/checkout@v3
  with:
    fetch-depth: 0
```
- Added a conditional step to tag the commit with the version; it runs **only on push to main**:

```yaml
- name: Tag with release version
  if: github.event_name == 'push' && github.ref == 'refs/heads/main'
  run: |
    git tag $version
    git push origin --tags
```