
docs: update .gitignore via Apex Optimizer #4

Workflow file for this run

name: CI
# This GitHub Actions workflow automates the Continuous Integration process for the Crawl4AI-LLM-Optimized-Web-Crawler-And-Scraper-Python-Engine.
# It ensures code quality, dependency integrity, and basic functionality testing.
# Trigger the workflow on pushes to the main branch and on pull requests targeting the main branch.
on:
  push:
    branches: [ main ]
  pull_request:
    branches: [ main ]
# Define the jobs that will run as part of the CI workflow.
jobs:
  build:
    # Specify the operating system environment for the job runner.
    runs-on: ubuntu-latest
    # Define the steps to be executed within the build job.
    steps:
      # Step 1: Check out the repository code.
      - name: Check out code
        uses: actions/checkout@v4
      # Step 2: Set up Python environment.
      # Uses the "actions/setup-python" action to configure a Python environment.
      # "python-version: '3.10'" specifies the desired Python version.
      - name: Set up Python
        uses: actions/setup-python@v5
        with:
          python-version: '3.10'
          # setup-python only supports 'pip', 'pipenv', and 'poetry' caching; 'uv' is not a valid value.
          cache: 'pip'
      # Step 3: Install dependencies using uv.
      # This step ensures that all necessary Python packages are installed.
      - name: Install dependencies with uv
        run: |
          python -m pip install --upgrade pip
          # uv is not guaranteed to be preinstalled on the runner image, so install it explicitly.
          python -m pip install uv
          uv pip install --system -r requirements.txt
          # If a dev requirements file exists, install dev dependencies too.
          if [ -f requirements-dev.txt ]; then uv pip install --system -r requirements-dev.txt; fi
      # Step 4: Run linters and formatters with Ruff.
      # Ruff is used for static analysis and code formatting to maintain code quality.
      - name: Lint and format with Ruff
        run: |
          # --system is required because no virtual environment is created on the runner.
          uv pip install --system ruff
          ruff check .
          ruff format --check .
      # Step 5: Run tests with Pytest.
      # Pytest is executed to run all defined unit and integration tests.
      # "--cov=crawl4ai" measures code coverage for the 'crawl4ai' package.
      # "--cov-report=xml" generates a coverage report in XML format for integration with coverage reporting tools.
      - name: Test with Pytest
        run: |
          uv pip install --system pytest pytest-cov
          pytest --cov=crawl4ai --cov-report=xml
      # Step 6: Upload coverage reports.
      # Uploads the generated XML coverage report to Codecov for analysis and visualization.
      # "files: ./coverage.xml" specifies the path to the coverage report (codecov-action v4 uses "files" rather than the older "file" input).
      # "token: ${{ secrets.CODECOV_TOKEN }}" uses a GitHub secret to authenticate with Codecov.
      # This step is skipped for pull requests and only runs on push events, which with the triggers above means pushes to main.
      - name: Upload coverage to Codecov
        if: github.event_name != 'pull_request'
        uses: codecov/codecov-action@v4
        with:
          token: ${{ secrets.CODECOV_TOKEN }}
          files: ./coverage.xml
          fail_ci_if_error: true
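
A note on caching: since setup-python does not cache uv's own downloads, one possible variant (a sketch only, not part of this run's workflow) is to install uv with Astral's setup-uv action, which brings its own cache support. The action reference and inputs below are assumptions based on astral-sh/setup-uv as published by Astral:

      # Hypothetical alternative to the pip-based cache above; not used in this run.
      - name: Set up uv
        uses: astral-sh/setup-uv@v5
        with:
          enable-cache: true
      - name: Install dependencies
        run: uv pip install --system -r requirements.txt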
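
If the project should be verified against more than one interpreter, the single python-version value could be replaced by a build matrix. The version list below is illustrative, not taken from this repository:

    # Hypothetical matrix variant of the build job.
    strategy:
      matrix:
        python-version: ['3.10', '3.11', '3.12']
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: ${{ matrix.python-version }}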
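
The Codecov upload step can be paired with a codecov.yml at the repository root to control how coverage gates the build; the targets and threshold below are placeholders, not values from this project:

# Hypothetical codecov.yml; targets and thresholds are illustrative.
coverage:
  status:
    project:
      default:
        target: auto
        threshold: 1%
    patch:
      default:
        target: auto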