Mirror of https://github.com/sunnypilot/sunnypilot.git (synced 2026-02-18 22:23:56 +08:00)
ci: Refactor model building workflows (#1096)
* Tinygrad bump from sync-20250627
* bump tinygrad_repo
* Reformat metadata generator to match driving_models.json
* bump tinygrad
* Revert "bump tinygrad"
  This reverts commit f479dfd502.
* revert me after SP model compiled
* Model recompiled successfully, initiate "revert me after SP model compiled"
  This reverts commit 95706eb688.
* The "FillMe" placeholder caused an extra 10 seconds of work
* bump to 22Jul2025
* Update build-all-tinygrad-models.yaml
* Update build-all-tinygrad-models.yaml
* Update build-all-tinygrad-models.yaml
* Update build-all-tinygrad-models.yaml
* Update build-all-tinygrad-models.yaml
* Update build-all-tinygrad-models.yaml
* Update build-all-tinygrad-models.yaml
* Allow more dynamic short names
  This should hopefully be future-proof for now.. It's robust enough to return the correct word-digit format (see examples of how it generates from a given display name below; a rough sketch of the rule follows this commit message):
  'Last Horizon V2 (November 22, 2024)' -> LHV2
  'Alabama (November 25, 2024)' -> ALABAMA
  'PlayStation (December 03, 2024)' -> PLAYSTAT
  'Postal Service (December 09, 2024)' -> PS
  'Null Pointer (December 13, 2024)' -> NP
  'North America (December 16, 2024)' -> NA
  'National Public Radio (December 18, 2024)' -> NPR
  'Filet o Fish (March 7, 2025)' -> FOF
  'Tomb Raider 2 (April 18, 2025)' -> TR2
  'Tomb Raider 3 (April 22, 2025)' -> TR3
  'Tomb Raider 4 (April 25, 2025)' -> TR4
  'Tomb Raider 5 (April 25, 2025)' -> TR5
  'Tomb Raider 6 (April 30, 2025)' -> TR6
  'Tomb Raider 7 (May 07, 2025)' -> TR7
  'Down to Ride (Revision: May 10, 2025)' -> DTR
  'SP Vikander Model (May 16, 2025)' -> SPVM
  'VFF Driving (May 15, 2025)' -> VFFD
  'Secret Good Openpilot (May 16, 2025)' -> SGO
  'Vegetarian Filet o Fish (May 29, 2025)' -> VFOF
  'Down To Ride (Revision: May 30, 2025)' -> DTR
  'Vegetarian Filet o Fish v2 (June 05, 2025)' -> VFOFV2
  'Kerrygold Driving (June 08, 2025)' -> KD
  'Tomb Raider 10 (June 16, 2025)' -> TR10
  'Organic Kerrygold (June 17, 2025)' -> OK
  'Liquid Crystal Driving (June 21, 2025)' -> LCD
  'Vegetarian Filet o Fish v3 (June 21, 2025)' -> VFOFV3
  'Vibe Model [Custom Model]' -> VMCM
  'Tomb Raider 13 (June 27, 2025)' -> TR13
  'Aggressive TR (June 28, 2025)' -> ATR
  'Tomb Raider 14 (June 30, 2025)' -> TR14
  'Cookiemonster Tomb Raider (July 02, 2025)' -> CTR
  'Down to Ride (Revision: July 07, 2025)' -> DTR
  'Simple Plan Driving (July 07, 2025)' -> SPD
  'Down to Ride (Revision: July 08, 2025)' -> DTR
  'Tomb Raider 15 (July 09, 2025)' -> TR15
  'Tomb Raider 15 rev-2 (July 11, 2025)' -> TR15R2
  'Le Tomb Raider 14 (July 14, 2025)' -> LTR14
  'Le Tomb Raider 14h (July 17, 2025)' -> LTR14H
  'Tomb Raider 16 (July 18, 2025)' -> TR16
  'Tomb Raider 16v2 (July 21, 2025)' -> TR16V2
* Update build-all-tinygrad-models.yaml
* Update build-all-tinygrad-models.yaml
* No need to sleep 3 seconds, just send it
* try dynamic
* cleanup
* Update build-single-tinygrad-model.yaml
* bc devtekve said. also, this is repetitive af
* Revert "bc devtekve said. also, this is repetitive af"
  This reverts commit 3a0c1562de.
* maybe we could use a script instead that both build all
  That both build all and sunnypilot-build-model reference
* refactor: consolidate model building steps into a single workflow
* tweak
* tweakx2
* tweakx3
* tweakx4
* dunno dunno...
* output dir
* lots of changes
* Revert "lots of changes"
  This reverts commit 4aadb0ee29.
* fail if all fail
* no inputs needed
* make it easier for us
* note failure and exit 0
* Update build-all-tinygrad-models.yaml
* not needed unless we really want it
* Update build-single-tinygrad-model.yaml
* Merge branch 'sync-20250627-tinygrad' of github.com:sunnypilot/sunnypilot into sync-20250627-tinygrad
* retry for failed ?
* always run this step because sometimes one build fails which causes the matrix to fail, but most builds still have uploaded artifacts.
* strip
* no escape
* Update build-all-tinygrad-models.yaml
* Test case from terminal run:
  (openpilot) james@Mac sunnypilot % jq -c '[.bundles[] | select(.runner=="tinygrad") | {ref, display_name: (.display_name | gsub(" \\([^)]*\\)"; "")), is_20hz}]' \
    /Users/james/Documents/GitHub/sunnypilot-docs/docs/driving_models_v6.json > matrix.json
  mkdir -p output
  touch "output/model-Tomb Raider 16v2 (July 21, 2025)-544"
  touch "output/model-Space Lab Model (July 24, 2025)-547"
  touch "output/model-Space Lab Model v1 (July 24, 2025)-548"
  built=(); while IFS= read -r line; do built+=("$line"); done < <(
    ls output | sed -E 's/^model-//' | sed -E 's/-[0-9]+$//' | sed -E 's/ \([^)]*\)//' | awk '{gsub(/^ +| +$/, ""); print}'
  )
  jq -c --argjson built "$(printf '%s\n' "${built[@]}" | jq -R . | jq -s .)" \
    'map(select(.display_name as $n | ($built | index($n | gsub("^ +| +$"; "")) | not)))' \
    matrix.json > retry_matrix.json
  cat retry_matrix.json
  []
  (openpilot) james@Mac sunnypilot %
* always
* great success
* add suffix to retry artifact so it doesn't conflict
* retry to get_model too
* and there haha
* unnecessary hyphen
* compare built to missing. include retries
* adjust copy of artifacts.
* Update build-all-tinygrad-models.yaml
* Update model selector versioning and add documentation
* Update retry condition for failed models in build-all-tinygrad-models.yaml
* Update retry condition for failed models in build-all-tinygrad-models.yaml
* Update build-single-tinygrad-model.yaml
* false
* default none because why not
* red diff? i think?
* meh ... not needed i guess
* error error error
* Nayan is watching... always watching mike wazowski
* string all the way
* lots of retries just in case because im scared
* more robust
* ONLY ONE!!!!!!
* delete.... a lot
* fix artifacts
* fix artifacts
* make sure each is unique :)
* skip files like artifact duhhhh
* artifact name dir
* concurrency
* copy here
* Update build-single-tinygrad-model.yaml
* Update build-single-tinygrad-model.yaml
* bump
* bump tinygrad
* max parallel? if not, i have the other remedy ready in build-all
* revert me!
* I resynced tinygrad woo hoo
* setup shouldnt fail
* pull
* big ole diff
* condition
* Update build-all-tinygrad-models.yaml
* not always() never always() never!!!
* not failure instead of great success
* Update build-all-tinygrad-models.yaml
* yay that worked. lets invoke build-single one last time
* these arent used and are just taking up 250MB space
* really frog?
* bump back to 3
* self-hosted, tici
* rename to trigger tests
* 2 and done

---------

Co-authored-by: DevTekVE <devtekve@gmail.com>
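The "Allow more dynamic short names" entry above only lists input/output examples, so here is a rough Python sketch of the naming rule those examples imply. This is an inferred reconstruction for illustration, not the workflow's actual implementation (which lives in the repo's helper scripts), and the function name short_name is made up for the example.

import re

def short_name(display_name: str) -> str:
    """Rough sketch of the short-name rule implied by the examples above."""
    # Drop the parenthesised date suffix, e.g. "(November 22, 2024)".
    base = re.sub(r"\s*\([^)]*\)", " ", display_name)
    # Brackets are stripped but their words are kept ("Vibe Model [Custom Model]" -> VMCM).
    base = base.replace("[", " ").replace("]", " ")
    words = base.split()
    if len(words) == 1:
        # Single-word names become the word itself, capped at 8 chars: "PlayStation" -> "PLAYSTAT".
        return words[0].upper()[:8]
    parts = []
    for w in words:
        if w.isupper():
            # Keep existing acronyms whole: "SP", "TR", "VFF", "V2".
            parts.append(w)
        elif any(c.isdigit() for c in w):
            # Keep the digits (and anything after them): "16v2" -> "16V2", "rev-2" -> "R2".
            m = re.match(r"([A-Za-z]?)[^0-9]*([0-9].*)", w)
            parts.append((m.group(1) + m.group(2)).upper())
        else:
            # Otherwise just the initial letter.
            parts.append(w[0].upper())
    return "".join(parts)

# Spot checks against the examples in the commit message:
assert short_name("Last Horizon V2 (November 22, 2024)") == "LHV2"
assert short_name("Tomb Raider 16v2 (July 21, 2025)") == "TR16V2"
assert short_name("Down to Ride (Revision: May 10, 2025)") == "DTR"
assert short_name("PlayStation (December 03, 2024)") == "PLAYSTAT"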
Committed by: GitHub
Parent: c47aadee19
Commit: 6d41ce2032
343 changed lines in .github/workflows/build-all-tinygrad-models.yaml (vendored)
@@ -1,21 +1,23 @@
name: Build All Tinygrad Models and Push to GitLab
name: Build and push all tinygrad models

on:
workflow_dispatch:
inputs:
branch:
description: 'Branch to run workflow from'
required: false
default: 'master-new'
set_min_version:
description: 'Minimum selector version required for the models (see helpers.py or readme.md)'
required: true
type: string

jobs:
setup:
runs-on: ubuntu-latest
outputs:
json_version: ${{ steps.get-json.outputs.json_version }}
recompiled_dir: ${{ steps.create-recompiled-dir.outputs.recompiled_dir }}
json_file: ${{ steps.get-json.outputs.json_file }}
model_matrix: ${{ steps.set-matrix.outputs.model_matrix }}
steps:
- name: Checkout docs repo
- name: Checkout docs repo (sunnypilot-docs, gh-pages)
uses: actions/checkout@v4
with:
repository: sunnypilot/sunnypilot-docs
@@ -23,7 +25,7 @@ jobs:
path: docs
ssh-key: ${{ secrets.CI_SUNNYPILOT_DOCS_PRIVATE_KEY }}

- name: Get next JSON version to use
- name: Get next JSON version to use (from GitHub docs repo)
id: get-json
run: |
cd docs/docs
@@ -31,19 +33,131 @@ jobs:
next=$((latest+1))
json_file="driving_models_v${next}.json"
cp "driving_models_v${latest}.json" "$json_file"
echo "json_file=$json_file" >> $GITHUB_OUTPUT
echo "json_file=docs/docs/$json_file" >> $GITHUB_OUTPUT
echo "json_version=$((next+0))" >> $GITHUB_OUTPUT
echo "SRC_JSON_FILE=docs/docs/driving_models_v${latest}.json" >> $GITHUB_ENV

- name: Upload context for next jobs
uses: actions/upload-artifact@v4
- name: Extract tinygrad models
id: set-matrix
working-directory: docs/docs
run: |
jq -c '[.bundles[] | select(.runner=="tinygrad") | {ref, display_name: (.display_name | gsub(" \\([^)]*\\)"; "")), is_20hz}]' "$(basename "${SRC_JSON_FILE}")" > matrix.json
echo "model_matrix=$(cat matrix.json)" >> $GITHUB_OUTPUT

- name: Set up SSH
uses: webfactory/ssh-agent@v0.9.0
with:
name: context
path: docs
ssh-private-key: ${{ secrets.GITLAB_SSH_PRIVATE_KEY }}
- run: |
mkdir -p ~/.ssh
ssh-keyscan -H gitlab.com >> ~/.ssh/known_hosts

build-all:
- name: Clone GitLab docs repo and create new recompiled dir
id: create-recompiled-dir
env:
GIT_SSH_COMMAND: 'ssh -o UserKnownHostsFile=~/.ssh/known_hosts'
run: |
git clone --depth 1 --filter=tree:0 --sparse git@gitlab.com:sunnypilot/public/docs.sunnypilot.ai2.git gitlab_docs
cd gitlab_docs
git checkout main
git sparse-checkout set --no-cone models/
cd models
latest_dir=$(ls -d recompiled* 2>/dev/null | sed -E 's/recompiled([0-9]+)/\1/' | sort -n | tail -1)
if [[ -z "$latest_dir" ]]; then
next_dir=1
else
next_dir=$((latest_dir+1))
fi
recompiled_dir="${next_dir}"
mkdir -p "recompiled${recompiled_dir}"
touch "recompiled${recompiled_dir}/.gitkeep"
cd ../..
echo "recompiled_dir=$recompiled_dir" >> $GITHUB_OUTPUT

- name: Push empty recompiled dir to GitLab
run: |
cd gitlab_docs
git add models/recompiled${{ steps.create-recompiled-dir.outputs.recompiled_dir }}
git config --global user.name "GitHub Action"
git config --global user.email "action@github.com"
git commit -m "Add recompiled${{ steps.create-recompiled-dir.outputs.recompiled_dir }} for build-all" || echo "No changes to commit"
git push origin main

- name: Push new JSON to GitHub docs repo
run: |
cd docs
git pull origin gh-pages
git add docs/"$(basename ${{ steps.get-json.outputs.json_file }})"
git config --global user.name "GitHub Action"
git config --global user.email "action@github.com"
git commit -m "Add new ${{ steps.get-json.outputs.json_file }} for build-all" || echo "No changes to commit"
git push origin gh-pages

get_and_build:
needs: [setup]
strategy:
matrix:
model: ${{ fromJson(needs.setup.outputs.model_matrix) }}
fail-fast: false
uses: ./.github/workflows/build-single-tinygrad-model.yaml
with:
upstream_branch: ${{ matrix.model.ref }}
custom_name: ${{ matrix.model.display_name }}
recompiled_dir: ${{ needs.setup.outputs.recompiled_dir }}
json_version: ${{ needs.setup.outputs.json_version }}
secrets: inherit

retry_failed_models:
needs: [setup, get_and_build]
runs-on: ubuntu-latest
needs: setup
if: ${{ needs.setup.result != 'failure' && (failure() && !cancelled()) }}
outputs:
retry_matrix: ${{ steps.set-retry-matrix.outputs.retry_matrix }}
steps:
- uses: actions/download-artifact@v4
with:
pattern: model-*
path: output

- id: set-retry-matrix
run: |
echo '${{ needs.setup.outputs.model_matrix }}' > matrix.json
built=(); while IFS= read -r line; do built+=("$line"); done < <(
ls output | sed -E 's/^model-//' | sed -E 's/-[0-9]+$//' | sed -E 's/ \([^)]*\)//' | awk '{gsub(/^ +| +$/, ""); print}'
)
jq -c --argjson built "$(printf '%s\n' "${built[@]}" | jq -R . | jq -s .)" \
'map(select(.display_name as $n | ($built | index($n | gsub("^ +| +$"; "")) | not)))' matrix.json > retry_matrix.json
echo "retry_matrix=$(cat retry_matrix.json)" >> $GITHUB_OUTPUT

retry_get_and_build:
needs: [setup, get_and_build, retry_failed_models]
if: ${{ needs.get_and_build.result == 'failure' || (needs.retry_failed_models.outputs.retry_matrix != '[]' && needs.retry_failed_models.outputs.retry_matrix != '') }}
strategy:
matrix:
model: ${{ fromJson(needs.retry_failed_models.outputs.retry_matrix) }}
fail-fast: false
uses: ./.github/workflows/build-single-tinygrad-model.yaml
with:
upstream_branch: ${{ matrix.model.ref }}
custom_name: ${{ matrix.model.display_name }}
recompiled_dir: ${{ needs.setup.outputs.recompiled_dir }}
json_version: ${{ needs.setup.outputs.json_version }}
artifact_suffix: -retry
secrets: inherit

publish_models:
name: Publish models sequentially
needs: [setup, get_and_build, retry_failed_models, retry_get_and_build]
if: ${{ !cancelled() && (needs.get_and_build.result != 'failure' || needs.retry_get_and_build.result == 'success' || (needs.retry_failed_models.outputs.retry_matrix != '[]' && needs.retry_failed_models.outputs.retry_matrix != '')) }}
runs-on: ubuntu-latest
strategy:
max-parallel: 1
matrix:
model: ${{ fromJson(needs.setup.outputs.model_matrix) }}
env:
JSON_FILE: docs/docs/${{ needs.setup.outputs.json_file }}
RECOMPILED_DIR: recompiled${{ needs.setup.outputs.recompiled_dir }}
JSON_FILE: ${{ needs.setup.outputs.json_file }}
ARTIFACT_NAME_INPUT: ${{ matrix.model.display_name }}
steps:
- name: Set up SSH
uses: webfactory/ssh-agent@v0.9.0
@@ -62,138 +176,73 @@ jobs:
echo "Cloning GitLab"
git clone --depth 1 --filter=tree:0 --sparse git@gitlab.com:sunnypilot/public/docs.sunnypilot.ai2.git gitlab_docs
cd gitlab_docs
echo "checkout models/${RECOMPILED_DIR}"
git sparse-checkout set --no-cone models/${RECOMPILED_DIR}
git checkout main
cd ..

- name: Set next recompiled dir
id: set-recompiled
run: |
cd gitlab_docs/models
latest_dir=$(ls -d recompiled* 2>/dev/null | sed -E 's/recompiled([0-9]+)/\1/' | sort -n | tail -1)
if [[ -z "$latest_dir" ]]; then
next_dir=1
else
next_dir=$((latest_dir+1))
fi
recompiled_dir="recompiled${next_dir}"
mkdir -p "$recompiled_dir"
echo "RECOMPILED_DIR=$recompiled_dir" >> $GITHUB_ENV

- name: Download context
uses: actions/download-artifact@v4
- name: Checkout docs repo
uses: actions/checkout@v4
with:
name: context
path: .
repository: sunnypilot/sunnypilot-docs
ref: gh-pages
path: docs
ssh-key: ${{ secrets.CI_SUNNYPILOT_DOCS_PRIVATE_KEY }}

- name: Install dependencies
- name: Validate recompiled dir and JSON version
run: |
sudo apt-get update
sudo apt-get install -y jq gh

- name: Build all tinygrad models
id: trigger-builds
env:
GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
run: |
set -e
> triggered_run_ids.txt
BRANCH="${{ github.event.inputs.branch }}"
jq -c '.bundles[] | select(.runner=="tinygrad")' "$JSON_FILE" | while read -r bundle; do
ref=$(echo "$bundle" | jq -r '.ref')
display_name=$(echo "$bundle" | jq -r '.display_name' | sed 's/ ([^)]*)//g')
is_20hz=$(echo "$bundle" | jq -r '.is_20hz')
echo "Triggering build for: $display_name ($ref) [20Hz: $is_20hz]"
START_TIME=$(date -u +"%Y-%m-%dT%H:%M:%SZ")
gh workflow run sunnypilot-build-model.yaml \
--repo sunnypilot/sunnypilot \
--ref "$BRANCH" \
-f upstream_branch="$ref" \
-f custom_name="$display_name" \
-f is_20hz="$is_20hz"
for i in {1..24}; do
RUN_ID=$(gh run list --repo sunnypilot/sunnypilot --workflow=sunnypilot-build-model.yaml --branch="$BRANCH" --created ">$START_TIME" --limit=1 --json databaseId --jq '.[0].databaseId')
if [ -n "$RUN_ID" ]; then
break
fi
sleep 5
done
if [ -z "$RUN_ID" ]; then
echo "ould not find the triggered workflow run for $display_name ($ref)"
exit 1
fi
echo "$RUN_ID" >> triggered_run_ids.txt
done

- name: Wait for all model builds to finish
env:
GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
run: |
set -e
SUCCESS_RUNS=()
FAILED_RUNS=()
declare -A RUN_ID_TO_NAME

while read -r RUN_ID; do
ARTIFACT_NAME=$(gh api repos/sunnypilot/sunnypilot/actions/runs/$RUN_ID/artifacts --jq '.artifacts[] | select(.name | startswith("model-")) | .name' || echo "unknown")
RUN_ID_TO_NAME["$RUN_ID"]="$ARTIFACT_NAME"
done < triggered_run_ids.txt

while read -r RUN_ID; do
echo "Watching run ID: $RUN_ID"
gh run watch "$RUN_ID" --repo sunnypilot/sunnypilot
CONCLUSION=$(gh run view "$RUN_ID" --repo sunnypilot/sunnypilot --json conclusion --jq '.conclusion')
ARTIFACT_NAME="${RUN_ID_TO_NAME[$RUN_ID]}"
echo "Run $RUN_ID ($ARTIFACT_NAME) concluded with: $CONCLUSION"
if [[ "$CONCLUSION" == "success" ]]; then
SUCCESS_RUNS+=("$RUN_ID")
else
FAILED_RUNS+=("$RUN_ID")
fi
done < triggered_run_ids.txt

if [[ ${#SUCCESS_RUNS[@]} -eq 0 ]]; then
echo "All model builds failed. Aborting."
if [ ! -d "gitlab_docs/models/$RECOMPILED_DIR" ]; then
echo "Recompiled dir $RECOMPILED_DIR does not exist in GitLab repo"
exit 1
fi
if [ ! -f "$JSON_FILE" ]; then
echo "JSON file $JSON_FILE does not exist!"
exit 1
fi

if [[ ${#FAILED_RUNS[@]} -gt 0 ]]; then
echo "WARNING: The following model builds failed:"
for RUN_ID in "${FAILED_RUNS[@]}"; do
echo "- $RUN_ID (${RUN_ID_TO_NAME[$RUN_ID]})"
done
echo "You may want to rerun these models manually."
fi
- name: Download artifact name file
uses: actions/download-artifact@v4
with:
name: artifact-name-${{ env.ARTIFACT_NAME_INPUT }}
path: artifact_name

- name: Download and extract all model artifacts
env:
GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
- name: Read artifact name
id: read-artifact-name
run: |
ARTIFACT_DIR="gitlab_docs/models/$RECOMPILED_DIR"
SUCCESS_RUNS=()
while read -r RUN_ID; do
CONCLUSION=$(gh run view "$RUN_ID" --repo sunnypilot/sunnypilot --json conclusion --jq '.conclusion')
if [[ "$CONCLUSION" == "success" ]]; then
SUCCESS_RUNS+=("$RUN_ID")
fi
done < triggered_run_ids.txt
ARTIFACT_NAME=$(cat artifact_name/artifact_name.txt)
echo "artifact_name=$ARTIFACT_NAME" >> $GITHUB_OUTPUT

for RUN_ID in "${SUCCESS_RUNS[@]}"; do
ARTIFACT_NAME=$(gh api repos/sunnypilot/sunnypilot/actions/runs/$RUN_ID/artifacts --jq '.artifacts[] | select(.name | startswith("model-")) | .name')
echo "Downloading artifact: $ARTIFACT_NAME from run: $RUN_ID"
mkdir -p "$ARTIFACT_DIR/$ARTIFACT_NAME"
echo "Created directory: $ARTIFACT_DIR/$ARTIFACT_NAME"
gh run download "$RUN_ID" --repo sunnypilot/sunnypilot -n "$ARTIFACT_NAME" --dir "$ARTIFACT_DIR/$ARTIFACT_NAME"
echo "Downloaded artifact zip(s) to: $ARTIFACT_DIR/$ARTIFACT_NAME"
ZIP_PATH=$(find "$ARTIFACT_DIR/$ARTIFACT_NAME" -type f -name '*.zip' | head -n1)
if [ -n "$ZIP_PATH" ]; then
echo "Unzipping $ZIP_PATH to $ARTIFACT_DIR/$ARTIFACT_NAME"
unzip -o "$ZIP_PATH" -d "$ARTIFACT_DIR/$ARTIFACT_NAME"
rm -f "$ZIP_PATH"
echo "Unzipped and removed $ZIP_PATH"
else
echo "No zip file found in $ARTIFACT_DIR/$ARTIFACT_NAME (This is NOT an error)."
- name: Download model artifact
uses: actions/download-artifact@v4
with:
name: ${{ steps.read-artifact-name.outputs.artifact_name }}
path: output

- name: Remove onnx files bc not needed for recompiled dir since they already exist from single build
run: |
find output -type f -name '*.onnx' -delete
find output -type f -name 'big_*.pkl' -delete
find output -type f -name 'dmonitoring_model_tinygrad.pkl' -delete

- name: Copy model artifacts to gitlab
env:
ARTIFACT_NAME: ${{ steps.read-artifact-name.outputs.artifact_name }}
run: |
ARTIFACT_DIR="gitlab_docs/models/${RECOMPILED_DIR}/${ARTIFACT_NAME}"
mkdir -p "$ARTIFACT_DIR"
for path in output/*; do
if [ "$(basename "$path")" = "artifact_name.txt" ]; then
continue
fi
name="$(basename "$path")"
if [ -d "$path" ]; then
mkdir -p "$ARTIFACT_DIR/$name"
cp -r "$path"/* "$ARTIFACT_DIR/$name/"
echo "Copied dir $name -> $ARTIFACT_DIR/$name"
else
cp "$path" "$ARTIFACT_DIR/"
echo "Copied file $name -> $ARTIFACT_DIR/"
fi
echo "Done processing $ARTIFACT_NAME"
done

- name: Push recompiled dir to GitLab
@@ -202,25 +251,35 @@ jobs:
run: |
cd gitlab_docs
git checkout main
mkdir -p models/"$(basename $RECOMPILED_DIR)"
git add models/"$(basename $RECOMPILED_DIR)"
git pull origin main
for d in models/"$RECOMPILED_DIR"/*/; do
git sparse-checkout add "$d"
done
git add models/"$RECOMPILED_DIR"
git config --global user.name "GitHub Action"
git config --global user.email "action@github.com"
git commit -m "Add $(basename $RECOMPILED_DIR) from build-all-tinygrad-models"
git commit -m "Update $RECOMPILED_DIR with model from build-all-tinygrad-models" || echo "No changes to commit"
git push origin main
- run: |
cd docs
git pull origin gh-pages

- name: Run json_parser.py to update JSON
- name: update json
run: |
python3 docs/json_parser.py \
ARGS=""
[ -n "${{ inputs.set_min_version }}" ] && ARGS="$ARGS --set-min-version \"${{ inputs.set_min_version }}\""
ARGS="$ARGS --sort-by-date"
eval python3 docs/json_parser.py \
--json-path "$JSON_FILE" \
--recompiled-dir "gitlab_docs/models/$RECOMPILED_DIR"
--recompiled-dir "gitlab_docs/models/$RECOMPILED_DIR" \
$ARGS

- name: Push updated JSON to GitHub docs repo
- name: Push updated json to GitHub
run: |
cd docs
git config --global user.name "GitHub Action"
git config --global user.email "action@github.com"
git checkout gh-pages
git add docs/"$(basename $JSON_FILE)"
git commit -m "Update $(basename $JSON_FILE) after recompiling models" || echo "No changes to commit"
git commit -m "Update $(basename $JSON_FILE) after recompiling model" || echo "No changes to commit"
git push origin gh-pages
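For reference, the retry_failed_models job above derives its retry matrix by comparing the uploaded model-* artifact directories against the full model matrix. A rough Python equivalent of that sed/awk/jq pipeline is sketched below for illustration only; the function and variable names here are made up, and the workflow itself does this in bash.

import re

def retry_matrix(model_matrix: list[dict], artifact_dirs: list[str]) -> list[dict]:
    """Return the bundles whose model artifact was never uploaded.

    artifact_dirs are names like "model-Tomb Raider 16v2 (July 21, 2025)-544";
    the workflow strips the "model-" prefix, the trailing run number and the
    parenthesised date before comparing against display_name.
    """
    built = set()
    for name in artifact_dirs:
        name = re.sub(r"^model-", "", name)
        name = re.sub(r"-[0-9]+$", "", name)
        name = re.sub(r" \([^)]*\)", "", name).strip()
        built.add(name)
    return [m for m in model_matrix if m["display_name"].strip() not in built]

# Mirrors the "Test case from terminal run" in the commit message: every matrix
# entry already has an artifact, so the retry matrix comes back empty.
matrix = [{"ref": "abc123", "display_name": "Tomb Raider 16v2", "is_20hz": True}]
dirs = ["model-Tomb Raider 16v2 (July 21, 2025)-544"]
print(retry_matrix(matrix, dirs))  # -> []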
174 changed lines in .github/workflows/build-single-tinygrad-model.yaml (vendored)
@@ -1,13 +1,36 @@
name: Build Single Tinygrad Model and Push

on:
workflow_call:
inputs:
upstream_branch:
description: 'Upstream commit to build from'
required: true
type: string
custom_name:
description: 'Custom name for the model (no date, only name)'
required: false
type: string
recompiled_dir:
description: 'Existing recompiled directory number (e.g. 3 for recompiled3)'
required: true
type: string
json_version:
description: 'driving_models version number to update (e.g. 5 for driving_models_v5.json)'
required: true
type: string
artifact_suffix:
description: 'Suffix for artifact name'
required: false
type: string
default: ''
bypass_push:
description: 'Bypass pushing to GitLab for build-all'
required: false
default: true
type: boolean
workflow_dispatch:
inputs:
build_model_ref:
description: 'Branch to use for build-model workflow'
required: false
default: 'master-new'
type: string
upstream_branch:
description: 'Upstream commit to build from'
required: true
@@ -26,10 +49,12 @@ on:
type: number
model_folder:
description: 'Model folder'
required: true
type: choice
default: 'None'
options:
- None
- Simple Plan Models
- Space Lab Models
- TR Models
- DTR Models
- Custom Merge Models
@@ -47,13 +72,27 @@ on:
description: 'Minimum selector version'
required: false
type: number
env:
RECOMPILED_DIR: recompiled${{ inputs.recompiled_dir }}
JSON_FILE: docs/docs/driving_models_v${{ inputs.json_version }}.json

jobs:
build-single:
build_model:
uses: ./.github/workflows/sunnypilot-build-model.yaml
with:
upstream_branch: ${{ inputs.upstream_branch }}
custom_name: ${{ inputs.custom_name || inputs.upstream_branch }}
is_20hz: true
artifact_suffix: ${{ inputs.artifact_suffix }}
secrets: inherit

publish_model:
if: ${{ !inputs.bypass_push && !cancelled() }}
concurrency:
group: gitlab-push-${{ inputs.recompiled_dir }}
cancel-in-progress: false
needs: build_model
runs-on: ubuntu-latest
env:
RECOMPILED_DIR: recompiled${{ github.event.inputs.recompiled_dir }}
JSON_FILE: docs/docs/driving_models_v${{ github.event.inputs.json_version }}.json
steps:
- name: Set up SSH
uses: webfactory/ssh-agent@v0.9.0
@@ -96,66 +135,49 @@ jobs:
exit 1
fi

- name: Install dependencies
- name: Download artifact name file
uses: actions/download-artifact@v4
with:
name: artifact-name-${{ inputs.custom_name || inputs.upstream_branch }}
path: artifact_name

- name: Read artifact name
id: read-artifact-name
run: |
sudo apt-get update
sudo apt-get install -y jq gh

- name: Build model
env:
GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
run: |
START_TIME=$(date -u +"%Y-%m-%dT%H:%M:%SZ")
gh workflow run sunnypilot-build-model.yaml \
--repo sunnypilot/sunnypilot \
--ref "${{ github.event.inputs.build_model_ref }}" \
-f upstream_branch="${{ github.event.inputs.upstream_branch }}" \
-f custom_name="${{ github.event.inputs.custom_name }}"

for i in {1..24}; do
RUN_ID=$(gh run list --repo sunnypilot/sunnypilot --workflow=sunnypilot-build-model.yaml --branch="${{ github.event.inputs.build_model_ref }}" --created ">$START_TIME" --limit=1 --json databaseId --jq '.[0].databaseId')
if [ -n "$RUN_ID" ]; then
break
fi
sleep 5
done

if [ -z "$RUN_ID" ]; then
echo "Could not find the triggered workflow run."
exit 1
fi

echo "Watching run ID: $RUN_ID"
gh run watch "$RUN_ID" --repo sunnypilot/sunnypilot
CONCLUSION=$(gh run view "$RUN_ID" --repo sunnypilot/sunnypilot --json conclusion --jq '.conclusion')
echo "Run concluded with: $CONCLUSION"
if [[ "$CONCLUSION" != "success" ]]; then
echo "Workflow run failed with conclusion: $CONCLUSION"
exit 1
fi
ARTIFACT_NAME=$(cat artifact_name/artifact_name.txt)
echo "artifact_name=$ARTIFACT_NAME" >> $GITHUB_OUTPUT

- name: Download and extract model artifact
env:
GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
uses: actions/download-artifact@v4
with:
name: ${{ steps.read-artifact-name.outputs.artifact_name }}
path: output

- name: Remove unwanted files
run: |
ARTIFACT_DIR="gitlab_docs/models/$RECOMPILED_DIR"
RUN_ID=$(gh run list --repo sunnypilot/sunnypilot --workflow=sunnypilot-build-model.yaml --branch="${{ github.event.inputs.build_model_ref }}" --limit=1 --json databaseId --jq '.[0].databaseId')
ARTIFACT_NAME=$(gh api repos/sunnypilot/sunnypilot/actions/runs/$RUN_ID/artifacts --jq '.artifacts[] | select(.name | startswith("model-")) | .name')
echo "Downloading artifact: $ARTIFACT_NAME from run: $RUN_ID"
mkdir -p "$ARTIFACT_DIR/$ARTIFACT_NAME"
echo "Created directory: $ARTIFACT_DIR/$ARTIFACT_NAME"
gh run download "$RUN_ID" --repo sunnypilot/sunnypilot -n "$ARTIFACT_NAME" --dir "$ARTIFACT_DIR/$ARTIFACT_NAME"
echo "Downloaded artifact zip(s) to: $ARTIFACT_DIR/$ARTIFACT_NAME"
ZIP_PATH=$(find "$ARTIFACT_DIR/$ARTIFACT_NAME" -type f -name '*.zip' | head -n1)
if [ -n "$ZIP_PATH" ]; then
echo "Unzipping $ZIP_PATH to $ARTIFACT_DIR/$ARTIFACT_NAME"
unzip -o "$ZIP_PATH" -d "$ARTIFACT_DIR/$ARTIFACT_NAME"
rm -f "$ZIP_PATH"
echo "Unzipped and removed $ZIP_PATH"
else
echo "No zip file found in $ARTIFACT_DIR/$ARTIFACT_NAME"
fi
echo "Done processing $ARTIFACT_NAME"
find output -type f -name 'dmonitoring_model_tinygrad.pkl' -delete
find output -type f -name 'dmonitoring_model.onnx' -delete

- name: Copy model artifact(s) to GitLab recompiled dir
env:
ARTIFACT_NAME: ${{ steps.read-artifact-name.outputs.artifact_name }}
run: |
ARTIFACT_DIR="gitlab_docs/models/${RECOMPILED_DIR}/${ARTIFACT_NAME}"
mkdir -p "$ARTIFACT_DIR"
for path in output/*; do
if [ "$(basename "$path")" = "artifact_name.txt" ]; then
continue
fi
name="$(basename "$path")"
if [ -d "$path" ]; then
mkdir -p "$ARTIFACT_DIR/$name"
cp -r "$path"/* "$ARTIFACT_DIR/$name/"
echo "Copied dir $name -> $ARTIFACT_DIR/$name"
else
cp "$path" "$ARTIFACT_DIR/"
echo "Copied file $name -> $ARTIFACT_DIR/"
fi
done

- name: Push recompiled dir to GitLab
env:
@@ -163,28 +185,36 @@
run: |
cd gitlab_docs
git checkout main
git pull origin main
for d in models/"$RECOMPILED_DIR"/*/; do
git sparse-checkout add "$d"
done
git add models/"$RECOMPILED_DIR"
git config --global user.name "GitHub Action"
git config --global user.email "action@github.com"
git commit -m "Update $RECOMPILED_DIR with new/updated model from build-single-tinygrad-model" || echo "No changes to commit"
git commit -m "Create/Update $RECOMPILED_DIR with new/updated model from build-single-tinygrad-model" || echo "No changes to commit"
git push origin main

- run: |
cd docs
git pull origin gh-pages

- name: Run json_parser.py to update JSON
run: |
FOLDER="${{ github.event.inputs.model_folder }}"
FOLDER="${{ inputs.model_folder }}"
if [ "$FOLDER" = "Other" ]; then
FOLDER="${{ github.event.inputs.custom_model_folder }}"
FOLDER="${{ inputs.custom_model_folder }}"
fi
ARGS=""
[ -n "$FOLDER" ] && ARGS="$ARGS --model-folder \"$FOLDER\""
[ -n "${{ github.event.inputs.generation }}" ] && ARGS="$ARGS --generation \"${{ github.event.inputs.generation }}\""
[ -n "${{ github.event.inputs.version }}" ] && ARGS="$ARGS --version \"${{ github.event.inputs.version }}\""
if [ "$FOLDER" != "None" ] && [ -n "$FOLDER" ]; then
ARGS="$ARGS --model-folder \"$FOLDER\""
fi
[ -n "${{ inputs.generation }}" ] && ARGS="$ARGS --generation \"${{ inputs.generation }}\""
[ -n "${{ inputs.version }}" ] && ARGS="$ARGS --version \"${{ inputs.version }}\""
eval python3 docs/json_parser.py \
--json-path "$JSON_FILE" \
--recompiled-dir "gitlab_docs/models/$RECOMPILED_DIR" \
--sort-by-date \
$ARGS

- name: Push updated JSON to GitHub docs repo
70 changed lines in .github/workflows/sunnypilot-build-model.yaml (vendored)
@@ -9,6 +9,27 @@ env:
MODELS_DIR: ${{ github.workspace }}/selfdrive/modeld/models

on:
workflow_call:
inputs:
upstream_branch:
description: 'Upstream branch to build from'
required: true
default: 'master'
type: string
custom_name:
description: 'Custom name for the model (no date, only name)'
required: false
type: string
is_20hz:
description: 'Is this a 20Hz model'
required: false
type: boolean
default: true
artifact_suffix:
description: 'Suffix for artifact name'
required: false
type: string
default: ''
workflow_dispatch:
inputs:
upstream_branch:
@@ -32,34 +53,53 @@ run-name: Build model [${{ inputs.custom_name || inputs.upstream_branch }}] from
jobs:
get_model:
runs-on: ubuntu-latest
env:
REF: ${{ inputs.upstream_branch }}
outputs:
model_date: ${{ steps.commit-date.outputs.model_date }}
steps:
- uses: actions/checkout@v4
# Note: To allow dynamic models from both openpilot and sunnypilot (merges/mashups), we try commaai as default,
# and fallback to sunnypilot if the ref checkout fails.
- name: Checkout commaai/openpilot
id: checkout_upstream
continue-on-error: true
uses: actions/checkout@v4
with:
repository: ${{ env.UPSTREAM_REPO }}
ref: ${{ github.event.inputs.upstream_branch }}
repository: commaai/openpilot
ref: ${{ inputs.upstream_branch }}
submodules: recursive
path: openpilot

- name: Fallback to sunnypilot/sunnypilot
if: steps.checkout_upstream.outcome == 'failure'
uses: actions/checkout@v4
with:
repository: sunnypilot/sunnypilot
ref: ${{ inputs.upstream_branch }}
submodules: recursive
path: openpilot
- name: Get commit date
id: commit-date
run: |
# Get the commit date in YYYY-MM-DD format
cd ${{ github.workspace }}/openpilot
commit_date=$(git log -1 --format=%cd --date=format:'%B %d, %Y')
echo "model_date=${commit_date}" >> $GITHUB_OUTPUT
cat $GITHUB_OUTPUT
- run: git lfs pull
- run: |
cd ${{ github.workspace }}/openpilot
git lfs pull
- name: 'Upload Artifact'
uses: actions/upload-artifact@v4
with:
name: models
path: ${{ github.workspace }}/selfdrive/modeld/models/*.onnx
name: models-${{ env.REF }}${{ inputs.artifact_suffix }}
path: ${{ github.workspace }}/openpilot/selfdrive/modeld/models/*.onnx

build_model:
runs-on: [self-hosted, tici]
needs: get_model
env:
MODEL_NAME: ${{ inputs.custom_name || inputs.upstream_branch }} (${{ needs.get_model.outputs.model_date }})

REF: ${{ inputs.upstream_branch }}
steps:
- uses: actions/checkout@v4
with:
@@ -114,7 +154,7 @@ jobs:
- name: Download model artifacts
uses: actions/download-artifact@v4
with:
name: models
name: models-${{ env.REF }}${{ inputs.artifact_suffix }}
path: ${{ github.workspace }}/selfdrive/modeld/models

- name: Build Model
@@ -157,12 +197,22 @@ jobs:
--upstream-branch "${{ inputs.upstream_branch }}" \
${{ inputs.is_20hz && '--is-20hz' || '' }}

- name: Write artifact name to file
run: echo "model-${{ env.MODEL_NAME }}${{ inputs.artifact_suffix }}-${{ github.run_number }}" > ${{ env.OUTPUT_DIR }}/artifact_name.txt

- name: Upload Build Artifacts
id: upload-artifact
uses: actions/upload-artifact@v4
with:
name: model-${{ env.MODEL_NAME }}-${{ github.run_number }}
name: model-${{ env.MODEL_NAME }}${{ inputs.artifact_suffix }}-${{ github.run_number }}
path: ${{ env.OUTPUT_DIR }}

- name: Upload artifact name file
uses: actions/upload-artifact@v4
with:
name: artifact-name-${{ inputs.custom_name || inputs.upstream_branch }}
path: ${{ env.OUTPUT_DIR }}/artifact_name.txt

- name: Re-enable powersave
if: always()
run: |