Mirror of https://github.com/sunnypilot/sunnypilot.git, synced 2026-02-18 20:03:53 +08:00
ci: Add branch reset workflow and improve squash script (#579)
* Add nightly branch reset workflow and improve squash script: introduced a GitHub Actions workflow to reset and squash PRs nightly for the `master-dev-c3-new-test` branch. Enhanced `squash_and_merge.py` to return more specific exit codes and `squash_and_merge_prs.py` to streamline PR processing. Updated argument handling in the scripts and added validation of the squash script before execution.
* Forcing to show up
* UnForcing to show up
* Refactor branch handling to use inputs/environment variables directly: removed the intermediate step for setting branch variables and updated the logic to read `inputs` or environment variables directly, simplifying the workflow and reducing reliance on unnecessary outputs.
* Fix Python script invocation in CI workflow: replaced implicit script execution with an explicit `python3` command for consistency across shell defaults and system configurations.
* Update branch reset logic in workflow script: replaced the checkout-based branch reset with deletion and recreation so the target branch always points at the source branch, including when the target branch already exists.
* Refactor PR data handling to parse JSON input: the script now parses and handles PR data as JSON, and functions operate on the parsed structure instead of raw arguments for clearer error handling. A minimal payload sketch follows this list.
* Refactor PR processing to streamline branch handling: removed the redundant `fetch_pr_branches` function and integrated branch fetching directly into `process_pr`; simplified subprocess calls and added branch cleanup to prevent conflicts.
* Add PR number to squash and merge commit titles: appends the PR number to the commit title so squashed commits are easy to trace back to their pull requests.
* Enhance PR processing with sorting and additional checks: PRs are sorted by creation date and checked for merge conflicts, commit data availability, and status check completion; the nightly squash script now leaves PR comments explaining why a PR was skipped.
* Add traceback logging to error handling in squash_and_merge_prs: exceptions in `process_pr` now print the full traceback for easier debugging.
* Refactor and add debug output to PR merge check logic: simplified multi-line statements into single lines and added debug `print` statements for `merge_status` and its output.
* Add GITHUB_TOKEN to environment for CI workflow: ensures the workflow can authenticate against the GitHub API; without it, some steps could fail due to missing authorization.
* Enable scheduled workflow execution at midnight UTC: reactivates the cron schedule so the workflow runs daily without manual triggers.
* test
* Update workflow to trigger and monitor selfdrive tests: replaced the direct trigger of the prebuilt workflow with a step that triggers the selfdrive tests and waits for them, so the prebuilt workflow only runs if the selfdrive tests succeed.
* Refine workflow trigger to fetch run URL and ID: the workflow script now extracts the run URL and derives the workflow run ID from it for more accurate tracking of GitHub Actions runs.
* Simplify selfdrive test triggering in workflow: replaced the custom script with a reusable GitHub Action to trigger and wait for selfdrive test completion, and adjusted the subsequent prebuilt workflow trigger for compatibility.
* Remove duplicate prebuilt workflow trigger step: dropped the redundant step that triggered the sunnypilot prebuilt workflow a second time.
* Update selfdrive test trigger to use GitHub CLI commands: replaced the third-party action with the GitHub CLI for triggering and monitoring selfdrive tests, reducing reliance on external dependencies.
* Add delay to ensure selfdrive tests workflow starts: sleeps 120 seconds before fetching the latest run ID so the selfdrive tests run has time to appear, preventing the run ID from being retrieved too early.
* Improve push step to check for diffs before execution: the push step now compares local and remote branches and skips the push, and the downstream workflows, when there are no changes.
* Update target branch and improve squash comment clarity: renamed the default target branch from `master-dev-c3-new-test` to `nightly` and made the squash PR comments reference the `target_branch` value dynamically.
* Add missing newline at end of file: ensures the file ends with a newline for POSIX compliance.
* Update default target branch and disable nightly schedule: changed the default target branch to `master-dev-c3-new` and commented out the nightly schedule to temporarily disable automated runs.
* Refactored squash and merge scripts for improved PR handling: added a `source-branch` argument to the argument parser, switched the merge command from `merge` to `rebase`, and split PR validation into its own function that checks branch name, commit data, check status, and mergeability; any failed check skips the PR with a warning. The target branch is deleted if it exists before being recreated from the source branch, and the squash script is now invoked with structured arguments. These changes make PR handling in the CI/CD pipeline more efficient and error-resistant.
* Add 'PullRequest' to .codespellignore: prevents it from being flagged as a spelling error, reducing false positives during spell checks.
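For orientation, a minimal sketch of the --pr-data JSON that the "Get PRs to squash" step produces and squash_and_merge_prs.py consumes, together with the oldest-first ordering the script applies. All values here are placeholders; only the field names mirror the GraphQL query in the workflow below.

import json
from datetime import datetime

pr_data = {
  "data": {
    "search": {
      "nodes": [
        {
          "number": 123,                    # placeholder PR number
          "headRefName": "example-branch",  # placeholder head branch
          "title": "Example change",
          "createdAt": "2025-01-01T00:00:00Z",
          "commits": {"nodes": [{"commit": {"statusCheckRollup": {"state": "SUCCESS"}}}]},
        },
      ],
    },
  },
}

# The orchestration script sorts these nodes oldest-first before squashing them in order.
nodes = sorted(
  pr_data["data"]["search"]["nodes"],
  key=lambda x: datetime.fromisoformat(x["createdAt"].replace("Z", "+00:00")),
)
print(json.dumps(nodes, indent=2))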
.codespellignore
@@ -1,2 +1,3 @@
 Wen
 REGIST
+PullRequest
162  .github/workflows/sunnypilot-master-dev-c3-prep.yaml  (vendored, new file)
@@ -0,0 +1,162 @@
name: Nightly Branch Reset and PR Squash

env:
  DEFAULT_SOURCE_BRANCH: "master-new"
  DEFAULT_TARGET_BRANCH: "nightly"
  PR_LABEL: "dev-c3"

on:
  workflow_dispatch:
    inputs:
      source_branch:
        description: 'Source branch to reset from'
        required: true
        default: 'master-new'
        type: string
      target_branch:
        description: 'Target branch to reset and squash into'
        required: true
        default: 'master-dev-c3-new'
        type: string
  # schedule:
  #   - cron: '0 0 * * *' # Run at midnight UTC for nightly

jobs:
  reset-and-squash:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
        with:
          fetch-depth: 0 # Fetch all history for all branches
          token: ${{ secrets.GITHUB_TOKEN }}

      - name: Configure Git
        run: |
          git config --global user.name 'github-actions[bot]'
          git config --global user.email 'github-actions[bot]@users.noreply.github.com'

      - name: Set up Python
        uses: actions/setup-python@v5
        with:
          python-version: '3.10'

      - name: Install dependencies
        run: |
          python -m pip install --upgrade pip
          pip install PyGithub

      - name: Check branches exist
        run: |
          # Check if source branch exists
          if ! git ls-remote --heads origin ${{ inputs.source_branch || env.DEFAULT_SOURCE_BRANCH }} | grep -q "${{ inputs.source_branch || env.DEFAULT_SOURCE_BRANCH }}"; then
            echo "Source branch ${{ inputs.source_branch || env.DEFAULT_SOURCE_BRANCH }} does not exist!"
            exit 1
          fi

          # Make sure we have the latest source branch
          git fetch origin ${{ inputs.source_branch || env.DEFAULT_SOURCE_BRANCH }}

          # Check if target branch exists
          if ! git ls-remote --heads origin ${{ inputs.target_branch || env.DEFAULT_TARGET_BRANCH }} | grep -q "${{ inputs.target_branch || env.DEFAULT_TARGET_BRANCH }}"; then
            echo "Target branch ${{ inputs.target_branch || env.DEFAULT_TARGET_BRANCH }} does not exist, creating it from ${{ inputs.source_branch || env.DEFAULT_SOURCE_BRANCH }}"
            git checkout -b ${{ inputs.target_branch || env.DEFAULT_TARGET_BRANCH }} origin/${{ inputs.source_branch || env.DEFAULT_SOURCE_BRANCH }}
            git push origin ${{ inputs.target_branch || env.DEFAULT_TARGET_BRANCH }}
          else
            # Fetch target branch if it exists
            git fetch origin ${{ inputs.target_branch || env.DEFAULT_TARGET_BRANCH }}
          fi

      - name: Reset target branch
        run: |
          echo "Resetting ${{ inputs.target_branch || env.DEFAULT_TARGET_BRANCH }} to match ${{ inputs.source_branch || env.DEFAULT_SOURCE_BRANCH }}"
          # Delete if exists and recreate pointing to source
          git branch -D ${{ inputs.target_branch || env.DEFAULT_TARGET_BRANCH }} || true
          git branch ${{ inputs.target_branch || env.DEFAULT_TARGET_BRANCH }} origin/${{ inputs.source_branch || env.DEFAULT_SOURCE_BRANCH }}

      - name: Get PRs to squash
        id: get-prs
        run: |
          # Use GitHub API to get PRs with specific label, ordered by creation date
          PR_LIST=$(gh api graphql -f query='
            query($label:String!) {
              search(query: $label, type:ISSUE, first:100) {
                nodes {
                  ... on PullRequest {
                    number
                    headRefName
                    title
                    createdAt
                    commits(last: 1) {
                      nodes {
                        commit {
                          statusCheckRollup {
                            state
                          }
                        }
                      }
                    }
                  }
                }
              }
            }' -F label="is:pr is:open label:${PR_LABEL} sort:created-asc")

          echo "PR_LIST=${PR_LIST}" >> $GITHUB_OUTPUT
        env:
          GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}

      - name: Process PRs
        run: |
          python3 ${{ github.workspace }}/release/ci/squash_and_merge_prs.py \
            --pr-data '${{ steps.get-prs.outputs.PR_LIST }}' \
            --target-branch ${{ inputs.target_branch || env.DEFAULT_TARGET_BRANCH }} \
            --squash-script-path '${{ github.workspace }}/release/ci/squash_and_merge.py'
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}

      - name: Push changes if there are diffs
        id: push-changes # Add an id so we can reference this step
        run: |
          TARGET_BRANCH="${{ inputs.target_branch || env.DEFAULT_TARGET_BRANCH }}"

          # Fetch the latest from remote
          git fetch origin $TARGET_BRANCH

          # Check for diffs between local and remote
          if git diff $TARGET_BRANCH origin/$TARGET_BRANCH --quiet; then
            echo "No changes to push - local and remote branches are identical"
            echo "has_changes=false" >> $GITHUB_OUTPUT
            exit 0
          fi

          # If we get here, there are diffs, so push
          if ! git push origin $TARGET_BRANCH --force; then
            echo "Failed to push changes to $TARGET_BRANCH"
            exit 1
          fi

          echo "Branch $TARGET_BRANCH has been reset and updated with squashed PRs"
          echo "has_changes=true" >> $GITHUB_OUTPUT

      - name: Trigger and wait for selfdrive tests
        if: steps.push-changes.outputs.has_changes == 'true'
        run: |
          echo "Triggering selfdrive tests..."
          gh workflow run selfdrive_tests.yaml --ref "${{ inputs.target_branch || env.DEFAULT_TARGET_BRANCH }}"

          echo "Sleeping for 120s to give plenty of time for the action to start and then we wait"
          sleep 120

          echo "Getting latest run ID..."
          RUN_ID=$(gh run list --workflow=selfdrive_tests.yaml --branch="${{ inputs.target_branch || env.DEFAULT_TARGET_BRANCH }}" --limit=1 --json databaseId --jq '.[0].databaseId')

          echo "Watching run ID: $RUN_ID"
          gh run watch "$RUN_ID"
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}

      - name: Trigger prebuilt workflow
        if: success() && steps.push-changes.outputs.has_changes == 'true'
        run: |
          gh workflow run sunnypilot-build-prebuilt.yaml --ref "${{ inputs.target_branch || env.DEFAULT_TARGET_BRANCH }}"
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
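The "Get PRs to squash" step above can be previewed locally. A sketch, assuming an authenticated gh CLI and the same dev-c3 label; the query is a trimmed copy of that step's query with the status-check fields dropped, so it only lists which open PRs the nightly job would pick up.

import json
import subprocess

QUERY = '''
query($label:String!) {
  search(query: $label, type:ISSUE, first:100) {
    nodes {
      ... on PullRequest { number headRefName title createdAt }
    }
  }
}
'''

# Mirrors the workflow's gh invocation; requires `gh auth login` beforehand.
result = subprocess.run(
  ['gh', 'api', 'graphql', '-f', f'query={QUERY}',
   '-F', 'label=is:pr is:open label:dev-c3 sort:created-asc'],
  capture_output=True, text=True, check=True,
)
nodes = json.loads(result.stdout)['data']['search']['nodes']
for pr in nodes:
  print(f"#{pr['number']}  {pr['createdAt']}  {pr['headRefName']}  {pr['title']}")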
release/ci/squash_and_merge.py  (modified)
@@ -117,7 +117,7 @@ def workspace_manager(original_branch: str):
       if temp_branch:
         run_command(f"git branch -D {temp_branch}")
       print("\nOperation interrupted, but changes were already restored.")
-      sys.exit(1)
+      sys.exit(3)

     # First, switch back to original branch
     current = get_current_branch()
@@ -139,12 +139,12 @@ def workspace_manager(original_branch: str):

       if signum:
         print("\nOperation interrupted. Cleaned up and restored original state.")
-        sys.exit(1)
+        sys.exit(4)

     except Exception as e:
       print(f"Error during cleanup: {e}")
       if signum:
-        sys.exit(1)
+        sys.exit(5)

   try:
     # Set up signal handlers
@@ -275,7 +275,7 @@ def squash_and_merge(source_branch: str, target_branch: str, manual_title: str |
       return False

   print(f"Attempting to merge changes from {temp_branch}...")
-  code, _, error = run_command(f"git merge {temp_branch}")
+  code, _, error = run_command(f"git rebase {temp_branch}")

   if code != 0:
     print(f"\nMerge failed with error: {error}")
@@ -344,7 +344,7 @@ def main():
   parser.add_argument('--push', action='store_true',
                       help='Push changes to remote after squashing')

-  args = parser.parse_args()
+  args, unknown = parser.parse_known_args()

   # Determine source branch early
   source_branch = args.source
@@ -354,7 +354,7 @@ def main():
     sys.exit(1)

   if not squash_and_merge(source_branch, args.target, args.title, args.backup, args.push):
-    sys.exit(1)
+    sys.exit(2)


 if __name__ == "__main__":
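The hunks above replace several generic sys.exit(1) calls with distinct codes. A hedged sketch of how a caller might branch on them; the meaning of codes 3, 4, and 5 is inferred from the cleanup hunks above rather than a documented contract, and the script path, branch names, and title are placeholders.

import subprocess

# Placeholder invocation; the script must be executable, as process_pr() assumes.
result = subprocess.run(
  ['release/ci/squash_and_merge.py',
   '--source', 'example-pr-branch',
   '--target', 'nightly',
   '--title', 'Example change (#000)'],
  check=False,
)

if result.returncode == 0:
  print("squash and merge succeeded")
elif result.returncode == 1:
  print("interrupted by a signal or failed an earlier check (still exits 1)")
elif result.returncode == 2:
  print("squash_and_merge() reported failure, e.g. the rebase did not apply cleanly")
elif result.returncode in (3, 4, 5):
  print("exited from one of the workspace cleanup paths")
else:
  print(f"unexpected exit code: {result.returncode}")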
163  release/ci/squash_and_merge_prs.py  (new executable file)
@@ -0,0 +1,163 @@
#!/usr/bin/env python3

import subprocess
import sys
import os
import argparse
import json
from datetime import datetime


def setup_argument_parser():
  parser = argparse.ArgumentParser(description='Process and squash GitHub PRs')
  parser.add_argument('--pr-data', type=str, help='PR data in JSON format')
  parser.add_argument('--source-branch', type=str, default='master-new',
                      help='Source branch for merging')
  parser.add_argument('--target-branch', type=str, default='master-dev-c3-new-test',
                      help='Target branch for merging')
  parser.add_argument('--squash-script-path', type=str, required=True,
                      help='Path to the squash_and_merge.py script')
  return parser


def validate_squash_script(script_path):
  if not os.path.isfile(script_path):
    raise FileNotFoundError(f"Squash script not found at: {script_path}")
  if not os.access(script_path, os.X_OK):
    raise PermissionError(f"Squash script is not executable: {script_path}")


def sort_prs_by_creation(pr_data):
  """Sort PRs by creation date"""
  nodes = pr_data.get('data', {}).get('search', {}).get('nodes', [])

  return sorted(
    nodes,
    key=lambda x: datetime.fromisoformat(x.get('createdAt', '').replace('Z', '+00:00'))
  )


def add_pr_comment(pr_number, comment):
  """Add a comment to a PR using gh cli"""
  try:
    subprocess.run(
      ['gh', 'pr', 'comment', str(pr_number), '--body', comment],
      check=True,
      capture_output=True,
      text=True
    )
  except subprocess.CalledProcessError as e:
    print(f"Failed to add comment to PR #{pr_number}: {e.stderr}")


def validate_pr(pr):
  """Validate a PR and return (is_valid, skip_reason)"""
  pr_number = pr.get('number', 'UNKNOWN')
  branch = pr.get('headRefName', '')

  if not branch:
    return False, f"Missing branch name for PR #{pr_number}"

  # Check if checks have passed
  commits = pr.get('commits', {}).get('nodes', [])
  if not commits:
    return False, "No commit data found"

  status = commits[0].get('commit', {}).get('statusCheckRollup', {})
  if not status or status.get('state') != 'SUCCESS':
    return False, "Not all checks have passed"

  # Check for merge conflicts
  merge_status = subprocess.run(['gh', 'pr', 'view', str(pr_number), '--json', 'mergeable,mergeStateStatus'], capture_output=True, text=True)
  merge_data = json.loads(merge_status.stdout)
  if not merge_data.get('mergeable'):
    return False, "Merge conflicts detected"

  if (mergeStateStatus := merge_data.get('mergeStateStatus')) != "CLEAN":
    return False, f"Branch is `{mergeStateStatus}`"

  return True, None


def process_pr(pr_data, source_branch, target_branch, squash_script_path):
  try:
    nodes = sort_prs_by_creation(pr_data)
    if not nodes:
      print("No PRs to squash")
      return 0

    print(f"Deleting target branch {target_branch}")
    subprocess.run(['git', 'branch', '-D', target_branch], check=False)
    subprocess.run(['git', 'branch', target_branch, f'origin/{source_branch}'], check=True)
    success_count = 0
    for pr in nodes:
      pr_number = pr.get('number', 'UNKNOWN')
      branch = pr.get('headRefName', '')
      title = pr.get('title', '')
      is_valid, skip_reason = validate_pr(pr)

      if not is_valid:
        print(f"Warning: {skip_reason} for PR #{pr_number}, skipping")
        add_pr_comment(pr_number, f"⚠️ This PR was skipped in the automated `{target_branch}` squash because {skip_reason}.")
        continue

      try:
        # Fetch PR branch
        subprocess.run(['git', 'fetch', 'origin', branch], check=True)
        # Delete branch if it exists (ignore errors if it doesn't)
        subprocess.run(['git', 'branch', '-D', branch], check=False)
        # Create new branch pointing to origin's branch
        subprocess.run(['git', 'branch', branch, f'origin/{branch}'], check=True)

        # Run squash script
        subprocess.run([
          squash_script_path,
          '--target', target_branch,
          '--source', branch,
          '--title', f"{title} (#{pr_number})",
        ], check=True)

        print(f"Successfully processed PR #{pr_number}")
        success_count += 1

      except subprocess.CalledProcessError as e:
        print(f"Error processing PR #{pr_number}:")
        print(f"Command failed with exit code {e.returncode}")
        error_output = getattr(e, 'stderr', 'No error output available')
        print(f"Error output: {error_output}")
        add_pr_comment(pr_number, f"⚠️ Error during automated {target_branch} squash:\n```\n{error_output}\n```")
        continue
      except Exception as e:
        print(f"Unexpected error processing PR #{pr_number}: {str(e)}")
        continue

    return success_count

  except Exception as e:
    import traceback
    print(f"Error in process_pr: {str(e)}")
    print("Full traceback:")
    print(traceback.format_exc())
    return 0


def main():
  parser = setup_argument_parser()
  try:
    args = parser.parse_args()
    validate_squash_script(args.squash_script_path)
    pr_data_json = json.loads(args.pr_data)

    # Process the PRs
    success_count = process_pr(pr_data_json, args.source_branch, args.target_branch, args.squash_script_path)
    print(f"Successfully processed {success_count} PRs")

  except Exception as e:
    print(f"Fatal error: {str(e)}", file=sys.stderr)
    return 1

  return 0


if __name__ == "__main__":
  sys.exit(main())
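A hedged sketch of exercising validate_pr() offline by stubbing out the gh call, so no GitHub API access is needed. The module path, the sys.path tweak, and the sample values are assumptions made for illustration, not part of the script.

import json
import sys
from types import SimpleNamespace
from unittest import mock

sys.path.insert(0, "release/ci")    # assumed location of the script within the repo
import squash_and_merge_prs as sqp  # importing does not run main() thanks to the __main__ guard

def fake_gh_pr_view(cmd, **kwargs):
  # Pretend `gh pr view` reports a mergeable PR with a clean merge state.
  return SimpleNamespace(stdout=json.dumps({"mergeable": "MERGEABLE", "mergeStateStatus": "CLEAN"}),
                         returncode=0)

sample_pr = {
  "number": 1,                      # placeholder values throughout
  "headRefName": "example-branch",
  "title": "Example",
  "commits": {"nodes": [{"commit": {"statusCheckRollup": {"state": "SUCCESS"}}}]},
}

# Patch subprocess.run for the duration of the block so validate_pr never shells out to gh.
with mock.patch.object(sqp.subprocess, "run", fake_gh_pr_view):
  print(sqp.validate_pr(sample_pr))  # expected: (True, None)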