[CI] Add CI workflow to run compute-benchmarks on incoming syclos PRs #14454
base: sycl
Conversation
```yaml
- name: Upload sycl-bench microbenchmark results
  if: inputs.tests_selector == 'benchmark' && steps.run_benchmarks.outcome == 'success'
  uses: actions/upload-artifact@v4
  with:
    name: sycl_benchmark_res_${{ steps.run_benchmarks.outputs.TIMESTAMP }}
    path: ${{ steps.run_benchmarks.outputs.BENCHMARK_RESULTS }}
    retention-days: 7
```
That isn't very useful on its own. I'd hope there are some GH Actions that can draw graphs of the project's performance data over time (if that data is "reported" to them properly).
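For context, one off-the-shelf option in this space (not something this PR uses) is the community benchmark-action/github-action-benchmark action, which stores results on a gh-pages branch and renders trend charts. A minimal sketch, assuming the benchmark output were first converted into that action's custom JSON format under a hypothetical file name:

```yaml
# Sketch only: assumes results have been converted to the action's
# "customSmallerIsBetter" JSON format; benchmark_results.json is a
# hypothetical file name, not produced by this PR.
- name: Publish benchmark trend charts
  uses: benchmark-action/github-action-benchmark@v1
  with:
    name: compute-benchmarks
    tool: 'customSmallerIsBetter'
    output-file-path: benchmark_results.json
    github-token: ${{ secrets.GITHUB_TOKEN }}
    auto-push: true            # push rendered charts to gh-pages
    alert-threshold: '150%'    # flag runs 1.5x slower than the previous one
    comment-on-alert: true
```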
Thanks for the review, Andrei! Udit and I were talking about graphing this a while back. As of right now this workflow has not yet been approved, as many details regarding it still need to be hashed out; however, we do plan on graphing this data if the workflow is approved.
The current idea is to store/cache benchmark results periodically over time as long-term artifacts, and then graph them using either a Python or JS action, although I have yet to sort out the details. This may end up janky, so I'm open to new ideas; a rough sketch of what it could look like follows.
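A very rough sketch of that store-then-graph idea, assuming a hypothetical plotting script. Note that actions/download-artifact@v4 only sees artifacts from the current run unless it is given a run-id and token, so collecting history across runs would need extra plumbing (enumerating past runs, or a dedicated storage branch):

```yaml
# Rough sketch only. scripts/plot_benchmarks.py and the output paths
# are hypothetical placeholders, not part of this PR. The artifact
# name pattern matches the upload step shown earlier in the diff.
- name: Collect stored benchmark results
  uses: actions/download-artifact@v4
  with:
    pattern: sycl_benchmark_res_*
    merge-multiple: true
    path: history/
- name: Render trend graphs
  run: python3 scripts/plot_benchmarks.py history/ --out graphs/
- name: Upload rendered graphs
  uses: actions/upload-artifact@v4
  with:
    name: benchmark_graphs
    path: graphs/
```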
Yes, we have yet to decide whether we will keep sycl-bench in intel/llvm CI. I expect we will also explore other microbenchmarking suites to find the ones appropriate for our use. This POC PR was just to evaluate the time it takes to run sycl-bench in CI.
This PR:
The current plan is to run this benchmark nightly in order to catch regressions, although there is potential for this workflow to be used in precommit as well. As a result, many components of this workflow are either separate reusable pieces or written directly with precommit in mind; a sketch of how a nightly trigger could reuse them is shown below. The current benchmarking workflow works as follows:
The workflows are fully configurable via benchmark-ci.conf; enabled compute-benchmarks tests can be configured via enabled_tests.conf.
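To illustrate the reusable-component point, a hypothetical nightly trigger could invoke the run-tests workflow via workflow_call. Only the sycl-linux-run-tests.yml file name and the tests_selector input appear in this PR; everything else below is an assumed illustration:

```yaml
# Hypothetical nightly caller; the schedule and job layout are
# illustrative, not part of this PR.
name: nightly-benchmarks
on:
  schedule:
    - cron: '0 2 * * *'   # every night at 02:00 UTC
jobs:
  benchmarks:
    uses: ./.github/workflows/sycl-linux-run-tests.yml
    with:
      tests_selector: benchmark
    secrets: inherit
```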
Feel free to test out the workflow via manual dispatches of sycl-linux-run-tests.yml on the benchmarking-workflow branch, but be aware that runs will currently always fail, as GitHub repository secrets have not yet been added.