
Cache flash attention tracing (#8026) #10003

Triggered via push September 17, 2024 00:22
Status Success
Total duration 1h 21m 45s
Artifacts 4
get-torch-commit: 3s
Build XLA CUDA plugin / build: 5m 18s
Build PyTorch/XLA / build: 23m 40s
Build PyTorch with CUDA / build: 23m 54s
Matrix: GPU tests / test
Matrix: CPU tests / test
Matrix: GPU tests requiring torch CUDA / test

Annotations

1 warning
TPU tests / tpu-test
This self-hosted runner is currently using runner version 2.317.0, which is out of date. Please update to the latest version, 2.319.1.

Artifacts

Produced during runtime
Name              Size
cpp-test-bin      699 MB
cuda-plugin       130 MB
torch-with-cuda   342 MB
torch-xla-wheels  217 MB