Add windows ci build #1049
Conversation
Force-pushed from 83b580c to 8f3a30d.
Deploy preview for pytorch-tutorials-preview ready! Built with commit 83b580c: https://deploy-preview-1049--pytorch-tutorials-preview.netlify.app
Force-pushed from 8f3a30d to 0176dc8.
Force-pushed from 2ace553 to c0eda34.
I don't think we can allow this to run on every single PR, and on every merge for that matter; it just seems too expensive for us to run at that frequency.
.circleci/config.yml
Outdated
      - pytorch_windows_build_worker_0:
          name: win_test_worker_0
      - pytorch_windows_build_worker_1:
          name: win_test_worker_1
      - pytorch_windows_build_worker_2:
          name: win_test_worker_2
      - pytorch_windows_build_worker_3:
          name: win_test_worker_3
      - pytorch_windows_build_worker_4:
          name: win_test_worker_4
      - pytorch_windows_build_worker_5:
          name: win_test_worker_5
      - pytorch_windows_build_worker_6:
          name: win_test_worker_6
      - pytorch_windows_build_worker_7:
          name: win_test_worker_7
Does every worker need a GPU machine to run? Can some of the work be shifted to CPU-only machines? Can the amount of coverage on the Windows build be reduced so it doesn't need 8 workers?
If not, then yes, we'll need to limit this to master-only (that might be sufficient, since this repository isn't nearly as active as pytorch/pytorch).
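For the master-only option, a sketch of what branch filtering could look like in CircleCI 2.x workflow syntax (the `filters` stanza is standard CircleCI configuration; only the first worker job from this PR is shown, as an illustration rather than the final config):

```yaml
workflows:
  version: 2
  build:
    jobs:
      - pytorch_windows_build_worker_0:
          name: win_test_worker_0
          # Run this job only on pushes to master, skipping PR builds.
          filters:
            branches:
              only:
                - master
```

The same `filters` block would be repeated on each worker job (or the workers collapsed into one job) to keep the expensive Windows builds off per-PR runs.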
Running the jobs in parallel (8 workers, at most 55 minutes each) doesn't seem much more efficient than a single job (1 hour 36 minutes), so I have changed the parallel jobs to a single one that runs only on the master branch.
@seemethere @yns88
.circleci/config.yml
Outdated
      - pytorch_windows_build_worker_0:
          name: win_test_worker_0
      - pytorch_windows_build_worker_1:
          name: win_test_worker_1
      - pytorch_windows_build_worker_2:
          name: win_test_worker_2
      - pytorch_windows_build_worker_3:
          name: win_test_worker_3
      - pytorch_windows_build_worker_4:
          name: win_test_worker_4
      - pytorch_windows_build_worker_5:
          name: win_test_worker_5
      - pytorch_windows_build_worker_6:
          name: win_test_worker_6
      - pytorch_windows_build_worker_7:
          name: win_test_worker_7
Here's what a manual approval would look like:
      - hold:
          type: approval
      - pytorch_windows_build_worker_0:
          name: win_test_worker_0
          requires:
            - hold
      - pytorch_windows_build_worker_1:
          name: win_test_worker_1
          requires:
            - hold
      - pytorch_windows_build_worker_2:
          name: win_test_worker_2
          requires:
            - hold
      - pytorch_windows_build_worker_3:
          name: win_test_worker_3
          requires:
            - hold
      - pytorch_windows_build_worker_4:
          name: win_test_worker_4
          requires:
            - hold
      - pytorch_windows_build_worker_5:
          name: win_test_worker_5
          requires:
            - hold
      - pytorch_windows_build_worker_6:
          name: win_test_worker_6
          requires:
            - hold
      - pytorch_windows_build_worker_7:
          name: win_test_worker_7
          requires:
            - hold
Makefile
Outdated
@@ -45,18 +45,18 @@ download:
	mkdir -p intermediate_source/data

	# transfer learning tutorial data
-	wget -N https://download.pytorch.org/tutorial/hymenoptera_data.zip -P $(DATADIR)
+	wget -N http://download.pytorch.org/tutorial/hymenoptera_data.zip -P $(DATADIR)
Why the switch from https to http? I'd much prefer it to use https over regular http.
> Why the switch from https to http? I'd much prefer it to use https over regular http.

Fixed.
Force-pushed from 04deb28 to 0fc7e4f.
Force-pushed from 502ee0d to a86c122.
Force-pushed from a86c122 to 618430e.
The CI failure seems unrelated to this change.
I've re-kicked the failed jobs since they appear to be timeouts; will merge on green.
The CI for this PR failed. I've kicked the build on the win thread that failed.
This reverts commit b97599a.
No description provided.