PyTorch Recipes
---------------------------------------------

1. defining_a_neural_network.py
   Defining a Neural Network in PyTorch
   https://pytorch.org/tutorials/recipes/recipes/defining_a_neural_network.html
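
   A minimal sketch of the nn.Module pattern this recipe covers; the two-layer
   architecture and layer sizes are illustrative, not taken from the recipe:

       import torch
       import torch.nn as nn
       import torch.nn.functional as F

       class Net(nn.Module):
           def __init__(self):
               super().__init__()
               self.fc1 = nn.Linear(784, 128)   # illustrative sizes
               self.fc2 = nn.Linear(128, 10)

           def forward(self, x):
               return self.fc2(F.relu(self.fc1(x)))

       net = Net()
       out = net(torch.randn(1, 784))           # forward pass on a dummy input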

2. what_is_state_dict.py
   What is a state_dict in PyTorch
   https://pytorch.org/tutorials/recipes/recipes/what_is_state_dict.html
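
   A quick sketch of what the recipe inspects; the nn.Linear stand-in model is
   illustrative only:

       import torch
       from torch import nn, optim

       model = nn.Linear(10, 2)                      # any nn.Module works the same way
       optimizer = optim.SGD(model.parameters(), lr=0.01)

       print(model.state_dict().keys())              # parameter/buffer tensors by name
       print(optimizer.state_dict().keys())          # 'state' and 'param_groups'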

3. saving_and_loading_models_for_inference.py
   Saving and loading models for inference in PyTorch
   https://pytorch.org/tutorials/recipes/recipes/saving_and_loading_models_for_inference.html
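
   The core save/load pattern, sketched with an illustrative stand-in model
   and file name:

       import torch
       from torch import nn

       model = nn.Linear(10, 2)
       torch.save(model.state_dict(), "model_weights.pt")      # file name is illustrative

       loaded = nn.Linear(10, 2)                                # same architecture
       loaded.load_state_dict(torch.load("model_weights.pt"))
       loaded.eval()                                            # switch dropout/batchnorm to eval mode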

4. custom_dataset_transforms_loader.py
   Developing Custom PyTorch Dataloaders
   https://pytorch.org/tutorials/recipes/recipes/custom_dataset_transforms_loader.html
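
   A toy Dataset/DataLoader sketch (the SquaresDataset class is a made-up
   example, not the dataset used in the recipe):

       import torch
       from torch.utils.data import Dataset, DataLoader

       class SquaresDataset(Dataset):                 # hypothetical toy dataset
           def __init__(self, n=100, transform=None):
               self.data = torch.arange(n, dtype=torch.float32)
               self.transform = transform

           def __len__(self):
               return len(self.data)

           def __getitem__(self, idx):
               x = self.data[idx]
               if self.transform:
                   x = self.transform(x)
               return x, x ** 2

       loader = DataLoader(SquaresDataset(), batch_size=8, shuffle=True)
       xb, yb = next(iter(loader))                    # one batch of (inputs, targets)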

5. Captum_Recipe.py
   Model Interpretability using Captum
   https://pytorch.org/tutorials/recipes/recipes/Captum_Recipe.html
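
   A minimal attribution sketch, assuming Captum is installed (pip install
   captum); the stand-in model and input shape are illustrative:

       import torch
       from torch import nn
       from captum.attr import IntegratedGradients

       model = nn.Sequential(nn.Linear(10, 2)).eval()   # stand-in classifier
       ig = IntegratedGradients(model)
       inputs = torch.randn(1, 10)
       attributions = ig.attribute(inputs, target=0)    # per-feature attribution scores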

6. dynamic_quantization.py
   Dynamic Quantization
   https://pytorch.org/tutorials/recipes/recipes/dynamic_quantization.html
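
   A sketch of the one-call dynamic quantization API on an illustrative
   stand-in model (newer releases also expose it under torch.ao.quantization):

       import torch
       from torch import nn

       model = nn.Sequential(nn.Linear(128, 64), nn.ReLU(), nn.Linear(64, 10))
       quantized = torch.quantization.quantize_dynamic(
           model, {nn.Linear}, dtype=torch.qint8        # quantize Linear weights to int8
       )
       print(quantized)                                 # Linear layers are now dynamically quantized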

7. save_load_across_devices.py
   Saving and loading models across devices in PyTorch
   https://pytorch.org/tutorials/recipes/recipes/save_load_across_devices.html
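
   A sketch of the map_location pattern; the stand-in model and file name are
   illustrative:

       import torch
       from torch import nn

       model = nn.Linear(10, 2)
       torch.save(model.state_dict(), "model_weights.pt")

       # Remap saved storages to whatever device is available, then move the module there.
       device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
       state = torch.load("model_weights.pt", map_location=device)
       model.load_state_dict(state)
       model.to(device)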

8. saving_and_loading_a_general_checkpoint.py
   Saving and loading a general checkpoint in PyTorch
   https://pytorch.org/tutorials/recipes/recipes/saving_and_loading_a_general_checkpoint.html
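
   A checkpoint-dictionary sketch; the dictionary keys follow common
   convention and the epoch/loss values are placeholders:

       import torch
       from torch import nn, optim

       model = nn.Linear(10, 2)
       optimizer = optim.SGD(model.parameters(), lr=0.01)

       torch.save({
           "epoch": 5,                                      # placeholder values
           "model_state_dict": model.state_dict(),
           "optimizer_state_dict": optimizer.state_dict(),
           "loss": 0.42,
       }, "checkpoint.pt")

       ckpt = torch.load("checkpoint.pt")
       model.load_state_dict(ckpt["model_state_dict"])
       optimizer.load_state_dict(ckpt["optimizer_state_dict"])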

9. saving_multiple_models_in_one_file.py
   Saving and loading multiple models in one file using PyTorch
   https://pytorch.org/tutorials/recipes/recipes/saving_multiple_models_in_one_file.html
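
   A sketch with two illustrative stand-in models bundled into one checkpoint
   dictionary:

       import torch
       from torch import nn

       encoder, decoder = nn.Linear(10, 4), nn.Linear(4, 10)   # stand-in models
       torch.save({
           "encoder_state_dict": encoder.state_dict(),
           "decoder_state_dict": decoder.state_dict(),
       }, "two_models.pt")

       ckpt = torch.load("two_models.pt")
       encoder.load_state_dict(ckpt["encoder_state_dict"])
       decoder.load_state_dict(ckpt["decoder_state_dict"])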

10. warmstarting_model_using_parameters_from_a_different_model.py
    Warmstarting models using parameters from a different model
    https://pytorch.org/tutorials/recipes/recipes/warmstarting_model_using_parameters_from_a_different_model.html
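
    A sketch of the strict=False loading trick this recipe relies on, using
    illustrative stand-in models and file name:

        import torch
        from torch import nn

        model_a = nn.Linear(10, 2)
        torch.save(model_a.state_dict(), "model_a.pt")

        model_b = nn.Linear(10, 2)
        # strict=False skips missing/unexpected keys, so only matching layers are loaded
        model_b.load_state_dict(torch.load("model_a.pt"), strict=False)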

11. zeroing_out_gradients.py
    Zeroing out gradients
    https://pytorch.org/tutorials/recipes/recipes/zeroing_out_gradients.html
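
    A toy training loop showing where optimizer.zero_grad() fits; the model,
    data, and loop length are illustrative:

        import torch
        from torch import nn, optim

        model = nn.Linear(10, 2)
        optimizer = optim.SGD(model.parameters(), lr=0.01)

        for _ in range(3):                     # toy loop
            optimizer.zero_grad()              # clear gradients from the previous step
            loss = model(torch.randn(4, 10)).sum()
            loss.backward()
            optimizer.step()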

12. mobile_perf.py
    PyTorch Mobile Performance Recipes
    https://pytorch.org/tutorials/recipes/mobile_perf.html
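
    A sketch of the TorchScript + optimize_for_mobile export path; the
    stand-in model and output file name are illustrative:

        import torch
        from torch import nn
        from torch.utils.mobile_optimizer import optimize_for_mobile

        model = nn.Sequential(nn.Linear(10, 2)).eval()
        scripted = torch.jit.script(model)                    # convert to TorchScript first
        mobile_ready = optimize_for_mobile(scripted)          # fuse/fold ops for mobile CPU
        mobile_ready._save_for_lite_interpreter("model.ptl")  # export for the lite interpreter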

13. amp_recipe.py
    Automatic Mixed Precision
    https://pytorch.org/tutorials/recipes/amp_recipe.html
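
    The autocast + GradScaler pattern in a toy loop; this sketch assumes a
    CUDA GPU is available and uses an illustrative stand-in model:

        import torch
        from torch import nn, optim

        device = "cuda"
        model = nn.Linear(10, 2).to(device)
        optimizer = optim.SGD(model.parameters(), lr=0.01)
        scaler = torch.cuda.amp.GradScaler()

        for _ in range(3):                         # toy loop
            optimizer.zero_grad()
            with torch.cuda.amp.autocast():        # run the forward pass in mixed precision
                loss = model(torch.randn(4, 10, device=device)).sum()
            scaler.scale(loss).backward()          # scale loss to avoid fp16 gradient underflow
            scaler.step(optimizer)
            scaler.update()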