Kano001 committed on
Commit
56d4caf
1 Parent(s): 864affd

Upload 7 files

MLPY/Lib/site-packages/torchvision-0.18.1.dist-info/INSTALLER ADDED
@@ -0,0 +1 @@
+ pip
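The INSTALLER file is a one-line marker naming the tool that installed the distribution (here, pip). As a minimal sketch of how such a marker can be read, the snippet below builds a throwaway dist-info directory in a temp dir rather than touching a real site-packages install; the directory name is illustrative only.

```python
# Sketch: reading the INSTALLER marker of a dist-info directory.
# The dist-info layout is constructed in a temp dir purely for
# illustration; real installs live under site-packages.
import tempfile
from pathlib import Path

with tempfile.TemporaryDirectory() as tmp:
    dist_info = Path(tmp) / "torchvision-0.18.1.dist-info"
    dist_info.mkdir()
    (dist_info / "INSTALLER").write_text("pip\n")

    installer = (dist_info / "INSTALLER").read_text().strip()
    print(installer)  # pip
```

Tools like pip consult this marker to warn when a package was installed by a different tool (e.g. conda) before modifying it.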
MLPY/Lib/site-packages/torchvision-0.18.1.dist-info/LICENSE ADDED
@@ -0,0 +1,29 @@
+ BSD 3-Clause License
+
+ Copyright (c) Soumith Chintala 2016,
+ All rights reserved.
+
+ Redistribution and use in source and binary forms, with or without
+ modification, are permitted provided that the following conditions are met:
+
+ * Redistributions of source code must retain the above copyright notice, this
+ list of conditions and the following disclaimer.
+
+ * Redistributions in binary form must reproduce the above copyright notice,
+ this list of conditions and the following disclaimer in the documentation
+ and/or other materials provided with the distribution.
+
+ * Neither the name of the copyright holder nor the names of its
+ contributors may be used to endorse or promote products derived from
+ this software without specific prior written permission.
+
+ THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
+ AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
+ IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
+ DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE
+ FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
+ DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
+ SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
+ CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
+ OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
+ OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
MLPY/Lib/site-packages/torchvision-0.18.1.dist-info/METADATA ADDED
@@ -0,0 +1,168 @@
+ Metadata-Version: 2.1
+ Name: torchvision
+ Version: 0.18.1
+ Summary: image and video datasets and models for torch deep learning
+ Home-page: https://github.com/pytorch/vision
+ Author: PyTorch Core Team
+ Author-email: [email protected]
+ License: BSD
+ Requires-Python: >=3.8
+ Description-Content-Type: text/markdown
+ License-File: LICENSE
+ Requires-Dist: numpy
+ Requires-Dist: torch (==2.3.1)
+ Requires-Dist: pillow (!=8.3.*,>=5.3.0)
+ Provides-Extra: scipy
+ Requires-Dist: scipy ; extra == 'scipy'
+
+ # torchvision
+
+ [![total torchvision downloads](https://pepy.tech/badge/torchvision)](https://pepy.tech/project/torchvision)
+ [![documentation](https://img.shields.io/badge/dynamic/json.svg?label=docs&url=https%3A%2F%2Fpypi.org%2Fpypi%2Ftorchvision%2Fjson&query=%24.info.version&colorB=brightgreen&prefix=v)](https://pytorch.org/vision/stable/index.html)
+
+ The torchvision package consists of popular datasets, model architectures, and common image transformations for computer
+ vision.
+
+ ## Installation
+
+ Please refer to the [official
+ instructions](https://pytorch.org/get-started/locally/) to install the stable
+ versions of `torch` and `torchvision` on your system.
+
+ To build from source, refer to our [contributing
+ page](https://github.com/pytorch/vision/blob/main/CONTRIBUTING.md#development-installation).
+
+ The following table shows the corresponding `torchvision` versions and supported Python
+ versions.
+
+ | `torch` | `torchvision` | Python |
+ | ------------------ | ------------------ | ------------------- |
+ | `main` / `nightly` | `main` / `nightly` | `>=3.8`, `<=3.11` |
+ | `2.2` | `0.17` | `>=3.8`, `<=3.11` |
+ | `2.1` | `0.16` | `>=3.8`, `<=3.11` |
+ | `2.0` | `0.15` | `>=3.8`, `<=3.11` |
+
+ <details>
+ <summary>older versions</summary>
+
+ | `torch` | `torchvision` | Python |
+ |---------|-------------------|---------------------------|
+ | `1.13` | `0.14` | `>=3.7.2`, `<=3.10` |
+ | `1.12` | `0.13` | `>=3.7`, `<=3.10` |
+ | `1.11` | `0.12` | `>=3.7`, `<=3.10` |
+ | `1.10` | `0.11` | `>=3.6`, `<=3.9` |
+ | `1.9` | `0.10` | `>=3.6`, `<=3.9` |
+ | `1.8` | `0.9` | `>=3.6`, `<=3.9` |
+ | `1.7` | `0.8` | `>=3.6`, `<=3.9` |
+ | `1.6` | `0.7` | `>=3.6`, `<=3.8` |
+ | `1.5` | `0.6` | `>=3.5`, `<=3.8` |
+ | `1.4` | `0.5` | `==2.7`, `>=3.5`, `<=3.8` |
+ | `1.3` | `0.4.2` / `0.4.3` | `==2.7`, `>=3.5`, `<=3.7` |
+ | `1.2` | `0.4.1` | `==2.7`, `>=3.5`, `<=3.7` |
+ | `1.1` | `0.3` | `==2.7`, `>=3.5`, `<=3.7` |
+ | `<=1.0` | `0.2` | `==2.7`, `>=3.5`, `<=3.7` |
+
+ </details>
+
+ ## Image Backends
+
+ Torchvision currently supports the following image backends:
+
+ - torch tensors
+ - PIL images:
+   - [Pillow](https://python-pillow.org/)
+   - [Pillow-SIMD](https://github.com/uploadcare/pillow-simd) - a **much faster** drop-in replacement for Pillow with SIMD.
+
+ Read more in our [docs](https://pytorch.org/vision/stable/transforms.html).
+
+ ## [UNSTABLE] Video Backend
+
+ Torchvision currently supports the following video backends:
+
+ - [pyav](https://github.com/PyAV-Org/PyAV) (default) - Pythonic binding for the ffmpeg libraries.
+ - video_reader - This needs ffmpeg to be installed and torchvision to be built from source. There shouldn't be any
+ conflicting version of ffmpeg installed. Currently, this is only supported on Linux.
+
+ ```
+ conda install -c conda-forge 'ffmpeg<4.3'
+ python setup.py install
+ ```
+
+ # Using the models in C++
+
+ TorchVision provides an example project showing how to use the models in C++ with JIT Script.
+
+ Installation from source:
+
+ ```
+ mkdir build
+ cd build
+ # Add -DWITH_CUDA=on for CUDA support if needed
+ cmake ..
+ make
+ make install
+ ```
+
+ Once installed, the library can be accessed in CMake (after properly configuring `CMAKE_PREFIX_PATH`) via the
+ `TorchVision::TorchVision` target:
+
+ ```
+ find_package(TorchVision REQUIRED)
+ target_link_libraries(my-target PUBLIC TorchVision::TorchVision)
+ ```
+
+ The `TorchVision` package will also automatically look for the `Torch` package and add it as a dependency to
+ `my-target`, so make sure that it is also available to CMake via `CMAKE_PREFIX_PATH`.
+
+ For an example setup, take a look at `examples/cpp/hello_world`.
+
+ Python linking is disabled by default when compiling TorchVision with CMake; this allows you to run models without any
+ Python dependency. In some special cases where TorchVision's operators are used from Python code, you may need to link
+ to Python. This can be done by passing `-DUSE_PYTHON=on` to CMake.
+
+ ### TorchVision Operators
+
+ To get the torchvision operators registered with torch (e.g. for the JIT), all you need to do is ensure that
+ you `#include <torchvision/vision.h>` in your project.
+
+ ## Documentation
+
+ You can find the API documentation on the pytorch website: <https://pytorch.org/vision/stable/index.html>
+
+ ## Contributing
+
+ See the [CONTRIBUTING](CONTRIBUTING.md) file for how to help out.
+
+ ## Disclaimer on Datasets
+
+ This is a utility library that downloads and prepares public datasets. We do not host or distribute these datasets,
+ vouch for their quality or fairness, or claim that you have a license to use the dataset. It is your responsibility to
+ determine whether you have permission to use the dataset under the dataset's license.
+
+ If you're a dataset owner and wish to update any part of it (description, citation, etc.), or do not want your dataset
+ to be included in this library, please get in touch through a GitHub issue. Thanks for your contribution to the ML
+ community!
+
+ ## Pre-trained Model License
+
+ The pre-trained models provided in this library may have their own licenses or terms and conditions derived from the
+ dataset used for training. It is your responsibility to determine whether you have permission to use the models for your
+ use case.
+
+ More specifically, SWAG models are released under the CC-BY-NC 4.0 license. See
+ [SWAG LICENSE](https://github.com/facebookresearch/SWAG/blob/main/LICENSE) for additional details.
+
+ ## Citing TorchVision
+
+ If you find TorchVision useful in your work, please consider citing the following BibTeX entry:
+
+ ```bibtex
+ @software{torchvision2016,
+   title = {TorchVision: PyTorch's Computer Vision library},
+   author = {TorchVision maintainers and contributors},
+   year = 2016,
+   journal = {GitHub repository},
+   publisher = {GitHub},
+   howpublished = {\url{https://github.com/pytorch/vision}}
+ }
+ ```
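The METADATA file above follows the Python core-metadata format: RFC 822-style headers followed by the long description. Because the header block is email-style, the stdlib email parser can read it; the sketch below parses a short excerpt copied from the file, with the string literal as the only input (nothing is read from an installed package).

```python
# Sketch: parsing a core-metadata header block with the stdlib email
# parser. Repeated headers such as Requires-Dist come back via get_all().
from email import message_from_string

raw = """\
Metadata-Version: 2.1
Name: torchvision
Version: 0.18.1
Requires-Dist: numpy
Requires-Dist: torch (==2.3.1)
"""

meta = message_from_string(raw)
print(meta["Name"])                   # torchvision
print(meta["Version"])                # 0.18.1
print(meta.get_all("Requires-Dist"))  # ['numpy', 'torch (==2.3.1)']
```

For an installed distribution, `importlib.metadata.metadata("torchvision")` returns the same kind of message object without manual file handling.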
MLPY/Lib/site-packages/torchvision-0.18.1.dist-info/RECORD ADDED
@@ -0,0 +1,382 @@
1
+ torchvision-0.18.1.dist-info/INSTALLER,sha256=zuuue4knoyJ-UwPPXg8fezS7VCrXJQrAP7zeNuwvFQg,4
2
+ torchvision-0.18.1.dist-info/LICENSE,sha256=wGNj-dM2J9xRc7E1IkRMyF-7Rzn2PhbUWH1cChZbWx4,1546
3
+ torchvision-0.18.1.dist-info/METADATA,sha256=YxT0dITCC_3UTAsUzUVOJynTswZ4aDupfgT2mTya_9o,6613
4
+ torchvision-0.18.1.dist-info/RECORD,,
5
+ torchvision-0.18.1.dist-info/REQUESTED,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
6
+ torchvision-0.18.1.dist-info/WHEEL,sha256=fVcVlLzi8CGi_Ul8vjMdn8gER25dn5GBg9E6k9z41-Y,100
7
+ torchvision-0.18.1.dist-info/top_level.txt,sha256=ucJZoaluBW9BGYT4TuCE6zoZY_JuSP30wbDh-IRpxUU,12
8
+ torchvision/_C.pyd,sha256=CZpzriDpF2VZ-puYophZIJFyEmwFyIHOQuMKtXVmpio,642048
9
+ torchvision/__init__.py,sha256=GH5ATaN0Zkz9Lh2FAfjOfqtPR0YgegQpxmdRSca-VKo,3471
10
+ torchvision/__pycache__/__init__.cpython-39.pyc,,
11
+ torchvision/__pycache__/_internally_replaced_utils.cpython-39.pyc,,
12
+ torchvision/__pycache__/_meta_registrations.cpython-39.pyc,,
13
+ torchvision/__pycache__/_utils.cpython-39.pyc,,
14
+ torchvision/__pycache__/extension.cpython-39.pyc,,
15
+ torchvision/__pycache__/utils.cpython-39.pyc,,
16
+ torchvision/__pycache__/version.cpython-39.pyc,,
17
+ torchvision/_internally_replaced_utils.py,sha256=jnEH2UiEKQCUf1Bq2fITPKxxg-ZySJONIVoniEh8hGY,1439
18
+ torchvision/_meta_registrations.py,sha256=fvulO98RBcEUh5owNUbKFA29w9WKwkscK_Azqu5vjw4,7437
19
+ torchvision/_utils.py,sha256=kcSn6P3Vjv5QPgHHNmNu59Mh-NLb5MDlYxvDkNJWrOM,966
20
+ torchvision/datasets/__init__.py,sha256=IsRXN1eaTyE7Wb67BtBaDgZkAukXqYiblbziJ4lD6aI,3733
21
+ torchvision/datasets/__pycache__/__init__.cpython-39.pyc,,
22
+ torchvision/datasets/__pycache__/_optical_flow.cpython-39.pyc,,
23
+ torchvision/datasets/__pycache__/_stereo_matching.cpython-39.pyc,,
24
+ torchvision/datasets/__pycache__/caltech.cpython-39.pyc,,
25
+ torchvision/datasets/__pycache__/celeba.cpython-39.pyc,,
26
+ torchvision/datasets/__pycache__/cifar.cpython-39.pyc,,
27
+ torchvision/datasets/__pycache__/cityscapes.cpython-39.pyc,,
28
+ torchvision/datasets/__pycache__/clevr.cpython-39.pyc,,
29
+ torchvision/datasets/__pycache__/coco.cpython-39.pyc,,
30
+ torchvision/datasets/__pycache__/country211.cpython-39.pyc,,
31
+ torchvision/datasets/__pycache__/dtd.cpython-39.pyc,,
32
+ torchvision/datasets/__pycache__/eurosat.cpython-39.pyc,,
33
+ torchvision/datasets/__pycache__/fakedata.cpython-39.pyc,,
34
+ torchvision/datasets/__pycache__/fer2013.cpython-39.pyc,,
35
+ torchvision/datasets/__pycache__/fgvc_aircraft.cpython-39.pyc,,
36
+ torchvision/datasets/__pycache__/flickr.cpython-39.pyc,,
37
+ torchvision/datasets/__pycache__/flowers102.cpython-39.pyc,,
38
+ torchvision/datasets/__pycache__/folder.cpython-39.pyc,,
39
+ torchvision/datasets/__pycache__/food101.cpython-39.pyc,,
40
+ torchvision/datasets/__pycache__/gtsrb.cpython-39.pyc,,
41
+ torchvision/datasets/__pycache__/hmdb51.cpython-39.pyc,,
42
+ torchvision/datasets/__pycache__/imagenet.cpython-39.pyc,,
43
+ torchvision/datasets/__pycache__/imagenette.cpython-39.pyc,,
44
+ torchvision/datasets/__pycache__/inaturalist.cpython-39.pyc,,
45
+ torchvision/datasets/__pycache__/kinetics.cpython-39.pyc,,
46
+ torchvision/datasets/__pycache__/kitti.cpython-39.pyc,,
47
+ torchvision/datasets/__pycache__/lfw.cpython-39.pyc,,
48
+ torchvision/datasets/__pycache__/lsun.cpython-39.pyc,,
49
+ torchvision/datasets/__pycache__/mnist.cpython-39.pyc,,
50
+ torchvision/datasets/__pycache__/moving_mnist.cpython-39.pyc,,
51
+ torchvision/datasets/__pycache__/omniglot.cpython-39.pyc,,
52
+ torchvision/datasets/__pycache__/oxford_iiit_pet.cpython-39.pyc,,
53
+ torchvision/datasets/__pycache__/pcam.cpython-39.pyc,,
54
+ torchvision/datasets/__pycache__/phototour.cpython-39.pyc,,
55
+ torchvision/datasets/__pycache__/places365.cpython-39.pyc,,
56
+ torchvision/datasets/__pycache__/rendered_sst2.cpython-39.pyc,,
57
+ torchvision/datasets/__pycache__/sbd.cpython-39.pyc,,
58
+ torchvision/datasets/__pycache__/sbu.cpython-39.pyc,,
59
+ torchvision/datasets/__pycache__/semeion.cpython-39.pyc,,
60
+ torchvision/datasets/__pycache__/stanford_cars.cpython-39.pyc,,
61
+ torchvision/datasets/__pycache__/stl10.cpython-39.pyc,,
62
+ torchvision/datasets/__pycache__/sun397.cpython-39.pyc,,
63
+ torchvision/datasets/__pycache__/svhn.cpython-39.pyc,,
64
+ torchvision/datasets/__pycache__/ucf101.cpython-39.pyc,,
65
+ torchvision/datasets/__pycache__/usps.cpython-39.pyc,,
66
+ torchvision/datasets/__pycache__/utils.cpython-39.pyc,,
67
+ torchvision/datasets/__pycache__/video_utils.cpython-39.pyc,,
68
+ torchvision/datasets/__pycache__/vision.cpython-39.pyc,,
69
+ torchvision/datasets/__pycache__/voc.cpython-39.pyc,,
70
+ torchvision/datasets/__pycache__/widerface.cpython-39.pyc,,
71
+ torchvision/datasets/_optical_flow.py,sha256=exA1SfgHNdvExeOHrCoMnQUfhCo-7c0P97kkqYm2dRE,20141
72
+ torchvision/datasets/_stereo_matching.py,sha256=uI3EzvxJgcbI97sXkiqwUd_UafZrpL57QMfQ7kKy4YA,50307
73
+ torchvision/datasets/caltech.py,sha256=5lmADI0KQmwGcABpMmaivr2qY5NoOgc4HhDF9525s6Y,9175
74
+ torchvision/datasets/celeba.py,sha256=hES-xHtqgbTagwSUsOnArvKZ-YqwWw0KNo9wvCbsUEo,8664
75
+ torchvision/datasets/cifar.py,sha256=2HVkqRdWeks6vHPNlU1K70rODmvgIAwODu5RB_meRDs,6018
76
+ torchvision/datasets/cityscapes.py,sha256=8aAmy50eIgLtlqi8s7fiz6kqusMudTIv0EeT8jLptKw,10515
77
+ torchvision/datasets/clevr.py,sha256=vliWOSDso7wRX5NtfwWSOK4sL-h0A0UP7tctAie9o3o,3548
78
+ torchvision/datasets/coco.py,sha256=ffMMWD4lEElgMPb1V967OTU1RiSt2MPklgh3h3vGOjQ,4289
79
+ torchvision/datasets/country211.py,sha256=_iKc9AaBoGm9vWTd9Z2IbdB4hD2KjzKo9Xsa8eDvlfk,2494
80
+ torchvision/datasets/dtd.py,sha256=xCxrh3Prnz6Td0D2zp0LY4NupqUN9rihTkb1g8I7mIs,4119
81
+ torchvision/datasets/eurosat.py,sha256=haGeLUypJFKodQJAUC3MxyCdd_nARHNY_nWpbY2A6Y4,2172
82
+ torchvision/datasets/fakedata.py,sha256=BqGViPyIRcvdBNpGhsPl_isrxxOTtlFLyKnLxzBWQP8,2514
83
+ torchvision/datasets/fer2013.py,sha256=OxPwEQLDHuq3Dvl0HjDDFPIyTdsgpuz9rIoGlPMclbQ,2881
84
+ torchvision/datasets/fgvc_aircraft.py,sha256=IlkE04918JZ8of8bwfSPtvfkYafrBIUGUbpg3isfQos,4741
85
+ torchvision/datasets/flickr.py,sha256=kRzv6HJiz8TDa7_SR9gc13S1n_4loCz5ike7UhV4S3I,5598
86
+ torchvision/datasets/flowers102.py,sha256=pkBqqDqOI4Ef_OB5J8_P8s0dqq_YldY1kkQ0hc3S0No,4755
87
+ torchvision/datasets/folder.py,sha256=xS9pf344r5JKxtNYMdWTvCR9NNgxGWfU4Ow5xJiLTvc,13243
88
+ torchvision/datasets/food101.py,sha256=jhWKGMqqkIMDHq_dQocQOgYWHdZ6pYKsy9cDB7MPiI0,3845
89
+ torchvision/datasets/gtsrb.py,sha256=TVMesJ3jzVk1tg4ecJyiun5lQ0PKh7lMZWLkZbkVbUY,3888
90
+ torchvision/datasets/hmdb51.py,sha256=nAfL2TmZMNAuxv6hN1X3jNP-DmBEF5ls5YnjHo64TIs,6123
91
+ torchvision/datasets/imagenet.py,sha256=ftqZ3qfRBuL-fAAXFjNwKzN92KdT_vS-PUqIlnFUMfA,8910
92
+ torchvision/datasets/imagenette.py,sha256=1JxKiR5ItZCuOZooSidy2ma33sgUM0YFXI1vbHSX4l0,4560
93
+ torchvision/datasets/inaturalist.py,sha256=RWD3qH5_411m6TpnieUuhVGuWf_Iz5qIvWCkuXCVJAQ,10403
94
+ torchvision/datasets/kinetics.py,sha256=SMgnLJ6qaqmdX6q9GyFgSPfZij4PdG_OPXLKoV-PuF0,10638
95
+ torchvision/datasets/kitti.py,sha256=q0otSanUygTFq4P2T3xKft1Gv-BPQXw-xNNMfOmDOzc,5795
96
+ torchvision/datasets/lfw.py,sha256=3XWdxEt1D-M-zG4psvq8ZCUvKR1ksQQ0iXIfo80lprA,10816
97
+ torchvision/datasets/lsun.py,sha256=pOPWLCnQBY3lbUzCWR_YsRhFZeEB2qFoAWxnuQT_l6w,5896
98
+ torchvision/datasets/mnist.py,sha256=Kyky-ex-AHE5NXezLzqb1Kp9zTA5bqS7mI0FSfpYhfI,22277
99
+ torchvision/datasets/moving_mnist.py,sha256=iuX-yNmki7bypxNfICobifZhadtJKFmTGvFd40JPrIw,3740
100
+ torchvision/datasets/omniglot.py,sha256=ysZzt9vQ7eD2ikqQGIEvfRVckusg2JRl4d8rQCF0PEU,4254
101
+ torchvision/datasets/oxford_iiit_pet.py,sha256=aeTY9s-2D6ueX_CwppAAghHlkTdQLBgyOoz_kRSzLWk,5215
102
+ torchvision/datasets/pcam.py,sha256=8tWbJcxo7lKYTToISFpL6_OSbxeYEqU2QsaY6OYjRpM,5419
103
+ torchvision/datasets/phototour.py,sha256=yEQ_X1bDJQA-70I1x7PTy7F-mCn6UrW70ubU-z3AU8g,8271
104
+ torchvision/datasets/places365.py,sha256=izNJr6E0mR3toIG4UJs82Gdc1Fh_a5FJKVa8qVpezT4,7430
105
+ torchvision/datasets/rendered_sst2.py,sha256=_guhu305oAtxCH8KcV4rUwP5CUBucFCYfx--lpBjRK8,3683
106
+ torchvision/datasets/samplers/__init__.py,sha256=yX8XD3h-VwTCoqF9IdQmcuGblZxvXb2EIqc5lPqXXtY,164
107
+ torchvision/datasets/samplers/__pycache__/__init__.cpython-39.pyc,,
108
+ torchvision/datasets/samplers/__pycache__/clip_sampler.cpython-39.pyc,,
109
+ torchvision/datasets/samplers/clip_sampler.py,sha256=ERp9c0O3ZGQNR3lLXvZXWjfJkXayYym8bhYfOm_MNKI,6416
110
+ torchvision/datasets/sbd.py,sha256=6gtgU3ip9TFGuacBpKNGrVJcpe2l31PXD6LQORFh0Bs,5388
111
+ torchvision/datasets/sbu.py,sha256=GWAuWzCS39wkH77_Py3EKu2ej-TifDaxGyiup67RLFw,4253
112
+ torchvision/datasets/semeion.py,sha256=DLzb76ihmjgtAzNhnV0JntTjwuWA1sf0eEcFBj9oJII,3240
113
+ torchvision/datasets/stanford_cars.py,sha256=REy9oAjVYN5jISRCLQD8eg0GMBx3MZUbYeF76eQCdI8,4626
114
+ torchvision/datasets/stl10.py,sha256=MGcfHh9vUzIPrTtgcYDKjewMjw4r11vJLJGihLP-hRQ,7468
115
+ torchvision/datasets/sun397.py,sha256=yIbrKH6CfuJ3l8yjQGsZCakQGPcbIxNGb6h64bQCDU4,2859
116
+ torchvision/datasets/svhn.py,sha256=nP-62KxLkLD6n84lw5lgyblHyRFWC5NYiBLoGgeID5Y,4958
117
+ torchvision/datasets/ucf101.py,sha256=zM9EY6Clqdvt45PZyL0dZvnlLn-5oyCD6v924bENlPE,5664
118
+ torchvision/datasets/usps.py,sha256=vaztMPU18onO_7tWXUWNQ0QsPPYoxGB-3a7YiA3pYrs,3596
119
+ torchvision/datasets/utils.py,sha256=RfT013BlukbKIvp9LG8DL_-FdYscGkieGI5vC9tukn4,16804
120
+ torchvision/datasets/video_utils.py,sha256=gUazc9gv4wgFuJ1N47e2c4Wi86HWhR9O09HcwVraeLA,17632
121
+ torchvision/datasets/vision.py,sha256=5N69xiFsNdiqjVzP-QP3-PtswDUNsKSeyy_FzXTDt_4,4360
122
+ torchvision/datasets/voc.py,sha256=7GhzVyU3iWbBzFyb1zdj-9xxCSLmQCdihPO_o85SdqA,9059
123
+ torchvision/datasets/widerface.py,sha256=UGvI97nsBCmJhG0KKNxOwM9ikLGBuaamUSjlyt4jwzo,8494
124
+ torchvision/extension.py,sha256=0A4efQ6V8RlQcMMtDpK74VIBHpDv4icjkkOc-EokPHw,3233
125
+ torchvision/image.pyd,sha256=RxYV5fJirIbu3jVv8nnK61t6Pkwg0pYtxbDqZcFXZus,150016
126
+ torchvision/io/__init__.py,sha256=md51PMqDbCY8Wvo9rA2il7ZrZ87GshTq8fJY3xhNVOA,1547
127
+ torchvision/io/__pycache__/__init__.cpython-39.pyc,,
128
+ torchvision/io/__pycache__/_load_gpu_decoder.cpython-39.pyc,,
129
+ torchvision/io/__pycache__/_video_opt.cpython-39.pyc,,
130
+ torchvision/io/__pycache__/image.cpython-39.pyc,,
131
+ torchvision/io/__pycache__/video.cpython-39.pyc,,
132
+ torchvision/io/__pycache__/video_reader.cpython-39.pyc,,
133
+ torchvision/io/_load_gpu_decoder.py,sha256=B3mPLXerJYXqiHv9FO2laRrGRlIkXii7CsD0J8J_SwU,182
134
+ torchvision/io/_video_opt.py,sha256=n5PL4hXCnOVaLDlSfLtWzP8yK4ypwWbyvQhbSHrO0Ps,20902
135
+ torchvision/io/image.py,sha256=cYqcLdGmcRsnBKYfOVvuEGrYIGq9bXhIIpIS3XuySWY,11120
136
+ torchvision/io/video.py,sha256=tcDKnx2z_AXAgBU5CCgthNGL2lMBDoMT-vXnfW8qyzE,16089
137
+ torchvision/io/video_reader.py,sha256=RhuK-KcutlD_ByPkND_tHH_QVIwDsdqWLz-C0KJc0EM,11642
138
+ torchvision/jpeg8.dll,sha256=aM-Kj2MkrdHI0gkgpHfh86_icuM26XiMu6gyMGeuKig,552448
139
+ torchvision/libpng16.dll,sha256=nPxu7uIrOxgpEXCLUyjP7rQBCghBVPsL-h8OxeArvc0,192512
140
+ torchvision/models/__init__.py,sha256=6QlTJfvjKcUmMJvwSapWUNFXbf2Vo15dVRcBuNSaYko,888
141
+ torchvision/models/__pycache__/__init__.cpython-39.pyc,,
142
+ torchvision/models/__pycache__/_api.cpython-39.pyc,,
143
+ torchvision/models/__pycache__/_meta.cpython-39.pyc,,
144
+ torchvision/models/__pycache__/_utils.cpython-39.pyc,,
145
+ torchvision/models/__pycache__/alexnet.cpython-39.pyc,,
146
+ torchvision/models/__pycache__/convnext.cpython-39.pyc,,
147
+ torchvision/models/__pycache__/densenet.cpython-39.pyc,,
148
+ torchvision/models/__pycache__/efficientnet.cpython-39.pyc,,
149
+ torchvision/models/__pycache__/feature_extraction.cpython-39.pyc,,
150
+ torchvision/models/__pycache__/googlenet.cpython-39.pyc,,
151
+ torchvision/models/__pycache__/inception.cpython-39.pyc,,
152
+ torchvision/models/__pycache__/maxvit.cpython-39.pyc,,
153
+ torchvision/models/__pycache__/mnasnet.cpython-39.pyc,,
154
+ torchvision/models/__pycache__/mobilenet.cpython-39.pyc,,
155
+ torchvision/models/__pycache__/mobilenetv2.cpython-39.pyc,,
156
+ torchvision/models/__pycache__/mobilenetv3.cpython-39.pyc,,
157
+ torchvision/models/__pycache__/regnet.cpython-39.pyc,,
158
+ torchvision/models/__pycache__/resnet.cpython-39.pyc,,
159
+ torchvision/models/__pycache__/shufflenetv2.cpython-39.pyc,,
160
+ torchvision/models/__pycache__/squeezenet.cpython-39.pyc,,
161
+ torchvision/models/__pycache__/swin_transformer.cpython-39.pyc,,
162
+ torchvision/models/__pycache__/vgg.cpython-39.pyc,,
163
+ torchvision/models/__pycache__/vision_transformer.cpython-39.pyc,,
164
+ torchvision/models/_api.py,sha256=RJurpplK_q7teeUlnqlMuTTxAorqIEhEcuQydyhzF3s,10331
165
+ torchvision/models/_meta.py,sha256=2NSIICoq4MDzPZc00DlGJTgHOCwTBSObSTeRTh3E0tQ,30429
166
+ torchvision/models/_utils.py,sha256=X7zduE90fkek8DjukzyENOcZ0iop03R0LIxC_FuAazk,11149
167
+ torchvision/models/alexnet.py,sha256=XcldP2UuOkdOUfdxitGS8qHzLH78Ny7VCzTzKsaWITU,4607
168
+ torchvision/models/convnext.py,sha256=mML3XALGvQGe3aYkmz6dbC7Wus7ntZWr7IVihWjLV8M,15740
169
+ torchvision/models/densenet.py,sha256=Wxvecj8b09Uy2FjKun7ZbagpDmHll8L2c3GbM802Ivs,17273
170
+ torchvision/models/detection/__init__.py,sha256=D4cs338Z4BQn5TgX2IKuJC9TD2rtw2svUDZlALR-lwI,175
171
+ torchvision/models/detection/__pycache__/__init__.cpython-39.pyc,,
172
+ torchvision/models/detection/__pycache__/_utils.cpython-39.pyc,,
173
+ torchvision/models/detection/__pycache__/anchor_utils.cpython-39.pyc,,
174
+ torchvision/models/detection/__pycache__/backbone_utils.cpython-39.pyc,,
175
+ torchvision/models/detection/__pycache__/faster_rcnn.cpython-39.pyc,,
176
+ torchvision/models/detection/__pycache__/fcos.cpython-39.pyc,,
177
+ torchvision/models/detection/__pycache__/generalized_rcnn.cpython-39.pyc,,
178
+ torchvision/models/detection/__pycache__/image_list.cpython-39.pyc,,
179
+ torchvision/models/detection/__pycache__/keypoint_rcnn.cpython-39.pyc,,
180
+ torchvision/models/detection/__pycache__/mask_rcnn.cpython-39.pyc,,
181
+ torchvision/models/detection/__pycache__/retinanet.cpython-39.pyc,,
182
+ torchvision/models/detection/__pycache__/roi_heads.cpython-39.pyc,,
183
+ torchvision/models/detection/__pycache__/rpn.cpython-39.pyc,,
184
+ torchvision/models/detection/__pycache__/ssd.cpython-39.pyc,,
185
+ torchvision/models/detection/__pycache__/ssdlite.cpython-39.pyc,,
186
+ torchvision/models/detection/__pycache__/transform.cpython-39.pyc,,
187
+ torchvision/models/detection/_utils.py,sha256=m9bowqjuYiR9A7HI7wJAK_kgFrUYlKSukXOEwuUtRpA,22667
188
+ torchvision/models/detection/anchor_utils.py,sha256=TQKWOKDFALsTmYw_BMGIDlT9mNQhukmgnNdFuRjN49w,12127
189
+ torchvision/models/detection/backbone_utils.py,sha256=BBKVxCyAY9Q8JYOLQ3oAbWJxcjn5HAbPeAcDb-5JoVA,10792
190
+ torchvision/models/detection/faster_rcnn.py,sha256=CqOFL8d206qgEvlmJrgjvcQXMZmR5u4Eg_AomEzlYWw,37576
191
+ torchvision/models/detection/fcos.py,sha256=8ffrmOs2WZZFlUCzoOV5NiGkDZ7K24CuJozvX0bhNWg,34761
192
+ torchvision/models/detection/generalized_rcnn.py,sha256=nLVj2o8yr6BW1MN12u30QuFbvtsdllEbUlbNH-dO-G0,4861
193
+ torchvision/models/detection/image_list.py,sha256=IzFjxIaMdyFas1IHPBgAuBK3iYJOert5HzGurYJitNk,808
194
+ torchvision/models/detection/keypoint_rcnn.py,sha256=XCZw2v0ilcYYkHt62pIO9SUUh03B8UFCFTeLcMsXf-U,22198
195
+ torchvision/models/detection/mask_rcnn.py,sha256=O1-jvyefS-aU6JGsUB2lgzIBE8Tht-RsApabHDjvyQ0,27054
196
+ torchvision/models/detection/retinanet.py,sha256=Erd9q38DhUGCc7NLJxKGSBUafgEUBe1fA5zaEECwm2g,37954
197
+ torchvision/models/detection/roi_heads.py,sha256=f_Lde69JHugGAaiGWh6ugJ9WUTT8f7InxPry_MZxgZY,34698
198
+ torchvision/models/detection/rpn.py,sha256=z4ezvg1XS9uNq_3jIYXzIRnsAhuGClZ5AJ5P0NDziN8,16226
199
+ torchvision/models/detection/ssd.py,sha256=QyWW7IihpNG4_SWeVo1-X5J9kDJrasd794-AHIcDxQY,29661
200
+ torchvision/models/detection/ssdlite.py,sha256=DRkN07LP7WTKt6Q5WIOraQR37Unx7XIiB70UdhmDCcE,13550
201
+ torchvision/models/detection/transform.py,sha256=gou0mGGPzLncV2ZMHkTTkw6UO0fUoJii6Dr2N0PNA_Y,12508
202
+ torchvision/models/efficientnet.py,sha256=tW6BpsBProhN6b1o1_sHSikSoqDupQRgdgRETiYjcAY,44221
203
+ torchvision/models/feature_extraction.py,sha256=uTBD0Obc42BSm0YBU7VcSKJR7HDeY6Da5nOiuGBZsV8,26140
204
+ torchvision/models/googlenet.py,sha256=AtXckNXKcmWhDyoozVsvb3VPI-odl2tptiYz7KXU3wA,13151
205
+ torchvision/models/inception.py,sha256=l9tutwO7KNVa5nfdl1_5f-6lJzQ-NSOCzXPArDILeAA,19329
206
+ torchvision/models/maxvit.py,sha256=m8Pfh7MYcYANxwxiAaU9pVmzeWmOB2jTpLSiuUjsHZI,32886
207
+ torchvision/models/mnasnet.py,sha256=PTSF4DchbFgtOchd9kgoPKCcfKH6NC4WfHa5bv4jZ58,18008
208
+ torchvision/models/mobilenet.py,sha256=alrEJwktmXVcCphU8h2EAJZX0YdKfcz4tJEOdG2BXB8,217
209
+ torchvision/models/mobilenetv2.py,sha256=eVil23yP4f-DBUhjKt73mao84zEiF5Y01wHNwh8HCdM,9970
210
+ torchvision/models/mobilenetv3.py,sha256=g1Ul1RkohPHhXfsecg2r0W7NVlEgVUc7X_uT1bxxrTQ,16702
211
+ torchvision/models/optical_flow/__init__.py,sha256=uuRFAdvcDobdAbY2VmxEZ7_CLH_f5-JRkCSuJRkj4RY,21
212
+ torchvision/models/optical_flow/__pycache__/__init__.cpython-39.pyc,,
213
+ torchvision/models/optical_flow/__pycache__/_utils.cpython-39.pyc,,
214
+ torchvision/models/optical_flow/__pycache__/raft.cpython-39.pyc,,
215
+ torchvision/models/optical_flow/_utils.py,sha256=PRcuU-IB6EL3hAOLiyC5q-NBzlvIKfhSF_BMplHbzfY,2125
216
+ torchvision/models/optical_flow/raft.py,sha256=o9wJ3jZH9EWwJl7fQYeWSdeXPMKYOZ0Zwm-hhcequVk,40942
217
+ torchvision/models/quantization/__init__.py,sha256=YOJmYqWQTfP5P2ypteZNKQOMW4VEB2WHJlYoSlSaL1Y,130
218
+ torchvision/models/quantization/__pycache__/__init__.cpython-39.pyc,,
219
+ torchvision/models/quantization/__pycache__/googlenet.cpython-39.pyc,,
220
+ torchvision/models/quantization/__pycache__/inception.cpython-39.pyc,,
221
+ torchvision/models/quantization/__pycache__/mobilenet.cpython-39.pyc,,
222
+ torchvision/models/quantization/__pycache__/mobilenetv2.cpython-39.pyc,,
223
+ torchvision/models/quantization/__pycache__/mobilenetv3.cpython-39.pyc,,
224
+ torchvision/models/quantization/__pycache__/resnet.cpython-39.pyc,,
225
+ torchvision/models/quantization/__pycache__/shufflenetv2.cpython-39.pyc,,
226
+ torchvision/models/quantization/__pycache__/utils.cpython-39.pyc,,
227
+ torchvision/models/quantization/googlenet.py,sha256=P92cacoKVTV4cDoaNkRevLx1daGH5DkPfUwPDMuOXO0,8290
228
+ torchvision/models/quantization/inception.py,sha256=TXz2hRpSvh6zYP398MsXTYQMnqYgnYnq5wyG6xr5Nhk,11088
229
+ torchvision/models/quantization/mobilenet.py,sha256=alrEJwktmXVcCphU8h2EAJZX0YdKfcz4tJEOdG2BXB8,217
230
+ torchvision/models/quantization/mobilenetv2.py,sha256=g2z3HPQ0MXFCuMV5TpLPRdO99m3wC_3d0ukUanXaJHo,6037
231
+ torchvision/models/quantization/mobilenetv3.py,sha256=l9g1AwKiU35XKuobXo-UPe5MqRKC1kgN7xgWWim4qr4,9467
232
+ torchvision/models/quantization/resnet.py,sha256=azvn1vwebP22qryYGNgFLMviGjHklXOX7xd4C2cggUo,18423
233
+ torchvision/models/quantization/shufflenetv2.py,sha256=7k9MLRLzjP3vke-e0Ai_cnA-j15KGQq5yDcs_ELXUg8,17311
234
+ torchvision/models/quantization/utils.py,sha256=Ij88l6toyO8MQi1w512Jt-yQ2Q9hK75-Z2SOjIzS6Zw,2109
235
+ torchvision/models/regnet.py,sha256=NbsA3RO7ka7k9fwYyVZ5wvvtJUuhXqyX76aGIoIujGE,65124
236
+ torchvision/models/resnet.py,sha256=AXWWl7XlkSQpUCsv8lCaWeDuYaq-KyOLXmBe0R8rv58,39917
237
+ torchvision/models/segmentation/__init__.py,sha256=TLL2SSmqE08HLiv_yyIWyIyrvf2xaOsZi0muDv_Y5Vc,69
238
+ torchvision/models/segmentation/__pycache__/__init__.cpython-39.pyc,,
239
+ torchvision/models/segmentation/__pycache__/_utils.cpython-39.pyc,,
240
+ torchvision/models/segmentation/__pycache__/deeplabv3.cpython-39.pyc,,
241
+ torchvision/models/segmentation/__pycache__/fcn.cpython-39.pyc,,
242
+ torchvision/models/segmentation/__pycache__/lraspp.cpython-39.pyc,,
243
+ torchvision/models/segmentation/_utils.py,sha256=yFeyBa5_Pyv1UQ_2N64XMRgTYsxifwzDd-VRP-vmIGM,1234
244
+ torchvision/models/segmentation/deeplabv3.py,sha256=MkmYEm1dF4afQEYXmFfxLAcqioiT5uWKYGSXCccIYh4,15405
245
+ torchvision/models/segmentation/fcn.py,sha256=mQ1Wi4S9j5G6OQbNciuxNwVbJ6e9miTzIWj6mUF5JwA,9205
246
+ torchvision/models/segmentation/lraspp.py,sha256=yx_b3PJsH5e0F3TqiGDhEnbXCGTdNX2iIxsKvNenM0s,7821
247
+ torchvision/models/shufflenetv2.py,sha256=VEGsTNNTdqdu8m7I62zQuJK_5CkET-0Y4ixYBJ-QBCs,15852
248
+ torchvision/models/squeezenet.py,sha256=Dha-ci350KU15D0LS9N07kw6MlNuusUHSBnC83Ery_E,8986
249
+ torchvision/models/swin_transformer.py,sha256=-Q9Kd1qzsD7vL3u137Q4MoHSTwzA6QFcweaF0zCWmUk,40370
250
+ torchvision/models/vgg.py,sha256=Qt9r5sFoY-oPdwP4De61jqSVe9XUZLXK47r9yVDQ33M,19736
251
+ torchvision/models/video/__init__.py,sha256=xHHR5c6kP0cMDXck7XnXq19iJL_Uemcxg3OC00cqE6A,97
252
+ torchvision/models/video/__pycache__/__init__.cpython-39.pyc,,
253
+ torchvision/models/video/__pycache__/mvit.cpython-39.pyc,,
254
+ torchvision/models/video/__pycache__/resnet.cpython-39.pyc,,
255
+ torchvision/models/video/__pycache__/s3d.cpython-39.pyc,,
256
+ torchvision/models/video/__pycache__/swin_transformer.cpython-39.pyc,,
257
+ torchvision/models/video/mvit.py,sha256=xIK4nCOuJWXQjoX8NzcouwzyTkIkcFug3yiu0a5-Dk8,33895
258
+ torchvision/models/video/resnet.py,sha256=JOP7FDfUOfQQP-jEYUvzSIOr9Iaexl2L3iUh-rzrp90,17274
259
+ torchvision/models/video/s3d.py,sha256=Rn-iypP13jrETAap1Qd4NY6kkpYDuSXjGkEKZDOxemI,8034
260
+ torchvision/models/video/swin_transformer.py,sha256=M74P2v4lVKM6zgwoeFn_njppZB2l-gAjuGVvzzESKpU,28431
261
+ torchvision/models/vision_transformer.py,sha256=GE-_-dlFJQPTnONe4qrzYOYp-wavPOrFPCo9krM39Vg,33000
262
+ torchvision/ops/__init__.py,sha256=7wibGxcF1JHDviSNs9O9Pwlc8dhMSFfZo0wzVjTFnAY,2001
263
+ torchvision/ops/__pycache__/__init__.cpython-39.pyc,,
264
+ torchvision/ops/__pycache__/_box_convert.cpython-39.pyc,,
265
+ torchvision/ops/__pycache__/_register_onnx_ops.cpython-39.pyc,,
266
+ torchvision/ops/__pycache__/_utils.cpython-39.pyc,,
267
+ torchvision/ops/__pycache__/boxes.cpython-39.pyc,,
+ torchvision/ops/__pycache__/ciou_loss.cpython-39.pyc,,
+ torchvision/ops/__pycache__/deform_conv.cpython-39.pyc,,
+ torchvision/ops/__pycache__/diou_loss.cpython-39.pyc,,
+ torchvision/ops/__pycache__/drop_block.cpython-39.pyc,,
+ torchvision/ops/__pycache__/feature_pyramid_network.cpython-39.pyc,,
+ torchvision/ops/__pycache__/focal_loss.cpython-39.pyc,,
+ torchvision/ops/__pycache__/giou_loss.cpython-39.pyc,,
+ torchvision/ops/__pycache__/misc.cpython-39.pyc,,
+ torchvision/ops/__pycache__/poolers.cpython-39.pyc,,
+ torchvision/ops/__pycache__/ps_roi_align.cpython-39.pyc,,
+ torchvision/ops/__pycache__/ps_roi_pool.cpython-39.pyc,,
+ torchvision/ops/__pycache__/roi_align.cpython-39.pyc,,
+ torchvision/ops/__pycache__/roi_pool.cpython-39.pyc,,
+ torchvision/ops/__pycache__/stochastic_depth.cpython-39.pyc,,
+ torchvision/ops/_box_convert.py,sha256=glF6sulLzaw_KG36wg0CHWt0ef62BnkjokbqQnBUMsU,2490
+ torchvision/ops/_register_onnx_ops.py,sha256=g4M5Fp7n_5ZTzIQcUXvEct3YFlUMPNVSQQfBP-J0eQQ,4288
+ torchvision/ops/_utils.py,sha256=xFrLnLhKDHiG2TN39tUWY-MJbLEPka6dkaVVJFAN7-8,3736
+ torchvision/ops/boxes.py,sha256=LvrW5Pj4K5TRFApj70zUx9xuKZ0SJixgvAcPZk0bwUA,16796
+ torchvision/ops/ciou_loss.py,sha256=Qzm89C82ehX-YvYBPLPRPhbJZdr3itizxuxrT7MLi9o,2834
+ torchvision/ops/deform_conv.py,sha256=DuIosFDK3tsY5RlHU7mez5x1p08IQai9WG14z3S0gzU,7185
+ torchvision/ops/diou_loss.py,sha256=6IebWlMYc_2YnbG36niDXgM16vxajSKRfiusEuUJwpQ,3456
+ torchvision/ops/drop_block.py,sha256=ZkIzM1b3v5_U7z0eavzaNpN7IBq0N4ZNwwvWArispwg,6010
+ torchvision/ops/feature_pyramid_network.py,sha256=Jts5mzUJX3EarcAQU5MDUe0a5Sgn5YjUstaW2JQpgEE,8952
+ torchvision/ops/focal_loss.py,sha256=lS5FqgLFuDKlpm0sk5V1VgIA6LFAdJUXQaPi35nEDoU,2319
+ torchvision/ops/giou_loss.py,sha256=xB_RlES28k_A6iH2VqWAPBQkiT_zkEwdtREDGR_nVJM,2772
+ torchvision/ops/misc.py,sha256=niQnKPuifQzVXAWnnf6TkVsgetqyetjZ6lq3wi2tsZw,13892
+ torchvision/ops/poolers.py,sha256=sfgcZWh2dIo9UY437CnpAHdxqPQhuvjNPYzhIKlAIPE,12247
+ torchvision/ops/ps_roi_align.py,sha256=6_kmnE6z_3AZZ1N2hrS_uK3cbuzqZhjdM2rC50mfYUo,3715
+ torchvision/ops/ps_roi_pool.py,sha256=2JrjJwzVtEeEg0BebkCnGfq4xEDwMCD-Xh915mvNcyI,2940
+ torchvision/ops/roi_align.py,sha256=xCwgOCqGfn1WYnJ9krI_ch9_i332gw8YTub1QwO5b84,11030
+ torchvision/ops/roi_pool.py,sha256=cN7rSCQfpUzETvP8SjPDl1yfXjbxg9I-tXnHbvAKya8,3015
+ torchvision/ops/stochastic_depth.py,sha256=9T4Zu_BaemKZafSmRwrPCVr5aaGH8tmzlsQAZO-1_-Y,2302
+ torchvision/transforms/__init__.py,sha256=WCNXTJUbJ1h7YaN9UfrBSvt--ST2PAV4sLICbTS-L5A,55
+ torchvision/transforms/__pycache__/__init__.cpython-39.pyc,,
+ torchvision/transforms/__pycache__/_functional_pil.cpython-39.pyc,,
+ torchvision/transforms/__pycache__/_functional_tensor.cpython-39.pyc,,
+ torchvision/transforms/__pycache__/_functional_video.cpython-39.pyc,,
+ torchvision/transforms/__pycache__/_presets.cpython-39.pyc,,
+ torchvision/transforms/__pycache__/_transforms_video.cpython-39.pyc,,
+ torchvision/transforms/__pycache__/autoaugment.cpython-39.pyc,,
+ torchvision/transforms/__pycache__/functional.cpython-39.pyc,,
+ torchvision/transforms/__pycache__/transforms.cpython-39.pyc,,
+ torchvision/transforms/_functional_pil.py,sha256=UEiaElYLuYXkNR__O_dbKts2BKBsb28Rj50RMFgRxig,12505
+ torchvision/transforms/_functional_tensor.py,sha256=hpLy9xCwONubxoGfzXiqNs0nEhgVaDKRuMcqAxSi090,34794
+ torchvision/transforms/_functional_video.py,sha256=c4BbUi3Y2LvskozFdy619piLBd5acsjxgogYAXmY5P8,3971
+ torchvision/transforms/_presets.py,sha256=UVxchNgdPL-4iVHjOxcHndygktw4K76hhxEK8ks1zlw,8700
+ torchvision/transforms/_transforms_video.py,sha256=ub2gCT5ELiK918Bq-Pp6mzhWrAZxlj7blfpkA8Dhb1o,5124
+ torchvision/transforms/autoaugment.py,sha256=UD8UBlT4dWCIQaNDUDQBtc0osMHHPQluLr7seZJr4cY,28858
+ torchvision/transforms/functional.py,sha256=NioCgkCAceO1Aq4_ngYHE3yUVZ6eDsh66D-veM6gHU0,68953
+ torchvision/transforms/transforms.py,sha256=NeFpo86xWf2h8vMz_vXpbqV9PzRoROEp6GlNxw9oTZQ,87710
+ torchvision/transforms/v2/__init__.py,sha256=AfbkHi6yQoEhrbfNymH2z8sZpqznEcwZZEPWQNFDdOM,1492
+ torchvision/transforms/v2/__pycache__/__init__.cpython-39.pyc,,
+ torchvision/transforms/v2/__pycache__/_augment.cpython-39.pyc,,
+ torchvision/transforms/v2/__pycache__/_auto_augment.cpython-39.pyc,,
+ torchvision/transforms/v2/__pycache__/_color.cpython-39.pyc,,
+ torchvision/transforms/v2/__pycache__/_container.cpython-39.pyc,,
+ torchvision/transforms/v2/__pycache__/_deprecated.cpython-39.pyc,,
+ torchvision/transforms/v2/__pycache__/_geometry.cpython-39.pyc,,
+ torchvision/transforms/v2/__pycache__/_meta.cpython-39.pyc,,
+ torchvision/transforms/v2/__pycache__/_misc.cpython-39.pyc,,
+ torchvision/transforms/v2/__pycache__/_temporal.cpython-39.pyc,,
+ torchvision/transforms/v2/__pycache__/_transform.cpython-39.pyc,,
+ torchvision/transforms/v2/__pycache__/_type_conversion.cpython-39.pyc,,
+ torchvision/transforms/v2/__pycache__/_utils.cpython-39.pyc,,
+ torchvision/transforms/v2/_augment.py,sha256=LWrgFw9wXXNsYXoD2xxjWB-_iAJjMXrtZDBv2M1amHQ,15625
+ torchvision/transforms/v2/_auto_augment.py,sha256=-pM1nhChaYU61KK4Bng8URo4m0pVYDvpWsDAFiQ7MUg,32407
+ torchvision/transforms/v2/_color.py,sha256=pFsVgt3o59y8p3_GweVNgelR25cWnvXAtTSyIvrZAsE,17366
+ torchvision/transforms/v2/_container.py,sha256=E-8TvTF_qBqC6aLnlK5mnp3V27oQOxhy1X7PZfQAD5w,6229
+ torchvision/transforms/v2/_deprecated.py,sha256=9oSk7wHbYIajAizg37oOGFcWC5GICwEuSHGAOTGfGkE,1997
+ torchvision/transforms/v2/_geometry.py,sha256=JZnWTJSiZwzTeI9EH84u1642xpVwnXmJk3QZjb5avW0,68331
+ torchvision/transforms/v2/_meta.py,sha256=I_5TP_yGo_vHpx7xKDYMIrlzuiusCjcXRX6zR8WzEUU,1441
+ torchvision/transforms/v2/_misc.py,sha256=YDCO7d96gdqdO4t3MX3w3ZG5AMmMS1NrL5rFmWy1n8g,18076
+ torchvision/transforms/v2/_temporal.py,sha256=FmFtwnluzRzJOX6Q-c2hxePQW0SHAp67ctLsZLWt8FM,932
+ torchvision/transforms/v2/_transform.py,sha256=i3KWfXcFjioL3FV5ix1XetK1cLHyw3hutIWwd_wNn9w,8652
+ torchvision/transforms/v2/_type_conversion.py,sha256=zWwYZSLm-7Mk_0TrSoJDHX9Kel-fyy36wGlYYfaEvuw,2944
+ torchvision/transforms/v2/_utils.py,sha256=CisntDy7j0GuorgmUAV80oS7A1D8zxzFD3AGFrpytCQ,8872
+ torchvision/transforms/v2/functional/__init__.py,sha256=BV6Lqai0jNzMjv5TDeSKev2AgpwO79QuPK1m36majRo,3624
+ torchvision/transforms/v2/functional/__pycache__/__init__.cpython-39.pyc,,
+ torchvision/transforms/v2/functional/__pycache__/_augment.cpython-39.pyc,,
+ torchvision/transforms/v2/functional/__pycache__/_color.cpython-39.pyc,,
+ torchvision/transforms/v2/functional/__pycache__/_deprecated.cpython-39.pyc,,
+ torchvision/transforms/v2/functional/__pycache__/_geometry.cpython-39.pyc,,
+ torchvision/transforms/v2/functional/__pycache__/_meta.cpython-39.pyc,,
+ torchvision/transforms/v2/functional/__pycache__/_misc.cpython-39.pyc,,
+ torchvision/transforms/v2/functional/__pycache__/_temporal.cpython-39.pyc,,
+ torchvision/transforms/v2/functional/__pycache__/_type_conversion.cpython-39.pyc,,
+ torchvision/transforms/v2/functional/__pycache__/_utils.cpython-39.pyc,,
+ torchvision/transforms/v2/functional/_augment.py,sha256=rwliCn3Z9CMwpY7xc0sh-5tD-w9YU9aFPok3e7DAc54,3295
+ torchvision/transforms/v2/functional/_color.py,sha256=R8dminUMbhugepxr0nTzKREvfsitWTqU68fGEW8Ui9c,30990
+ torchvision/transforms/v2/functional/_deprecated.py,sha256=-X7agTXp-JnbFpsp3xoVk1eZr7OyT-evBt0nlULAR40,825
+ torchvision/transforms/v2/functional/_geometry.py,sha256=rUNxmCoLNkW4H-g2tHkJVHeYGrgqkeY5m79XeCaENI8,89743
+ torchvision/transforms/v2/functional/_meta.py,sha256=b_MF4SQrmZNVpcN8X-8edP-CEmQ241MKDWpQTG4pmbI,10826
+ torchvision/transforms/v2/functional/_misc.py,sha256=BgMN67r7JRqNWDH1KQiioDTCaWMMMb8r854eLRrjHC8,15657
+ torchvision/transforms/v2/functional/_temporal.py,sha256=tSRkkqOqUQ0QXjENF82F16El1-J0IDoFKIH-ss_cpC4,1163
+ torchvision/transforms/v2/functional/_type_conversion.py,sha256=oYf4LMgiClvEZwwc3WbKI7fJ-rRFhDrVSBKiPA5vxio,896
+ torchvision/transforms/v2/functional/_utils.py,sha256=3T5iFgq8whHQbk7duizYmoxspH6mtkP-L7ku627zfBY,5620
+ torchvision/tv_tensors/__init__.py,sha256=7UBIZbraVyhIO-ZCeKtD59if85XKkAQ4njjS6RjEfDg,1544
+ torchvision/tv_tensors/__pycache__/__init__.cpython-39.pyc,,
+ torchvision/tv_tensors/__pycache__/_bounding_boxes.cpython-39.pyc,,
+ torchvision/tv_tensors/__pycache__/_dataset_wrapper.cpython-39.pyc,,
+ torchvision/tv_tensors/__pycache__/_image.cpython-39.pyc,,
+ torchvision/tv_tensors/__pycache__/_mask.cpython-39.pyc,,
+ torchvision/tv_tensors/__pycache__/_torch_function_helpers.cpython-39.pyc,,
+ torchvision/tv_tensors/__pycache__/_tv_tensor.cpython-39.pyc,,
+ torchvision/tv_tensors/__pycache__/_video.cpython-39.pyc,,
+ torchvision/tv_tensors/_bounding_boxes.py,sha256=iRykoOkjSkE7AUDpIFNAkTwF14kZUWR4D7w7dfe6RzE,4574
+ torchvision/tv_tensors/_dataset_wrapper.py,sha256=3S5X_xgzfxb4Bd4A05s3TwpWoDT5iNfUKXSz7FNOzY8,24878
+ torchvision/tv_tensors/_image.py,sha256=r66_5vfWVIFxnTU3BJ74Rsuv9VooF1sVMO2yZgQYDsc,1957
+ torchvision/tv_tensors/_mask.py,sha256=x9kqe6ik8EvEjKY45UtuX_CRGXsh5-NNVpoStCFLJbc,1490
+ torchvision/tv_tensors/_torch_function_helpers.py,sha256=U7r-QG2jKV_KYeUQJ2hKPlYRLwqz_xhEs7_L1oXpBvM,2348
+ torchvision/tv_tensors/_tv_tensor.py,sha256=v-dVm-ZZs4fdT6TVcAeX7KnAJpudVUf55VqdtQtgthE,6380
+ torchvision/tv_tensors/_video.py,sha256=zYbXjwFnzsxwV8RGt2BCw_AWxSDPASYDVthSi4YMiTo,1420
+ torchvision/utils.py,sha256=eZLOxrFdrw-zQkBfgyNQhuUkfygfRFBUp2LM348qDpE,27075
+ torchvision/version.py,sha256=aR1DGq4vEWYD2HnPBEk0N7pRxrAJyUDdX5s5up4963g,206
+ torchvision/zlib.dll,sha256=jb_W73N0qDEVi93Mt549VmXpYlyBr1V_FbQVC3h39oc,99608
MLPY/Lib/site-packages/torchvision-0.18.1.dist-info/REQUESTED ADDED
File without changes
MLPY/Lib/site-packages/torchvision-0.18.1.dist-info/WHEEL ADDED
@@ -0,0 +1,5 @@
+ Wheel-Version: 1.0
+ Generator: bdist_wheel (0.37.1)
+ Root-Is-Purelib: false
+ Tag: cp39-cp39-win_amd64
+
MLPY/Lib/site-packages/torchvision-0.18.1.dist-info/top_level.txt ADDED
@@ -0,0 +1 @@
+ torchvision