---
language:
- en
license: apache-2.0
library_name: timm
tags:
- mobile
- vision
- autogenerated-modelcard
datasets:
- imagenet-1k
metrics:
- accuracy
---

# EfficientFormer-L1

## Table of Contents
- [EfficientFormer-L1](#efficientformer-l1)
- [Table of Contents](#table-of-contents)
- [Model Details](#model-details)
- [How to Get Started with the Model](#how-to-get-started-with-the-model)
- [Uses](#uses)
  - [Direct Use](#direct-use)
  - [Downstream Use](#downstream-use)
  - [Misuse and Out-of-scope Use](#misuse-and-out-of-scope-use)
- [Limitations and Biases](#limitations-and-biases)
- [Training](#training)
  - [Training Data](#training-data)
  - [Training Procedure](#training-procedure)
- [Evaluation Results](#evaluation-results)
- [Environmental Impact](#environmental-impact)
- [Citation Information](#citation-information)

<model_details>
## Model Details

<!-- Give an overview of your model, the relevant research paper, who trained it, etc. -->

EfficientFormer-L1, developed by [Snap Research](https://github.com/snap-research), is one of three EfficientFormer models. The EfficientFormer models were released as part of an effort to prove that properly designed transformers can reach extremely low latency on mobile devices while maintaining high performance.

EfficientFormer-L1 is the fastest of the three models.

- Developed by: Yanyu Li, Geng Yuan, Yang Wen, Eric Hu, Georgios Evangelidis, Sergey Tulyakov, Yanzhi Wang, Jian Ren
- Language(s):
- License: This model is licensed under the apache-2.0 license
- Resources for more information:
  - [Research Paper](https://arxiv.org/abs/2206.01191)
  - [GitHub Repo](https://github.com/snap-research/EfficientFormer/)

</model_details>

<how_to_start>
## How to Get Started with the Model

Use the code below to get started with the model.

```python
# A nice code snippet here that describes how to use the model...
```
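
As a minimal sketch, assuming the checkpoint can be loaded through `timm` under the name `efficientformer_l1` (an assumption, not an identifier confirmed by this repository), single-image classification could look like this; the example image URL is also illustrative.

```python
# Minimal sketch (assumptions noted above): classify one image with timm.
import requests
import timm
import torch
from PIL import Image

# Load a pretrained EfficientFormer-L1; the exact timm model name is an assumption.
model = timm.create_model("efficientformer_l1", pretrained=True)
model.eval()

# Build the preprocessing pipeline that matches the model's pretraining configuration.
config = timm.data.resolve_data_config({}, model=model)
transform = timm.data.create_transform(**config)

# Download an example image and run a forward pass.
url = "http://images.cocodataset.org/val2017/000000039769.jpg"
image = Image.open(requests.get(url, stream=True).raw).convert("RGB")

with torch.no_grad():
    logits = model(transform(image).unsqueeze(0))

print(logits.argmax(dim=-1).item())  # predicted ImageNet-1k class index
```

The printed index corresponds to an ImageNet-1k class; any standard ImageNet-1k label file can be used to map it to a human-readable name.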
</how_to_start>

<uses>

## Uses

#### Direct Use

<!-- Describe what kind of tasks this model can be used for directly or problems it can solve. -->

[More Information Needed]

#### Downstream Use

<!-- Describe how this model could be leveraged by a downstream model (if applicable) -->

[More Information Needed]

#### Misuse and Out-of-scope Use

<!-- Describe ways in which this model ***should not*** be used. -->

[More Information Needed]
</uses>

<Limitations_and_Biases>

## Limitations and Biases

<!-- Describe limitations and biases of this model or models of its type. -->

**CONTENT WARNING: Readers should be aware this section contains content that is disturbing, offensive, and can propagate historical and current stereotypes.**

[More Information Needed]

</Limitations_and_Biases>

<Training>

## Training

#### Training Data

<!-- Describe the dataset used to train this model. -->
<!-- Refer to data card if dataset is provided and exists on the hub -->

EfficientFormer-L1 was trained on ImageNet-1k; see the [ImageNet-1k dataset card](https://huggingface.co/datasets/imagenet-1k) for additional information.

#### Training Procedure

<!-- Describe the preprocessing, hardware used, training hyperparameters, etc. -->

[More Information Needed]

</Training>

<Eval_Results>
## Evaluation Results

<!-- Describe evaluation results of this model across any datasets it was evaluated on. -->

[More Information Needed]

</Eval_Results>

<E_Impact>
## Environmental Impact

<!-- Provide information to document the environmental impact of this model -->

You can estimate carbon emissions using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700); a programmatic alternative is sketched after the fields below.

- **Hardware Type:**
- **Hours used:**
- **Cloud Provider:**
- **Compute Region:**
- **Carbon Emitted:**
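
If you fine-tune or re-run the model yourself, emissions can also be measured directly in code. The sketch below uses the `codecarbon` package as a substitute for the calculator-based workflow above; it is not part of the original training setup, and the fields above still need to be filled in from your own run.

```python
# Minimal sketch (codecarbon is an assumed substitute tool, not from the original card):
# wrap a training or evaluation run and report estimated CO2-equivalent emissions.
from codecarbon import EmissionsTracker

tracker = EmissionsTracker(project_name="efficientformer-l1")  # project name is illustrative
tracker.start()

# ... run your training or evaluation loop here ...

emissions_kg = tracker.stop()  # estimated kilograms of CO2-equivalent
print(f"Estimated emissions: {emissions_kg:.4f} kg CO2eq")
```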

</E_Impact>

<Cite>

## Citation Information

```bibtex
@article{li2022efficientformer,
  title={EfficientFormer: Vision Transformers at MobileNet Speed},
  author={Li, Yanyu and Yuan, Geng and Wen, Yang and Hu, Eric and Evangelidis, Georgios and Tulyakov, Sergey and Wang, Yanzhi and Ren, Jian},
  journal={arXiv preprint arXiv:2206.01191},
  year={2022}
}
```
</Cite>