---
arxiv: '2006.15418'
tags:
- video
- repetition
datasets:
- countix
---

# RepNet PyTorch

GitHub repository: https://github.com/materight/RepNet-pytorch.

A PyTorch port with pre-trained weights of **RepNet**, from *Counting Out Time: Class Agnostic Video Repetition Counting in the Wild* (CVPR 2020) [[paper]](https://arxiv.org/abs/2006.15418) [[project]](https://sites.google.com/view/repnet) [[notebook]](https://colab.research.google.com/github/google-research/google-research/blob/master/repnet/repnet_colab.ipynb#scrollTo=FUg2vSYhmsT0).

This repo provides a PyTorch implementation of RepNet and a script to convert the pre-trained TensorFlow weights released by the authors. The outputs of the two implementations are almost identical, with a maximum deviation below $10^{-6}$, most likely caused by the [limited precision of floating-point operations](https://pytorch.org/docs/stable/notes/numerical_accuracy.html).
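
If you want to verify this yourself, the comparison boils down to an element-wise check between the two outputs. The sketch below is illustrative only and assumes you have already dumped the outputs of both implementations for the same input clip; the `.pt` filenames and tensor names are hypothetical:

```python
import torch

# Hypothetical dumps of the two implementations' outputs for the same input clip.
tf_output = torch.load('tf_output.pt')  # TensorFlow output, converted to a torch tensor
pt_output = torch.load('pt_output.pt')  # output of this PyTorch port

# Maximum absolute element-wise deviation between the two implementations.
max_diff = (tf_output - pt_output).abs().max().item()
print(f'Max absolute difference: {max_diff:.2e}')

# Deviations below 1e-6 are consistent with float32 round-off.
assert torch.allclose(tf_output, pt_output, atol=1e-6)
```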

<div align="center">
  <img src="https://raw.githubusercontent.com/materight/RepNet-pytorch/main/img/example1.gif" height="160" />
  <img src="https://raw.githubusercontent.com/materight/RepNet-pytorch/main/img/example2.gif" height="160" />
  <img src="https://raw.githubusercontent.com/materight/RepNet-pytorch/main/img/example3.gif" height="160" />
  <img src="https://raw.githubusercontent.com/materight/RepNet-pytorch/main/img/example4.gif" height="160" />
</div>

## Get Started
- Clone this repo and install dependencies:
```bash
git clone https://github.com/materight/RepNet-pytorch
cd RepNet-pytorch
pip install -r requirements.txt
```

- To download the TensorFlow pre-trained weights and convert them to PyTorch, run:
```bash
python convert_weights.py
```

## Run Inference
Simply run:
```bash
python run.py
```
The script will download a sample video, run inference on it, and save the count visualization. You can also specify a video path as an argument (either a local path or a YouTube/HTTP URL):
```bash
python run.py --video_path [video_path]
```
If the model does not produce good results, try running the script with additional stride values via `--strides`.
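
Following the paper, the final count is obtained by summing the inverse of the predicted per-frame period lengths over the frames classified as periodic. A minimal sketch of that aggregation step is shown below; the tensor names and the 0.5 periodicity threshold are illustrative, and the actual post-processing is presumably handled inside `run.py`:

```python
import torch

# Hypothetical per-frame predictions for a 64-frame clip:
# period_length[i] is the predicted period (in frames) at frame i,
# periodicity[i] is the probability that frame i belongs to a repetition.
period_length = torch.tensor([8.0] * 64)
periodicity = torch.tensor([0.9] * 64)

# Keep only the frames considered periodic and sum their per-frame contribution,
# i.e. 1 / period_length, as described in the RepNet paper.
is_periodic = periodicity > 0.5
count = (1.0 / period_length[is_periodic]).sum().item()
print(f'Estimated repetitions: {count:.1f}')  # 64 frames with period 8 -> 8 repetitions
```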

Example of a generated video showing the repetition count and periodicity score, together with the corresponding temporal self-similarity matrix:
<div align="center">
  <img src="https://raw.githubusercontent.com/materight/RepNet-pytorch/main/img/example5_score.gif" height="200" />
  <img src="https://raw.githubusercontent.com/materight/RepNet-pytorch/main/img/example5_tsm.png" height="200" />
</div>