
Commit f39f88c

Coach257 and shizhelun authored

[Feature] Support ExPose for SMPL-X estimation (#201)

- ExPose body + hand + face
- SMPL-X datasets
- expressive mesh estimator

Co-authored-by: shizhelun <[email protected]>

1 parent 208f37f


66 files changed (+7922 −330 lines)

README.md (4 additions, 0 deletions)

```diff
@@ -84,6 +84,7 @@ Supported methods:
 - [x] [PARE](https://pare.is.tue.mpg.de/) (ICCV'2021)
 - [x] [DeciWatch](https://ailingzeng.site/deciwatch) (arXiv'2022)
 - [x] [SmoothNet](https://ailingzeng.site/smoothnet) (arXiv'2022)
+- [x] [ExPose](https://expose.is.tue.mpg.de) (ECCV'2020)

 </details>

@@ -109,6 +110,9 @@ Supported datasets:
 - [x] [PoseTrack18](https://posetrack.net/users/download.php) (CVPR'2018)
 - [x] [SURREAL](https://www.di.ens.fr/willow/research/surreal/data/) (CVPR'2017)
 - [x] [UP3D](https://files.is.tuebingen.mpg.de/classner/up/) (CVPR'2017)
+- [x] [FreiHand](https://lmb.informatik.uni-freiburg.de/projects/freihand/) (ICCV'2019)
+- [x] [EHF](https://smpl-x.is.tue.mpg.de/) (CVPR'2019)
+- [x] [Stirling/ESRC-Face3D](http://pics.psych.stir.ac.uk/ESRC/index.htm) (FG'2018)

 </details>
```

README_CN.md (4 additions, 0 deletions; same entries as README.md)

```diff
@@ -84,6 +84,7 @@
 - [x] [PARE](https://pare.is.tue.mpg.de/) (ICCV'2021)
 - [x] [DeciWatch](https://ailingzeng.site/deciwatch) (arXiv'2022)
 - [x] [SmoothNet](https://ailingzeng.site/smoothnet) (arXiv'2022)
+- [x] [ExPose](https://expose.is.tue.mpg.de) (ECCV'2020)

 </details>

@@ -109,6 +110,9 @@
 - [x] [PoseTrack18](https://posetrack.net/users/download.php) (CVPR'2018)
 - [x] [SURREAL](https://www.di.ens.fr/willow/research/surreal/data/) (CVPR'2017)
 - [x] [UP3D](https://files.is.tuebingen.mpg.de/classner/up/) (CVPR'2017)
+- [x] [FreiHand](https://lmb.informatik.uni-freiburg.de/projects/freihand/) (ICCV'2019)
+- [x] [EHF](https://smpl-x.is.tue.mpg.de/) (CVPR'2019)
+- [x] [Stirling/ESRC-Face3D](http://pics.psych.stir.ac.uk/ESRC/index.htm) (FG'2018)

 </details>
```

configs/expose/README.md (new file, 141 additions, 0 deletions)
# ExPose

## Introduction

We provide the config files for ExPose: [Monocular Expressive Body Regression through Body-Driven Attention](https://arxiv.org/abs/2008.09062).

```BibTeX
@inproceedings{ExPose:2020,
  title = {Monocular Expressive Body Regression through Body-Driven Attention},
  author = {Choutas, Vasileios and Pavlakos, Georgios and Bolkart, Timo and Tzionas, Dimitrios and Black, Michael J.},
  booktitle = {European Conference on Computer Vision (ECCV)},
  pages = {20--40},
  year = {2020},
  url = {https://expose.is.tue.mpg.de}
}
```
## Notes

- [SMPL-X](https://smpl-x.is.tue.mpg.de/) v1.1 is used in our experiments.
- [FLAME](https://flame.is.tue.mpg.de/) 2019 is used in our experiments.
- [MANO](https://mano.is.tue.mpg.de/) v1.2 is used in our experiments.
- [SMPL](https://smpl.is.tue.mpg.de/) v1.0 is used for body evaluation on 3DPW.
- [all_means.pkl](https://openmmlab-share.oss-cn-hangzhou.aliyuncs.com/mmhuman3d/data/body_models/smplx/all_means.pkl?versionId=CAEQRBiBgIChyabujhgiIDQwNDMzNzlmM2U4ZTQzNWY5NjUxMmU4ZGQ4NGMwNmIx)
- [J_regressor_h36m.npy](https://openmmlab-share.oss-cn-hangzhou.aliyuncs.com/mmhuman3d/models/J_regressor_h36m.npy?versionId=CAEQHhiBgIDE6c3V6xciIDdjYzE3MzQ4MmU4MzQyNmRiZDA5YTg2YTI5YWFkNjRi)
- [MANO_SMPLX_vertex_ids.pkl](https://openmmlab-share.oss-cn-hangzhou.aliyuncs.com/mmhuman3d/data/body_models/smplx/MANO_SMPLX_vertex_ids.pkl?versionId=CAEQRBiBgIDjx9v4jhgiIDJjZjhiMWI1ZGRmMTRmMTI5MDVkMzJkMWUyYTQxZDk2)
- [shape_mean.npy](https://openmmlab-share.oss-cn-hangzhou.aliyuncs.com/mmhuman3d/data/body_models/smplx/shape_mean.npy?versionId=CAEQRBiBgIDqwKbujhgiIGM4OTIxMWM3MDNiNzQxN2RiOTRjNDIwZTNiMzdmMDVi)
- [SMPL-X__FLAME_vertex_ids.npy](https://openmmlab-share.oss-cn-hangzhou.aliyuncs.com/mmhuman3d/data/body_models/smplx/SMPL-X__FLAME_vertex_ids.npy?versionId=CAEQRBiBgMDUyNv4jhgiIDBlYzNkOTI2YzFlZjRmZWZiZTJkM2IwZGZhZjg4NzE5)
- [SMPLX_to_J14.npy](https://openmmlab-share.oss-cn-hangzhou.aliyuncs.com/mmhuman3d/data/body_models/smplx/SMPLX_to_J14.npy?versionId=CAEQRBiBgMDd26fujhgiIDQ3ODhmOGJhMzhhMzQ2M2Y4MTRlNDcxY2VjNmUzY2Qy)
- [flame_dynamic_embedding.npy](https://openmmlab-share.oss-cn-hangzhou.aliyuncs.com/mmhuman3d/data/body_models/smplx/flame_dynamic_embedding.npy?versionId=CAEQRBiBgMCn4abujhgiIDBmNmEzYTBiZmIzYjQ5NTg4MmVhZGRjYTYwNWU2MGRk)
- [flame_static_embedding.npy](https://openmmlab-share.oss-cn-hangzhou.aliyuncs.com/mmhuman3d/data/body_models/smplx/flame_static_embedding.pkl?versionId=CAEQRBiBgMCAxqbujhgiIGIzMTRiZjZkZjRhMDQ4NzA5YmU2YjQyMTNmYmQ5OWI5)
- [ExPose_curated_fits](https://expose.is.tue.mpg.de)
- [spin_in_smplx](https://expose.is.tue.mpg.de)
- [ffhq_annotations.npz](https://openmmlab-share.oss-cn-hangzhou.aliyuncs.com/mmhuman3d/datasets/ffhq_annotations.npz?versionId=CAEQRBiBgMCO46zvjhgiIDJhNDhlYTM2N2NmYjRmM2I4NWI2NDY0ZWM4NjExMzhm): we run [RingNet](https://ringnet.is.tue.mpg.de/) on FFHQ and then fit to FAN 2D landmarks using [flame-fitting](https://github.com/HavenFeng/photometric_optimization).
For the pretrained body model (hrnet_hmr_expose_body.pth), download it from [here](https://openmmlab-share.oss-cn-hangzhou.aliyuncs.com/mmhuman3d/models/expose/body/hrnet_hmr_expose_body-d7db2e53_20220708.pth?versionId=CAEQRBiBgMDFt6zujhgiIDMxODBkODE4ZTI5NjQ1OTRiN2I0MDM4NWMwOTA1NTFm) and update the pretrained-model path in the config. You can also pretrain the model yourself using [hrnet_hmr_expose_body.py](hrnet_hmr_expose_body.py).

For the pretrained face model (resnet18_hmr_expose_face.pth), download it from [here](https://openmmlab-share.oss-cn-hangzhou.aliyuncs.com/mmhuman3d/models/expose/face/resnet18_hmr_expose_face-aca68aad_20220708.pth?versionId=CAEQRBiBgMCbvbbujhgiIGMxY2RlMjUyMGY4MjRmMDhiM2VkM2VhNWU4Y2ZjODZi) and update the pretrained-model path in the config. You can also pretrain the model yourself using [resnet18_hmr_expose_face.py](resnet18_hmr_expose_face.py).

For the pretrained hand model (resnet18_hmr_expose_hand.pth), download it from [here](https://openmmlab-share.oss-cn-hangzhou.aliyuncs.com/mmhuman3d/models/expose/hand/resnet18_hmr_expose_hand-c6cf0236_20220708.pth?versionId=CAEQRBiBgIDvqbbujhgiIGFiZTI3YmFkOTMyMTQxZWNiYjQxYzU0NjM0N2U1ZGVh) and update the pretrained-model path in the config. You can also pretrain the model yourself using [resnet18_hmr_expose_hand.py](resnet18_hmr_expose_hand.py).
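Fetching the three checkpoints can be scripted. The sketch below is a hypothetical helper, not part of mmhuman3d: the URLs are the ones listed above, `download_checkpoints` and its `data/pretrained_models` default are our own assumptions.

```python
import os
import urllib.request

# Checkpoint URLs as listed in this README.
CHECKPOINT_URLS = {
    "hrnet_hmr_expose_body.pth": "https://openmmlab-share.oss-cn-hangzhou.aliyuncs.com/mmhuman3d/models/expose/body/hrnet_hmr_expose_body-d7db2e53_20220708.pth?versionId=CAEQRBiBgMDFt6zujhgiIDMxODBkODE4ZTI5NjQ1OTRiN2I0MDM4NWMwOTA1NTFm",
    "resnet18_hmr_expose_face.pth": "https://openmmlab-share.oss-cn-hangzhou.aliyuncs.com/mmhuman3d/models/expose/face/resnet18_hmr_expose_face-aca68aad_20220708.pth?versionId=CAEQRBiBgMCbvbbujhgiIGMxY2RlMjUyMGY4MjRmMDhiM2VkM2VhNWU4Y2ZjODZi",
    "resnet18_hmr_expose_hand.pth": "https://openmmlab-share.oss-cn-hangzhou.aliyuncs.com/mmhuman3d/models/expose/hand/resnet18_hmr_expose_hand-c6cf0236_20220708.pth?versionId=CAEQRBiBgIDvqbbujhgiIGFiZTI3YmFkOTMyMTQxZWNiYjQxYzU0NjM0N2U1ZGVh",
}


def download_checkpoints(root="data/pretrained_models",
                         fetch=urllib.request.urlretrieve):
    """Download each checkpoint into ``root``, skipping files that already
    exist. Returns the list of paths that were actually fetched."""
    os.makedirs(root, exist_ok=True)
    fetched = []
    for name, url in CHECKPOINT_URLS.items():
        target = os.path.join(root, name)
        if not os.path.exists(target):
            fetch(url, target)
            fetched.append(target)
    return fetched
```

The `fetch` parameter is injectable so the helper can be tested (or swapped for a retrying downloader) without hitting the network.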
Download the above resources and arrange them in the following file structure:

```text
mmhuman3d
├── mmhuman3d
├── docs
├── tests
├── tools
├── configs
└── data
    ├── body_models
    │   ├── all_means.pkl
    │   ├── J_regressor_h36m.npy
    │   ├── flame
    │   │   ├── FLAME_NEUTRAL.pkl
    │   │   ├── flame_dynamic_embedding.npy
    │   │   └── flame_static_embedding.npy
    │   ├── mano
    │   │   └── MANO_RIGHT.pkl
    │   ├── smpl
    │   │   ├── SMPL_FEMALE.pkl
    │   │   ├── SMPL_MALE.pkl
    │   │   └── SMPL_NEUTRAL.pkl
    │   └── smplx
    │       ├── all_means.pkl
    │       ├── MANO_SMPLX_vertex_ids.pkl
    │       ├── shape_mean.npy
    │       ├── SMPL-X__FLAME_vertex_ids.npy
    │       ├── SMPLX_to_J14.npy
    │       └── SMPLX_NEUTRAL.pkl
    ├── pretrained_models
    │   ├── hrnet_pretrain.pth
    │   ├── resnet18.pth
    │   ├── hrnet_hmr_expose_body.pth
    │   ├── resnet18_hmr_expose_face.pth
    │   └── resnet18_hmr_expose_hand.pth
    ├── preprocessed_datasets
    │   ├── curated_fits_train.npz
    │   ├── ehf_val.npz
    │   ├── ffhq_flame_train.npz
    │   ├── freihand_test.npz
    │   ├── freihand_train.npz
    │   ├── freihand_val.npz
    │   ├── h36m_smplx_train.npz
    │   ├── pw3d_test.npz
    │   ├── spin_smplx_train.npz
    │   └── stirling_ESRC3D_HQ.npz
    └── datasets
        ├── 3DPW
        ├── coco
        ├── EHF
        ├── ExPose_curated_fits
        │   └── train.npz
        ├── ffhq
        │   ├── ffhq_annotations.npz
        │   └── ffhq_global_images_1024
        ├── FreiHand
        ├── h36m
        ├── lsp
        │   ├── lsp_dataset_original
        │   └── lspet
        ├── mpii
        ├── spin_in_smplx
        │   ├── coco.npz
        │   ├── lsp.npz
        │   ├── lspet.npz
        │   └── mpii.npz
        └── stirling
            ├── annotations
            ├── F_3D_N
            ├── M_3D_N
            └── Subset_2D_FG2018
```
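A quick way to catch missing assets before launching training is to walk the expected layout. This is a hypothetical sanity-check sketch, not part of mmhuman3d; `EXPECTED_FILES` lists only a few key paths from the tree above for illustration.

```python
import os

# A subset of the key files from the layout above; extend as needed.
# Parent directories are implied by the file paths.
EXPECTED_FILES = [
    "data/body_models/smplx/SMPLX_NEUTRAL.pkl",
    "data/body_models/smplx/shape_mean.npy",
    "data/body_models/flame/FLAME_NEUTRAL.pkl",
    "data/body_models/mano/MANO_RIGHT.pkl",
    "data/pretrained_models/hrnet_hmr_expose_body.pth",
    "data/preprocessed_datasets/curated_fits_train.npz",
]


def check_layout(root=".", expected=EXPECTED_FILES):
    """Return the expected files that are missing under ``root``."""
    return [rel for rel in expected
            if not os.path.exists(os.path.join(root, rel))]
```

Running `check_layout()` from the repository root returns an empty list when everything is in place.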
## Results and Models

We evaluate hrnet_hmr_expose_body on 3DPW. Values are MPJPE/PA-MPJPE.

| Config | 3DPW | Download |
|:------:|:----:|:--------:|
| [hrnet_hmr_expose_body.py](hrnet_hmr_expose_body.py) | 92.59 / 60.43 | [model](https://openmmlab-share.oss-cn-hangzhou.aliyuncs.com/mmhuman3d/models/expose/body/hrnet_hmr_expose_body-d7db2e53_20220708.pth?versionId=CAEQRBiBgMDFt6zujhgiIDMxODBkODE4ZTI5NjQ1OTRiN2I0MDM4NWMwOTA1NTFm) &#124; [log](https://openmmlab-share.oss-cn-hangzhou.aliyuncs.com/mmhuman3d/models/expose/body/20220704_005929.log?versionId=CAEQRBiBgMDCt6zujhgiIGJiYzY0ODdlMGZlMjRjYmZhZDc5YTY2YzM0OTk0NDc3) |

We evaluate resnet18_hmr_expose_face on Stirling/ESRC 3D. Values are 3DRMSE.

| Config | Stirling/ESRC 3D | Download |
|:------:|:----------------:|:--------:|
| [resnet18_hmr_expose_face.py](resnet18_hmr_expose_face.py) | 2.40 | [model](https://openmmlab-share.oss-cn-hangzhou.aliyuncs.com/mmhuman3d/models/expose/face/resnet18_hmr_expose_face-aca68aad_20220708.pth?versionId=CAEQRBiBgMCbvbbujhgiIGMxY2RlMjUyMGY4MjRmMDhiM2VkM2VhNWU4Y2ZjODZi) &#124; [log](https://openmmlab-share.oss-cn-hangzhou.aliyuncs.com/mmhuman3d/models/expose/face/20220630_111340.log?versionId=CAEQRBiBgICFtLbujhgiIGUzYmEyOGU3N2ZkOTRkNDM5OTIyODZiOWQ1MzJiMWZj) |

We evaluate resnet18_hmr_expose_hand on FreiHand. Values are PA-MPJPE/PA-PVE.

| Config | FreiHand | Download |
|:------:|:--------:|:--------:|
| [resnet18_hmr_expose_hand.py](resnet18_hmr_expose_hand.py) | 10.03 / 9.61 | [model](https://openmmlab-share.oss-cn-hangzhou.aliyuncs.com/mmhuman3d/models/expose/hand/resnet18_hmr_expose_hand-c6cf0236_20220708.pth?versionId=CAEQRBiBgIDvqbbujhgiIGFiZTI3YmFkOTMyMTQxZWNiYjQxYzU0NjM0N2U1ZGVh) &#124; [log](https://openmmlab-share.oss-cn-hangzhou.aliyuncs.com/mmhuman3d/models/expose/hand/20220630_110254.log?versionId=CAEQRBiBgMCSuLbujhgiIDlmNDdhODg2MjA2NzQ1Njg5MTBlNWM1NDIxY2QyZmM2) |

We evaluate ExPose on EHF. Values are body PA-MPJPE / right-hand PA-MPJPE / left-hand PA-MPJPE / PA-PVE / right-hand PA-PVE / left-hand PA-PVE / face PA-PVE.

| Config | EHF | Download |
|:------:|:---:|:--------:|
| [expose.py](expose.py) | 55.70 / 14.6 / 14.4 / 56.65 / 14.6 / 14.5 / 6.90 | [model](https://openmmlab-share.oss-cn-hangzhou.aliyuncs.com/mmhuman3d/models/expose/expose/expose-d9d5dbf7_20220708.pth?versionId=CAEQRBiBgMC8vbbujhgiIDg0NWUyM2ZiZGY3MzQ0YmI5YjFjYTA0Y2Q5NDE3MDEw) |
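For reference, PA-MPJPE in the tables above is the mean per-joint position error after a Procrustes (similarity) alignment of the prediction to the ground truth. A minimal NumPy sketch of that metric (our own illustration, not mmhuman3d's implementation) could look like:

```python
import numpy as np


def pa_mpjpe(pred, gt):
    """Mean per-joint error (same units as input) after similarity alignment.

    pred, gt: (N, 3) arrays of predicted / ground-truth joint positions.
    """
    # Center both joint sets.
    x = pred - pred.mean(axis=0)
    y = gt - gt.mean(axis=0)
    # Optimal rotation via SVD of the cross-covariance (Umeyama/Kabsch).
    u, s, vt = np.linalg.svd(x.T @ y)
    d = np.sign(np.linalg.det(u @ vt))  # guard against reflections
    corr = np.diag([1.0, 1.0, d])
    rot = u @ corr @ vt
    # Optimal isotropic scale.
    scale = np.trace(np.diag(s) @ corr) / (x ** 2).sum()
    aligned = scale * x @ rot + gt.mean(axis=0)
    return np.linalg.norm(aligned - gt, axis=1).mean()
```

Because the alignment removes global rotation, scale, and translation, PA-MPJPE isolates pose accuracy, which is why it is reported alongside plain MPJPE.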
