Uni-3DAR (Text-to-3D)

Commit 5619406 (verified) · committed by nielsr · 1 Parent(s): 4a2409f

Improve model card


The model is not compatible with the Transformers library, so we update the `library_name` metadata.
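The description above amounts to a one-key change in the README front matter; for reference, a sketch of the resulting header, with all field values taken from the diff in this commit:

```yaml
---
license: mit
library_name: pytorch   # changed from: transformers
pipeline_tag: text-to-3d
---
```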

Files changed (1): README.md (+9 −13)

README.md CHANGED
@@ -1,6 +1,6 @@
 ---
 license: mit
-library_name: transformers
+library_name: pytorch
 pipeline_tag: text-to-3d
 ---
 
@@ -8,8 +8,7 @@ Uni-3DAR
 ========
 [[Paper](https://arxiv.org/pdf/2503.16278)]
 
-Introduction
-------------
+## Introduction
 
 <p align="center"><img src="fig/overview.png" width=95%></p>
 <p align="center"><b>Schematic illustration of the Uni-3DAR framework</b></p>
@@ -29,21 +28,18 @@ Uni-3DAR is an autoregressive model that unifies various 3D tasks. In particular
 Building on octree compression, Uni-3DAR further tokenizes fine-grained 3D patches to maintain structural details, achieving substantially better generation quality than previous diffusion-based models.
 
 
-News
-----
+## News
 
 **2025-03-21:** We have released the core model along with the QM9 training and inference pipeline.
 
 
-Dependencies
-------------
+## Dependencies
 
 - [Uni-Core](https://github.com/dptech-corp/Uni-Core). For convenience, you can use our prebuilt Docker image:
 `docker pull dptechnology/unicore:2407-pytorch2.4.0-cuda12.5-rdma`
 
 
-Reproducing Results on QM9
---------------------------
+## Reproducing Results on QM9
 
 To reproduce results on the QM9 dataset using our pretrained model or train from scratch, please follow the instructions below.
 
@@ -64,22 +60,22 @@ bash inference_qm9.sh qm9.pt
 To train the model from scratch:
 
 1. Extract the dataset:
-```
+```bash
 tar -xzvf qm9_data.tar.gz
 ```
 
 2. Run the training script with your desired data path and experiment name:
 
-```
+```bash
 base_dir=/your_folder_to_save/ bash train_qm9.sh ./qm9_data/ name_of_your_exp
 ```
 
 
-Citation
---------
+## Citation
 
 Please kindly cite our papers if you use the data/code/model.
 
+bibtex
 ```
 @article{lu2025uni3dar,
  author = {Shuqi Lu and Haowei Lin and Lin Yao and Zhifeng Gao and Xiaohong Ji and Weinan E and Linfeng Zhang and Guolin Ke},
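The README's training invocation, `base_dir=/your_folder_to_save/ bash train_qm9.sh ./qm9_data/ name_of_your_exp`, uses Bash's per-command environment assignment: the variable is visible inside the script but does not leak into the calling shell. A minimal sketch of the pattern — `demo_train.sh` is a hypothetical stand-in, not the real `train_qm9.sh`:

```shell
# Write a stand-in script that reads base_dir from the environment
# (with a fallback default) and takes data path and experiment name as args.
cat > /tmp/demo_train.sh <<'EOF'
#!/bin/bash
echo "save_dir=${base_dir:-./results/} data=$1 exp=$2"
EOF

# The assignment before the command sets base_dir for that command only.
base_dir=/my_ckpts/ bash /tmp/demo_train.sh ./qm9_data/ my_exp
# prints: save_dir=/my_ckpts/ data=./qm9_data/ exp=my_exp
```

Because the assignment is scoped to the single command, running the script twice with different `base_dir` values needs no `export` or cleanup between runs.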