andreped committed on
Commit 48e4914 · unverified · 2 Parent(s): 1e01fca 8f90997

Merge pull request #37 from andreped/demo-fix

Files changed (2):
  1. README.md +13 -0
  2. demo/src/utils.py +1 -0
README.md CHANGED
@@ -19,6 +19,12 @@ app_file: demo/app.py
  <h1 align="center">DDMR: Deep Deformation Map Registration</h1>
  <h3 align="center">Learning deep abdominal CT registration through adaptive loss weighting and synthetic data generation</h3>

+ [![license](https://img.shields.io/github/license/DAVFoundation/captain-n3m0.svg?style=flat-square)](https://github.com/DAVFoundation/captain-n3m0/blob/master/LICENSE)
+ [![CI/CD](https://github.com/jpdefrutos/DDMR/actions/workflows/deploy.yml/badge.svg)](https://github.com/jpdefrutos/DDMR/actions/workflows/deploy.yml)
+ [![Paper](https://zenodo.org/badge/DOI/10.1371/journal.pone.0282110.svg)](https://doi.org/10.1371/journal.pone.0282110)
+ <a target="_blank" href="https://huggingface.co/spaces/andreped/DDMR"><img src="https://img.shields.io/badge/🤗%20Hugging%20Face-Spaces-yellow.svg"></a>
+
+
  **DDMR** was developed by SINTEF Health Research. The corresponding manuscript describing the framework has been published in [PLOS ONE](https://journals.plos.org/plosone/) and the manuscript is openly available [here](https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0282110).

  </div>
@@ -74,6 +80,13 @@ where:

  Use ```ddmr --help``` to see additional options like using precomputed segmentations to crop the images to the desired ROI, or debugging.

+ ## 🤗 Demo <a target="_blank" href="https://huggingface.co/spaces/andreped/DDMR"><img src="https://img.shields.io/badge/🤗%20Hugging%20Face-Spaces-yellow.svg"></a>
+
+ A live demo to easily test the best performing pretrained models was developed in Gradio and is deployed on `Hugging Face`.
+
+ To access the live demo, click on the `Hugging Face` badge above. Below is a snapshot of the current state of the demo app.
+
+
  ## 🏋️‍♂️ Training

  Use the "MultiTrain" scripts to launch the trainings, providing the necessary parameters. Those in the COMET folder accept a .ini configuration file (see COMET/train_config_files for example configurations).
demo/src/utils.py CHANGED
@@ -9,6 +9,7 @@ def load_ct_to_numpy(data_path):
      data_path = data_path.name

      image = nib.load(data_path)
+     print("original nibabel image shape:", image.shape)
      resampled = resample_to_output(image, None, order=0)
      data = resampled.get_fdata()

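
The one-line change above prints the volume shape before resampling, which makes it easy to compare against the shape after `resample_to_output`. Below is a self-contained toy example of the same nibabel calls; it is not part of the demo code, the synthetic image and voxel sizes are made up, and it assumes nibabel's default of an axis-aligned, roughly 1 mm isotropic output grid when `voxel_sizes` is `None`:

```python
# Toy illustration of the nibabel calls used in load_ct_to_numpy (not demo code).
import numpy as np
import nibabel as nib
from nibabel.processing import resample_to_output

# Synthetic CT-like volume with anisotropic voxels (1 x 1 x 2 mm).
affine = np.diag([1.0, 1.0, 2.0, 1.0])
image = nib.Nifti1Image(np.random.rand(32, 32, 16).astype(np.float32), affine)

print("original nibabel image shape:", image.shape)
# order=0 -> nearest-neighbour interpolation, matching the demo code above.
resampled = resample_to_output(image, None, order=0)
data = resampled.get_fdata()
print("resampled shape:", data.shape)  # more slices along z, since 2 mm spacing becomes ~1 mm
```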