---
tags:
- depth-estimation
library_name: coreml
license: apple-ascl
base_model:
- apple/DepthPro
---
# DepthPro CoreML Models
DepthPro is a monocular depth estimation model, meaning it is trained to predict depth from a single image.
## Model Variants
## Model Inputs and Outputs
### DepthPro Normalized Inverse Depth Models
#### Inputs

- `image`: 1536x1536 3-channel color image.

#### Outputs

- `normalizedInverseDepth`: 1536x1536 monochrome image.
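For reference, below is a minimal Swift sketch of how a model with this interface could be invoked through Core ML. It assumes the `.mlpackage` has been compiled into the app bundle as `DepthProNormalizedInverseDepthPruned10QuantizedLinear.mlmodelc`; the resource name and overall flow are illustrative, not taken from this repository.

```swift
import CoreML
import CoreGraphics
import CoreVideo

// Sketch: run the normalized-inverse-depth model on a CGImage.
// Assumes the compiled model is bundled with the app.
func normalizedInverseDepth(for image: CGImage) throws -> CVPixelBuffer? {
    let modelURL = Bundle.main.url(
        forResource: "DepthProNormalizedInverseDepthPruned10QuantizedLinear",
        withExtension: "mlmodelc")!
    let config = MLModelConfiguration()
    config.computeUnits = .all
    let model = try MLModel(contentsOf: modelURL, configuration: config)

    // The model expects a single 1536x1536 color image named "image".
    let constraint = model.modelDescription
        .inputDescriptionsByName["image"]!
        .imageConstraint!
    let imageValue = try MLFeatureValue(cgImage: image, constraint: constraint)

    let input = try MLDictionaryFeatureProvider(dictionary: ["image": imageValue])
    let output = try model.prediction(from: input)

    // The output is a 1536x1536 monochrome image named "normalizedInverseDepth".
    return output.featureValue(for: "normalizedInverseDepth")?.imageBufferValue
}
```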
### DepthPro Models
#### Inputs

- `image`: 1536x1536 3-channel color image.
- `originalWidth`: 1x1x1x1 tensor containing the original width of the image before resizing.

#### Outputs

- `depthMeters`: 1x1x1536x1536 tensor containing depth in meters.
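Analogously, here is a hedged sketch for the metric-depth variant: the pre-resize width is passed as a 1x1x1x1 tensor and the result comes back as an `MLMultiArray` of depth values in meters. As above, the compiled resource name and bundling setup are assumptions for illustration.

```swift
import CoreML
import CoreGraphics

// Sketch: run the metric-depth model on a CGImage.
// Assumes DepthProPruned10QuantizedLinear.mlmodelc is bundled with the app.
func depthMeters(for image: CGImage, originalWidth: Int) throws -> MLMultiArray? {
    let modelURL = Bundle.main.url(
        forResource: "DepthProPruned10QuantizedLinear",
        withExtension: "mlmodelc")!
    let model = try MLModel(contentsOf: modelURL)

    // 1536x1536 image input named "image".
    let constraint = model.modelDescription
        .inputDescriptionsByName["image"]!
        .imageConstraint!
    let imageValue = try MLFeatureValue(cgImage: image, constraint: constraint)

    // "originalWidth" is a 1x1x1x1 tensor holding the width of the image
    // before resizing, used to recover metric scale.
    let widthArray = try MLMultiArray(shape: [1, 1, 1, 1], dataType: .float32)
    widthArray[0] = NSNumber(value: Float(originalWidth))

    let input = try MLDictionaryFeatureProvider(dictionary: [
        "image": imageValue,
        "originalWidth": MLFeatureValue(multiArray: widthArray)
    ])
    let output = try model.prediction(from: input)

    // 1x1x1536x1536 tensor with depth in meters.
    return output.featureValue(for: "depthMeters")?.multiArrayValue
}
```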
## Download
Install `huggingface-cli`:

```bash
brew install huggingface-cli
```
To download one of the `.mlpackage` folders to the `models` directory:

```bash
huggingface-cli download \
  --local-dir models --local-dir-use-symlinks False \
  KeighBee/coreml-DepthPro \
  --include "DepthProNormalizedInverseDepthPruned10QuantizedLinear.mlpackage/*" "DepthProPruned10QuantizedLinear.mlpackage/*"
```
To download everything, skip the `--include` argument.
## Integrate in Swift apps
The [`huggingface/coreml-examples`](https://github.com/huggingface/coreml-examples) repository contains sample Swift code for `DepthProNormalizedInverseDepthPruned10QuantizedLinear.mlpackage` and other models. See the instructions there to build the demo app, which shows how to use the model in your own Swift apps.