---
title: README
emoji: π
colorFrom: yellow
colorTo: yellow
sdk: static
pinned: false
license: apache-2.0
---
Hierarchy Transformers (HiTs) are language models capable of interpreting and encoding hierarchies explicitly.
The code in [HierarchyTransformers](https://github.com/KRR-Oxford/HierarchyTransformers) extends [Sentence-Transformers](https://huggingface.co/sentence-transformers).
## Get Started
Install `hierarchy_transformers` (see our [repository](https://github.com/KRR-Oxford/HierarchyTransformers)) via `pip` or from `GitHub`.
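For example (the PyPI package name below is assumed to match the repository; check the repository's README for the authoritative instructions):

```shell
# Install from PyPI (package name assumed; verify in the repository)
pip install hierarchy_transformers

# Or install the latest development version directly from GitHub
pip install git+https://github.com/KRR-Oxford/HierarchyTransformers.git
```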
Use the following code to get started with HiTs:
```python
from hierarchy_transformers import HierarchyTransformer
from hierarchy_transformers.utils import get_torch_device
# Set up the device (falls back to CPU if no GPU is found)
gpu_id = 0
device = get_torch_device(gpu_id)

# Load the model
model = HierarchyTransformer.load_pretrained('Hierarchy-Transformers/HiT-MiniLM-L12-WordNet', device)

# Entity names to be encoded
entity_names = ["computer", "personal computer", "fruit", "berry"]

# Get the entity embeddings
entity_embeddings = model.encode(entity_names)
```
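HiT embeds entities in hyperbolic space, where hierarchical relationships can be probed via distances and norms. As an illustrative sketch only (not the library's own API, which provides its own distance utilities, and the model's Poincaré ball may use a model-specific curvature), the distance between two points in a unit Poincaré ball can be computed as:

```python
import numpy as np

def poincare_distance(u: np.ndarray, v: np.ndarray) -> float:
    """Distance between two points inside the unit Poincare ball (norms < 1)."""
    sq_dist = np.sum((u - v) ** 2)
    denom = (1.0 - np.sum(u ** 2)) * (1.0 - np.sum(v ** 2))
    return float(np.arccosh(1.0 + 2.0 * sq_dist / denom))

# Points near the boundary are far from the origin, which is why more
# general entities tend to sit closer to the centre of the ball.
origin = np.zeros(2)
near_boundary = np.array([0.9, 0.0])
print(poincare_distance(origin, near_boundary))
```

In practice, smaller hyperbolic distance between a parent and child embedding (combined with the child having a larger norm) is the kind of signal HiTs are trained to produce; consult the repository for the exact scoring functions used.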
## Datasets
The datasets for training and evaluating HiTs are available at [Zenodo](https://zenodo.org/doi/10.5281/zenodo.10511042).