---
title: README
emoji: π
colorFrom: yellow
colorTo: yellow
sdk: static
pinned: false
license: apache-2.0
---
Hierarchy Transformers (HiTs) are language models capable of explicitly interpreting and encoding hierarchies.

The code in [HierarchyTransformers](https://github.com/KRR-Oxford/HierarchyTransformers) extends [Sentence-Transformers](https://huggingface.co/sentence-transformers).
## Get Started

Install `hierarchy_transformers` (see our [repository](https://github.com/KRR-Oxford/HierarchyTransformers)) through `pip` or from GitHub.

Use the following code to get started with HiTs:
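For instance, installation from PyPI might look like this (assuming the package name matches the import name; check the repository for the authoritative instructions):

```shell
# install from PyPI (package name assumed from the import name)
pip install hierarchy_transformers
```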
```python
from hierarchy_transformers import HierarchyTransformer
from hierarchy_transformers.utils import get_torch_device

# set up the device (falls back to CPU if no GPU is found)
gpu_id = 0
device = get_torch_device(gpu_id)

# load the pre-trained model
model = HierarchyTransformer.load_pretrained('Hierarchy-Transformers/HiT-MiniLM-L12-WordNet', device)

# entity names to be encoded
entity_names = ["computer", "personal computer", "fruit", "berry"]

# compute the entity embeddings
entity_embeddings = model.encode(entity_names)
```
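Unlike standard sentence embeddings, HiT embeddings live in a Poincaré ball (hyperbolic space), so relatedness between entities is judged by hyperbolic rather than Euclidean distance. The library exposes this through the model's manifold; as a self-contained illustration of the underlying metric, here is a minimal sketch of the Poincaré distance with curvature −1 (the function name and toy points are ours, not part of the library's API):

```python
import numpy as np

def poincare_distance(u: np.ndarray, v: np.ndarray) -> float:
    """Distance between two points inside the unit Poincaré ball."""
    sq_diff = np.sum((u - v) ** 2)
    denom = (1.0 - np.sum(u**2)) * (1.0 - np.sum(v**2))
    return float(np.arccosh(1.0 + 2.0 * sq_diff / denom))

# toy 2-D points: distances grow rapidly toward the ball's boundary
a = np.array([0.1, 0.0])
b = np.array([0.5, 0.0])
c = np.array([0.9, 0.0])
assert poincare_distance(a, b) < poincare_distance(a, c)
```

In the actual model, smaller hyperbolic distance between two entity embeddings indicates closer placement in the encoded hierarchy.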
## Datasets

The datasets for training and evaluating HiTs are available on [Zenodo](https://zenodo.org/doi/10.5281/zenodo.10511042).