Apply for community grant: Academic project
Hi there,
We are a pair of PhD students currently studying at Imperial College London.
We are lucky enough to have access to a cluster of A100 GPUs here at Imperial, which we have used to train a couple of Llama-based LLMs on large instruction datasets with a focus on intellectual property. We would love to make these models available for people to try out so that we can learn more about how people interact with this kind of model. However, the models require at least 16 GB of RAM, which at your current pricing would add up quickly, making a demo lasting more than a couple of days inaccessible for us.
We anticipate that the model will attract a lot of traffic, which could in turn encourage visitors to try out other Spaces and models on HF!
Please let us know if you would like any more information.
Kind regards,
Egheosa Ogbomo