---
license: apache-2.0
inference: false
---
**Query:** Serve your models directly from Hugging Face infrastructure and run large scale NLP models in milliseconds with just a few lines of code

**Top 5 clarifications generated:**

- are you looking for a suitable cloud platform to run your models on (Score: 0.3862)
- are you looking for a quick test or a more complex model (Score: 0.3364)
- how would you like your nlp model to be used (Score: 0.3249)
- are you looking for a suitable ldl to use as a server or a client (Score: 0.3182)
- how would you like to consume the nlp model (Score: 0.2842)
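The card does not show how to reproduce output like the example above. Below is a minimal sketch, assuming the model is a sequence-to-sequence (text2text) checkpoint loadable with 🤗 Transformers; `your-namespace/query-clarification-model` is a placeholder for this repository's id, and the printed beam scores are only a stand-in, since the card does not state how the ranking scores above were computed.

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Hypothetical repo id; replace with this model's actual Hub id.
model_id = "your-namespace/query-clarification-model"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

query = (
    "Serve your models directly from Hugging Face infrastructure and run "
    "large scale NLP models in milliseconds with just a few lines of code"
)

# Encode the query and generate five candidate clarification questions with
# beam search, keeping the per-sequence beam scores so the candidates can be ranked.
inputs = tokenizer(query, return_tensors="pt")
outputs = model.generate(
    **inputs,
    num_beams=5,
    num_return_sequences=5,
    max_length=64,
    return_dict_in_generate=True,
    output_scores=True,
)

for sequence, score in zip(outputs.sequences, outputs.sequences_scores):
    clarification = tokenizer.decode(sequence, skip_special_tokens=True)
    print(f"{clarification} (Score: {score.item():.4f})")
```

Since `inference: false` disables the hosted inference widget for this repository, running the model locally (or on your own endpoint) as sketched above is the expected way to try it.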