---
license: apache-2.0
inference: false
---
Query: Serve your models directly from Hugging Face infrastructure and run large scale NLP models in milliseconds with just a few lines of code
Top 5 clarifications generated:

- are you looking for a suitable cloud platform to run your models on (Score: 0.3862)
- are you looking for a quick test or a more complex model (Score: 0.3364)
- how would you like your nlp model to be used (Score: 0.3249)
- are you looking for a suitable ldl to use as a server or a client (Score: 0.3182)
- how would you like to consume the nlp model (Score: 0.2842)
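The example above lists candidate clarifying questions ranked by score. As a minimal sketch, the selection of a top-k list from scored candidates can look like the following; the candidate strings and scores are copied from the example, while the `top_k` helper is hypothetical (the card does not specify how the scores are produced or consumed).

```python
# Hypothetical sketch: selecting the top-k clarifying questions by score.
# The (question, score) pairs below are copied from the example output above;
# the scoring model itself is not specified in this card.
candidates = [
    ("are you looking for a suitable cloud platform to run your models on", 0.3862),
    ("are you looking for a quick test or a more complex model", 0.3364),
    ("how would you like your nlp model to be used", 0.3249),
    ("are you looking for a suitable ldl to use as a server or a client", 0.3182),
    ("how would you like to consume the nlp model", 0.2842),
]

def top_k(scored, k=5):
    """Return the k highest-scoring clarifications, best first."""
    return sorted(scored, key=lambda pair: pair[1], reverse=True)[:k]

for question, score in top_k(candidates):
    print(f"{question} (Score: {score:.4f})")
```

Sorting by the second element of each pair keeps the output in the same best-first order shown in the example.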