How can I get the whole model list?

#1
by tastypear - opened

I'm curious about which models the serverless Inference API supports. The official documentation doesn't seem to give a specific list.

I just tested meta-llama/Meta-Llama-3.1-405B-Instruct-FP8 and it's available.
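One way to at least enumerate candidate models is the Hub's public model-listing endpoint (`https://huggingface.co/api/models`), which accepts query parameters such as `search` and `limit`. Note this is an assumption-laden sketch: it lists models on the Hub, not an official list of what the serverless Inference API actually serves, so each model would still need to be tested individually (as done above with Meta-Llama-3.1-405B-Instruct-FP8).

```python
import json
import urllib.parse
import urllib.request

# Public Hub listing endpoint (lists Hub models, not serverless availability).
API_BASE = "https://huggingface.co/api/models"


def build_url(base: str, params: dict) -> str:
    """Build a query URL for the Hub model-listing endpoint."""
    return base + "?" + urllib.parse.urlencode(params)


def list_models(search: str = "", limit: int = 10) -> list:
    """Fetch model metadata entries; each entry includes an 'id' field."""
    url = build_url(API_BASE, {"search": search, "limit": limit})
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)


if __name__ == "__main__":
    # Example: look for Llama 3.1 variants by name.
    for model in list_models(search="Meta-Llama-3.1", limit=5):
        print(model.get("id"))
```

Whether any listed model responds on the serverless endpoint can then be checked by sending it a small test request, as in the original post.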
