How to send a function response back to the model

#9
by vikaspandey20 - opened

Based on the user prompt, the Llama 2 model gives me the function that needs to be executed together with the appropriate arguments. I want to pass the function's response back to the model so that it keeps the chat's context. How can this be accomplished?
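For anyone looking for a starting point: one common approach is to keep the whole conversation in a single message list, append the function's output as the next turn, and re-run generation. Below is a minimal sketch using the `transformers` chat-template API; the model id, the `get_weather` function name, and the `function_result` value are placeholders for illustration, not anything the model or library provides. Llama 2's chat template only knows system/user/assistant roles, so the result is wrapped in a user message rather than a dedicated "tool" role.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "meta-llama/Llama-2-7b-chat-hf"  # assumed checkpoint; use whichever one you loaded
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# Keep the entire conversation in one list so the model retains context.
messages = [
    {"role": "user", "content": "What's the weather in Delhi?"},
    # The model's reply that contained the function call, stored verbatim:
    {"role": "assistant", "content": '{"function": "get_weather", "arguments": {"city": "Delhi"}}'},
]

# Run the function yourself, then feed its result back as the next user turn.
function_result = '{"temperature_c": 31, "condition": "sunny"}'  # placeholder output
messages.append(
    {
        "role": "user",
        "content": f"Function get_weather returned: {function_result}. "
                   "Answer the original question using this result.",
    }
)

# Re-encode the full history and generate the model's final answer.
inputs = tokenizer.apply_chat_template(messages, return_tensors="pt").to(model.device)
output_ids = model.generate(inputs, max_new_tokens=256)
reply = tokenizer.decode(output_ids[0][inputs.shape[-1]:], skip_special_tokens=True)
print(reply)
```

The key point is that nothing is "sent back" through a special channel: you simply grow the message list with the function output and prompt the model again with the full history.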

Howdy, did you try this video?
