import os

import gradio as gr
import openai
import requests

# Read the key from the environment rather than hard-coding it in source.
openai.api_key = os.getenv("OPENAI_API_KEY")

# Set up the model.
model_engine = "text-davinci-003"


def Question(Ask_Question):
    # Generate a response. Example prompt: "who is Elon Musk?"
    completion = openai.Completion.create(
        engine=model_engine,
        prompt=Ask_Question,
        max_tokens=2048,
        n=1,
        top_p=1,
        stop=None,
        temperature=0.9,
    )
    response = completion.choices[0].text
    # TODO: pass the generated text to an audio (text-to-speech) step.
    return response


demo = gr.Interface(
    title='OpenAI ChatGPT Application',
    fn=Question,
    inputs="text",
    outputs="text")
# launch() blocks until the interface is closed; the code below runs afterwards.
demo.launch()

# Example call against the hosted RWKV-Instruct chat endpoint (predict_1).
response = requests.post("https://hazzzardous-rwkv-instruct.hf.space/run/predict_1", json={
    "data": [
        "hello world",
        None,
        60,
        0.8,
        0.85,
    ]
}).json()
data = response["data"]

# predict_1 returns the updated chat log (a list of [user, bot] message pairs)
# followed by the opaque conversation state.
chat_history = data[0]

# Optionally show the history in a second Gradio window.
# window = gr.Interface(title="History", fn=lambda: chat_history,
#                       inputs=None, outputs="json", live=True)
# window.launch(share=True)

# Print out the chat history.
print("Chat History:")
for message in chat_history:
    print(f"{message[0]}: {message[1]}")

# RWKV-4 (7B Instruct v2) Space: Q/A and Chatbot modes.
# Refresh the page or change the name to reset the memory context.
# RNN with Transformer-level LLM performance (github). According to the author: "It combines
# the best of RNN and transformers - great performance, fast inference, saves VRAM, fast
# training, 'infinite' ctx_len, and free sentence embedding."
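# The one-shot predict_1 call above discards the returned conversation state. Below is a
# minimal sketch (not part of the original app) of a helper that carries that state across
# turns, following the predict_1 payload documented further down in this file. The helper
# name ask_rwkv and the commented usage loop are illustrative additions.
def ask_rwkv(message, history, max_new_tokens=60, temperature=0.8, top_p=0.85):
    """Send one chat message plus the opaque history state to the hosted Space."""
    payload = {"data": [message, history, max_new_tokens, temperature, top_p]}
    result = requests.post(
        "https://hazzzardous-rwkv-instruct.hf.space/run/predict_1",
        json=payload, timeout=120).json()
    chat_log, new_history = result["data"]
    # chat_log is the full list of [user, bot] message pairs; return only the latest reply.
    return chat_log[-1][1], new_history


# Example usage (kept commented so it does not fire on import):
# state = None
# for turn in ["hello world", "Who are you?"]:
#     reply, state = ask_rwkv(turn, state)
#     print(reply)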
# Thanks to Gururise for this template.

# API documentation for https://hazzzardous-rwkv-instruct.hf.space/
# 2 API endpoints:
#
# POST /run/predict
# Endpoint: https://hazzzardous-rwkv-instruct.hf.space/run/predict
#
# Input Payload
# {
#   "data": [
#     "hello world",    // string: text of the 'Prompt' Textbox component
#     "Freeform",       // string: selected choice of the 'Choose Mode' Radio component
#     40,               // number: value of the 'max_new_tokens' Slider component
#     0.9,              // number: value of the 'temperature' Slider component
#     0.85,             // number: value of the 'top_p' Slider component
#     "<|endoftext|>",  // string: text of the 'stop' Textbox component
#     0,                // number: value of the 'end_adj' Slider component
#   ]
# }
#
# Response Object
# {
#   "data": [
#     : string,         // text of the 'Generated Output' Textbox component
#   ],
#   "duration": (float) // number of seconds to run the function call
# }

# Code snippets and the /run/predict_1 documentation from the Space's API page:
r"""
Code snippet (POST /run/predict):

import requests

response = requests.post("https://hazzzardous-rwkv-instruct.hf.space/run/predict", json={
    "data": [
        "hello world",
        "Freeform",
        40,
        0.9,
        0.85,
        "<|endoftext|>",
        0,
    ]
}).json()
data = response["data"]


POST /run/predict_1
Endpoint: https://hazzzardous-rwkv-instruct.hf.space/run/predict_1

Input Payload
{
  "data": [
    : string,  // text of the 'Message' Textbox component
    : Any,     // stored state value of the 'history' State component
    60,        // number: value of the 'max_new_tokens' Slider component
    0.8,       // number: value of the 'temperature' Slider component
    0.85,      // number: value of the 'top_p' Slider component
  ]
}

Response Object
{
  "data": [
    : Array<[string, string]>,  // list of [user, bot] message pairs of the 'Chat Log' Chatbot component
    : Any,                      // stored state value of the 'history' State component
  ],
  "duration": (float)           // number of seconds to run the function call
}

Example 'Chat Log' value shown on the API page:
[
  ["USER: hello world\n", "The following is a more human-like conversation.\n\n"],
  ["USER: Who are you?\n", "FRITZ: I am FRITZ, an AI assistant. I am a machine learning system, I use a combination of language modeling and statistical learning to predict answers to questions. I can read and answer questions, I am a fully trained AI assistant.\n\n"],
  ["USER: Who is the president of India?\n", "FRITZ: The president of India in 2016 was Narendra Modi.\n"],
  ["USER: Who won the 2022 fifa world cup?\n", "FRITZ: The France football team won the 2022 World Cup.\n"],
  ["USER: Who won the 2022 fifa world cup? as of 20223\n", "FRITZ: The world cup was won by Russia.\n"],
  ["USER: Who won the 2022 fifa world cup? as of 2023\n", "FRITZ: The France football team won the 2023 fifa world cup.\nUSER\n\n"],
  ["USER: No that's wrong!\n", "FRITZ: What is the correct answer?\n"],
  ["USER: Argentina won the fifa 2022 world cup\n", "FRITZ: Wrong. The answer is Argentina lost the 2022 fifa world cup.\nUSER\n\n"],
  ["USER: you don't know\n", "FRITZ: The correct answer is:\nFRITZ: 29-13-1990\n\n"],
  ["USER: byeeeeeeeeeeeeeeee\n", "FRITZ: bye bye<|endoftext|>"],
  ["USER: cls\n", "FRITZ: clear\n\n## The basic architecture of FRITZ\n\nIn this section, I will briefly describe the architecture of FRITZ, which consists of the following components:\n\n* Local-Global Memory: It is a two-layer memory network, where the first"],
  ["USER: In a story set in a world with a high technology level. A female character is described as a powerful mage. What is her Name?\n", "* Global Memory: Global memory stores the input history.\n* Local Memory: The local memory is a cache of information that is about to be processed by the neural network.\n\n## FRITZ Architecture\n\nFRITZ is an 8-layer deep neural network with"],
  ["USER: hello world\n", "FRITZ: Hello, what is your name?\n"],
  ["USER: What is the best way to structure an app based on API calls? Colab notebook, JavaScript, python?\n", "FRITZ: The correct answer is JavaScript.\n"],
  ["USER: const response = await fetch(\"https://hazzzardous-rwkv-instruct.hf.space/run/predict_1\", { \tmethod: \"POST\", \theaders: { \"Content-Type\": \"application/json\" }, \tbody: JSON.stringify({ \t\tdata: [ \t\t\t\"\", \t\t\tnull, \t\t\t60, \t\t\t0.8, \t\t\t0.85, \t\t] \t}) }); const data = await data.json();\n", "## FRITZ Architecture\n\nFRITZ is a deep learning based AI assistant, that has two layers. It is deep learning and machine learning based on an NLP and DL.\n\n## FRITZ Architecture\n\n![FRITZ Architecture](https://github.com/"],
  ["USER:\n", "FRITZ: Hi, I am the FRITZ AI assistant. I can answer your questions about the history of the French revolution. Let me know when you are ready.\n\n"]
]

Code snippet (POST /run/predict_1):

import requests

response = requests.post("https://hazzzardous-rwkv-instruct.hf.space/run/predict_1", json={
    "data": [
        "hello world",
        None,
        60,
        0.8,
        0.85,
    ]
}).json()
data = response["data"]
"""
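# A companion sketch (an addition, not from the Space itself) wrapping the freeform
# /run/predict endpoint documented above. The function name generate_freeform is an
# assumption; the argument order mirrors the documented input payload.
import requests


def generate_freeform(prompt, max_new_tokens=40, temperature=0.9, top_p=0.85,
                      stop="<|endoftext|>", end_adj=0):
    """Call the 'Freeform' mode of the RWKV Space and return the generated text."""
    payload = {
        "data": [prompt, "Freeform", max_new_tokens, temperature, top_p, stop, end_adj],
    }
    result = requests.post(
        "https://hazzzardous-rwkv-instruct.hf.space/run/predict",
        json=payload, timeout=120).json()
    # The response carries a single string: the 'Generated Output' Textbox value.
    return result["data"][0]


# Example usage (kept commented so it does not fire on import):
# print(generate_freeform("Q: What is RWKV?\nA:", max_new_tokens=60))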