Update kadiApy_ragchain.py

kadiApy_ragchain.py  (+4 -4)
@@ -244,14 +244,14 @@ class KadiApyRagchain:
 
 
 
-    def formulate_question(self,
+    def formulate_question(self, code_contexts):
         """
         Generate a response using the retrieved contexts and the LLM.
         """
         prompt = f"""
         You are a Python programming assistant specialized in the "Kadi-APY" library.
         The "Kadi-APY" library is a Python package designed to facilitate interaction with the REST-like API of a software platform called Kadi4Mat.
-        Your task is to formulate the next logical question a programmer would ask themselves to implement and run the method provided in
+        Your task is to formulate the next logical question a programmer would ask themselves to implement and run the method provided in "Code-contexts".
 
         "Context" contains snippets from the source code and metadata that provide details about the method.
 
@@ -260,8 +260,8 @@ class KadiApyRagchain:
         - Focus on determining the entry point of the class to which the method belongs.
         - Avoid vague or general questions; be precise about the next actionable steps.
 
-
-        {
+        Code-contexts:
+        {code_contexts}
         """
         return self.llm.invoke(prompt).content
 
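For context, the change above completes a previously broken method: the old signature was truncated at `def formulate_question(self,`, and the f-string prompt contained a bare `{`, which is a syntax error in an f-string. A minimal runnable sketch of the corrected method is below. It assumes a LangChain-style LLM whose `invoke()` returns an object with a `.content` attribute; the `DummyLLM` class is a hypothetical stand-in used only so the sketch runs without a real model.

```python
class DummyLLM:
    """Hypothetical stand-in for self.llm; echoes the prompt back."""

    class _Response:
        def __init__(self, content):
            self.content = content

    def invoke(self, prompt):
        # A real LangChain chat model's invoke() also returns an object
        # exposing .content; here we just return the prompt itself.
        return self._Response(prompt)


class KadiApyRagchain:
    def __init__(self, llm):
        self.llm = llm

    def formulate_question(self, code_contexts):
        """
        Generate a response using the retrieved contexts and the LLM.
        """
        # {code_contexts} is interpolated here; the pre-fix version had a
        # bare "{", which raises SyntaxError inside an f-string.
        prompt = f"""
        You are a Python programming assistant specialized in the "Kadi-APY" library.
        Your task is to formulate the next logical question a programmer would ask
        themselves to implement and run the method provided in "Code-contexts".

        Code-contexts:
        {code_contexts}
        """
        return self.llm.invoke(prompt).content


chain = KadiApyRagchain(DummyLLM())
result = chain.formulate_question("def get_record(self, id): ...")
```

With the echoing stub, `result` is simply the fully interpolated prompt, which makes it easy to verify that the retrieved contexts actually land inside the `Code-contexts:` section.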