Siyona committed
Commit e1d5884 · 1 Parent(s): 9b9d268

Deploy TechTales AI on Hugging Face
Files changed (6)
  1. README.md +58 -10
  2. TestModelNames.py +5 -0
  3. main.py +91 -0
  4. requirements.txt +10 -0
  5. research_output.txt +22 -0
  6. tools.py +30 -0
README.md CHANGED
@@ -1,13 +1,61 @@
  ---
- title: TechTales AI
- emoji: 🚀
- colorFrom: gray
- colorTo: purple
- sdk: streamlit
- sdk_version: 1.44.0
- app_file: app.py
- pinned: false
- short_description: An AI-powered assistant built using LangChain and Claude API
  ---
 
- Check out the configuration reference at https://huggingface.co/docs/hub/spaces-config-reference
+ # 🤖 TechTales AI - Your AI-Powered Assistant
+
+ TechTales AI is an AI-powered search assistant built with LangChain and the Claude API. It lets users ask questions and receive structured responses. Follow the steps below to set up and run the application.
+
+ ## 🚀 Features
+ - Uses **Claude-3-5-Sonnet** for intelligent responses.
+ - Supports **tool integration** (search, wiki, save tools).
+ - Built with **Streamlit** for an interactive UI.
+ - Secure API key handling through a **.env** file.
+
  ---
+
+ ## 🛠️ Installation & Setup
+
+ ### **1️⃣ Clone the Repository**
+ ```bash
+ git clone https://huggingface.co/spaces/YourHuggingFaceRepo/TechTales-AI.git
+ cd TechTales-AI
+ ```
+
+ ### **2️⃣ Create a Virtual Environment (Optional but Recommended)**
+ ```bash
+ python -m venv venv
+ source venv/bin/activate   # On Mac/Linux
+ venv\Scripts\activate      # On Windows
+ ```
+
+ ### **3️⃣ Install Dependencies**
+ ```bash
+ pip install -r requirements.txt
+ ```
+
+ ### **4️⃣ Set Up API Keys**
+ This project uses the **Claude API**; create a `.env` file to store your API key.
+
+ #### **Create a `.env` file in the project root and add:**
+ ```env
+ ANTHROPIC_API_KEY=your_claude_api_key_here
+ ```
+ 💡 **Note:** Obtain your own API key from [Anthropic](https://www.anthropic.com/); no key is provided in this repository.
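The repo loads this key with `python-dotenv` at startup. As an illustration of what that loading step does, here is a minimal stdlib-only sketch (the `load_env_file` helper is a hypothetical stand-in for `load_dotenv`, not part of this repo):

```python
import os

def load_env_file(path=".env"):
    """Minimal stand-in for python-dotenv's load_dotenv: parse KEY=VALUE lines."""
    try:
        with open(path, encoding="utf-8") as f:
            for line in f:
                line = line.strip()
                # Skip blanks and comments; keep existing env vars untouched
                if line and not line.startswith("#") and "=" in line:
                    key, _, value = line.partition("=")
                    os.environ.setdefault(key.strip(), value.strip())
    except FileNotFoundError:
        pass  # no .env present; rely on the real environment

load_env_file()
api_key = os.environ.get("ANTHROPIC_API_KEY")
# langchain-anthropic picks the key up from the environment automatically
```

In the app itself this is just `load_dotenv()`; the sketch only shows why the `.env` file makes the key visible to the process.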
+
+ ### **5️⃣ Run the Application**
+ ```bash
+ streamlit run app.py
+ ```
+
+ This starts a local server; you can interact with TechTales AI in the browser.
+
+ ## 🔥 Contributing
+ Feel free to fork the repo and submit pull requests to improve TechTales AI! 😊
+
  ---
 
+ ## 📄 License
+ This project is licensed under the MIT License.
+
+ Happy coding! 🚀
+
TestModelNames.py ADDED
@@ -0,0 +1,5 @@
+ import anthropic
+
+ client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment; never commit a real key
+ models = client.models.list()
+ print(models)  # Shows the exact API-compatible model names
main.py ADDED
@@ -0,0 +1,91 @@
+ import streamlit as st
+ from dotenv import load_dotenv
+ from pydantic import BaseModel
+ from langchain_anthropic import ChatAnthropic
+ from langchain_core.prompts import ChatPromptTemplate
+ from langchain_core.output_parsers import PydanticOutputParser
+ from langchain.agents import create_tool_calling_agent, AgentExecutor
+ from tools import search_tool, wiki_tool, save_tool
+
+ # Load environment variables
+ load_dotenv()
+
+ # Define the response format
+ class ResearchResponse(BaseModel):
+     topic: str
+     summary: str
+     sources: list[str]
+     tools_used: list[str]
+
+ # Initialize Claude model with low token usage
+ llm = ChatAnthropic(model="claude-3-5-sonnet-20241022", max_tokens=300)
+
+ # Create parser
+ parser = PydanticOutputParser(pydantic_object=ResearchResponse)
+
+ # Define prompt template
+ prompt = ChatPromptTemplate.from_messages(
+     [
+         (
+             "system",
+             """
+             You are an AI assistant that helps with general questions.
+             Answer the user query, using tools where necessary. Do not answer offensive questions or respond to profanity.
+             Wrap the output in this format and provide no other text\n{format_instructions}
+             """,
+         ),
+         ("placeholder", "{chat_history}"),
+         ("human", "{query}"),
+         ("placeholder", "{agent_scratchpad}"),
+     ]
+ ).partial(format_instructions=parser.get_format_instructions())
+
+ # Define tools and create the agent
+ tools = [search_tool, wiki_tool, save_tool]
+ agent = create_tool_calling_agent(llm=llm, prompt=prompt, tools=tools)
+
+ # Initialize the agent executor
+ agent_executor = AgentExecutor(agent=agent, tools=tools, verbose=False)
+
+ # Streamlit UI
+ st.set_page_config(page_title="TechTales AI - Your AI-Powered Assistant", layout="centered")
+ st.title("💡 TechTales AI - Your AI-Powered Assistant")
+
+ # Initialize chat history in session state
+ if "messages" not in st.session_state:
+     st.session_state.messages = []
+
+ # Display chat history
+ for msg in st.session_state.messages:
+     st.chat_message(msg["role"]).write(msg["content"])
+
+ # Chat input with Enter button
+ query = st.text_input("I am your personal assistant - ask me anything...", key="query_input")
+ submit_button = st.button("Enter")
+
+ if submit_button and query:
+     # Add user message to history
+     st.session_state.messages.append({"role": "user", "content": query})
+     st.chat_message("user").write(query)
+
+     # Get AI response
+     raw_response = agent_executor.invoke({"query": query})
+
+     try:
+         # The agent returns a list of content blocks; parse the text of the first one
+         structured_response = parser.parse(raw_response.get("output")[0]["text"])
+         response_text = f"**Topic:** {structured_response.topic}\n\n**Summary:** {structured_response.summary}\n\n**Sources:** {', '.join(structured_response.sources)}"
+     except Exception:
+         response_text = "I'm sorry, I couldn't process that request."
+
+     # Add AI response to history
+     st.session_state.messages.append({"role": "assistant", "content": response_text})
+     st.chat_message("assistant").write(response_text)
+
+ # Clear Chat Button
+ if st.button("🗑️ Clear Chat"):
+     st.session_state.messages = []
+     st.rerun()
+
+ # Developer Name Display
+ st.markdown("---")
+ st.markdown("**Developed by: Pankaj Kumar**")
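The structured-response contract above (model output parsed into `ResearchResponse` fields) can be illustrated standalone. This is a sketch using only the stdlib, with a dataclass and a hypothetical `parse_response` helper standing in for the Pydantic model and `parser.parse()`; field names follow `ResearchResponse` in main.py:

```python
import json
from dataclasses import dataclass

@dataclass
class ResearchResponse:
    topic: str
    summary: str
    sources: list
    tools_used: list

def parse_response(raw_text: str) -> ResearchResponse:
    """Stand-in for parser.parse(): validate the JSON the model is asked to emit."""
    data = json.loads(raw_text)
    missing = {"topic", "summary", "sources", "tools_used"} - data.keys()
    if missing:
        raise ValueError(f"model output missing fields: {sorted(missing)}")
    return ResearchResponse(**data)

sample = '{"topic": "Paris", "summary": "Capital of France.", "sources": ["wikipedia"], "tools_used": ["wiki_tool"]}'
resp = parse_response(sample)
```

This is why the system prompt insists on "no other text": any prose around the JSON would make `parser.parse` fall through to the except branch.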
requirements.txt ADDED
@@ -0,0 +1,10 @@
+ langchain
+ wikipedia
+ langchain-community
+ langchain-openai
+ langchain-anthropic
+ python-dotenv
+ pydantic
+ duckduckgo-search
+ streamlit
+ streamlit-chat
research_output.txt ADDED
@@ -0,0 +1,22 @@
+ --- Research Output ---
+ Timestamp: 2025-03-24 17:17:52
+
+ Capital of France Research:
+
+ Capital City: Paris
+ Country: France
+ Official Status: Capital since 1944
+
+ Key Statistics:
+ - City Area: 105 km² (41 sq mi)
+ - City Population: 2,048,472 (2025 estimate)
+ - Metropolitan Population: 12,271,794 (2023)
+ - Percentage of National Population: ~19%
+
+ Historical Significance:
+ - Major global center since 17th century
+ - Known as "City of Light" since 19th century
+ - Centers of: finance, diplomacy, commerce, culture, fashion, gastronomy
+
+ Location: Center of Île-de-France region
+
tools.py ADDED
@@ -0,0 +1,30 @@
+ from langchain_community.tools import WikipediaQueryRun, DuckDuckGoSearchRun
+ from langchain_community.utilities import WikipediaAPIWrapper
+ from langchain.tools import Tool
+ from datetime import datetime
+
+ def save_to_txt(data: str, filename: str = "research_output.txt"):
+     timestamp = datetime.now().strftime("%Y-%m-%d %H:%M:%S")
+     formatted_text = f"--- Research Output ---\nTimestamp: {timestamp}\n\n{data}\n\n"
+
+     with open(filename, "a", encoding="utf-8") as f:
+         f.write(formatted_text)
+
+     return f"Data successfully saved to {filename}"
+
+ save_tool = Tool(
+     name="save_text_to_file",
+     func=save_to_txt,
+     description="Saves structured research data to a text file.",
+ )
+
+ search = DuckDuckGoSearchRun()
+ search_tool = Tool(
+     name="search",
+     func=search.run,
+     description="Search the web for information",
+ )
+
+ api_wrapper = WikipediaAPIWrapper(top_k_results=1, doc_content_chars_max=1000)
+ wiki_tool = WikipediaQueryRun(api_wrapper=api_wrapper)
+
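`save_to_txt` appends a timestamped block on every call, which is why research_output.txt accumulates entries across runs. Its file-writing behaviour can be exercised on its own (a sketch of the same function, writing to a temp file rather than `research_output.txt`):

```python
import os
import tempfile
from datetime import datetime

def save_to_txt(data: str, filename: str = "research_output.txt") -> str:
    # Mirrors tools.py: append (not overwrite) a timestamped block per call
    timestamp = datetime.now().strftime("%Y-%m-%d %H:%M:%S")
    formatted_text = f"--- Research Output ---\nTimestamp: {timestamp}\n\n{data}\n\n"
    with open(filename, "a", encoding="utf-8") as f:
        f.write(formatted_text)
    return f"Data successfully saved to {filename}"

path = os.path.join(tempfile.mkdtemp(), "out.txt")
save_to_txt("First entry", path)
save_to_txt("Second entry", path)
```

Opening in `"a"` (append) mode is the design choice that makes the saved file a running log; `"w"` would keep only the latest result.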