{ "cells": [ { "cell_type": "code", "execution_count": 18, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Defaulting to user installation because normal site-packages is not writeable\n", "Requirement already satisfied: openai in /Users/sk4467/Library/Python/3.9/lib/python/site-packages (1.54.3)\n", "Requirement already satisfied: jiter<1,>=0.4.0 in /Users/sk4467/Library/Python/3.9/lib/python/site-packages (from openai) (0.7.0)\n", "Requirement already satisfied: httpx<1,>=0.23.0 in /Users/sk4467/Library/Python/3.9/lib/python/site-packages (from openai) (0.27.2)\n", "Requirement already satisfied: anyio<5,>=3.5.0 in /Users/sk4467/Library/Python/3.9/lib/python/site-packages (from openai) (4.6.2.post1)\n", "Requirement already satisfied: tqdm>4 in /Users/sk4467/Library/Python/3.9/lib/python/site-packages (from openai) (4.67.0)\n", "Requirement already satisfied: typing-extensions<5,>=4.11 in /Users/sk4467/Library/Python/3.9/lib/python/site-packages (from openai) (4.12.2)\n", "Requirement already satisfied: sniffio in /Users/sk4467/Library/Python/3.9/lib/python/site-packages (from openai) (1.3.1)\n", "Collecting pydantic<3,>=1.9.0\n", " Downloading pydantic-2.10.2-py3-none-any.whl (456 kB)\n", "\u001b[K |████████████████████████████████| 456 kB 1.6 MB/s eta 0:00:01\n", "\u001b[?25hRequirement already satisfied: distro<2,>=1.7.0 in /Users/sk4467/Library/Python/3.9/lib/python/site-packages (from openai) (1.9.0)\n", "Requirement already satisfied: exceptiongroup>=1.0.2 in /Users/sk4467/Library/Python/3.9/lib/python/site-packages (from anyio<5,>=3.5.0->openai) (1.2.2)\n", "Requirement already satisfied: idna>=2.8 in /Users/sk4467/Library/Python/3.9/lib/python/site-packages (from anyio<5,>=3.5.0->openai) (3.10)\n", "Requirement already satisfied: certifi in /Users/sk4467/Library/Python/3.9/lib/python/site-packages (from httpx<1,>=0.23.0->openai) (2024.8.30)\n", "Requirement already satisfied: httpcore==1.* in /Users/sk4467/Library/Python/3.9/lib/python/site-packages (from httpx<1,>=0.23.0->openai) (1.0.7)\n", "Requirement already satisfied: h11<0.15,>=0.13 in /Users/sk4467/Library/Python/3.9/lib/python/site-packages (from httpcore==1.*->httpx<1,>=0.23.0->openai) (0.14.0)\n", "Requirement already satisfied: annotated-types>=0.6.0 in /Users/sk4467/Library/Python/3.9/lib/python/site-packages (from pydantic<3,>=1.9.0->openai) (0.7.0)\n", "Collecting pydantic-core==2.27.1\n", " Downloading pydantic_core-2.27.1-cp39-cp39-macosx_11_0_arm64.whl (1.8 MB)\n", "\u001b[K |████████████████████████████████| 1.8 MB 4.3 MB/s eta 0:00:01\n", "\u001b[?25hInstalling collected packages: pydantic-core, pydantic\n", " Attempting uninstall: pydantic-core\n", " Found existing installation: pydantic-core 2.23.4\n", " Uninstalling pydantic-core-2.23.4:\n", " Successfully uninstalled pydantic-core-2.23.4\n", " Attempting uninstall: pydantic\n", " Found existing installation: pydantic 1.6.2\n", " Uninstalling pydantic-1.6.2:\n", " Successfully uninstalled pydantic-1.6.2\n", "\u001b[31mERROR: pip's dependency resolver does not currently take into account all the packages that are installed. 
This behaviour is the source of the following dependency conflicts.\n", "tea-client 0.0.7 requires httpx[http2]~=0.14.2, but you have httpx 0.27.2 which is incompatible.\n", "tea-client 0.0.7 requires pydantic~=1.6.1, but you have pydantic 2.10.2 which is incompatible.\u001b[0m\n", "Successfully installed pydantic-2.10.2 pydantic-core-2.27.1\n", "\u001b[33mWARNING: You are using pip version 21.2.4; however, version 24.3.1 is available.\n", "You should consider upgrading via the '/Library/Developer/CommandLineTools/usr/bin/python3 -m pip install --upgrade pip' command.\u001b[0m\n", "Note: you may need to restart the kernel to use updated packages.\n" ] } ], "source": [ "%pip install openai\n", "# !pip install typing_extensions==4.7.1 --upgrade" ] }, { "cell_type": "code", "execution_count": 2, "metadata": {}, "outputs": [], "source": [ "import os\n", "\n", "# Read the API key from the environment instead of hard-coding the secret in the notebook\n", "openai_key = os.getenv(\"OPENAI_API_KEY\")" ] }, { "cell_type": "code", "execution_count": 3, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Energy cannot be created or destroyed; it only transforms forms." ] } ], "source": [ "from openai import OpenAI\n", "\n", "client = OpenAI(api_key=openai_key)\n", "\n", "stream = client.chat.completions.create(\n", "    model=\"gpt-4o-mini\",\n", "    messages=[{\"role\": \"user\", \"content\": \"Say this is a test, and explain the law of thermodynamics in 10 words\"}],\n", "    stream=True,\n", ")\n", "for chunk in stream:\n", "    if chunk.choices[0].delta.content is not None:\n", "        print(chunk.choices[0].delta.content, end=\"\")" ] }, { "cell_type": "code", "execution_count": 12, "metadata": {}, "outputs": [], "source": [ "with open('cleaned_output.txt', 'r') as file:\n", "    content = file.read()\n" ] }, { "cell_type": "code", "execution_count": 13, "metadata": {}, "outputs": [], "source": [ "prompt = \"\"\"\n", "Extract detailed project information from the following text and structure it in JSON format. The JSON should have each project as a main key, with tasks as subkeys. For each task, include the following fields: \"description\", \"priority\", \"assigned_to\", and \"current_status\". Use the conversation details to populate the values accurately. \n", "\n", "Text:\n", "'''\n", "{content}\n", "'''\n", "\n", "Expected JSON Output:\n", "{{\n", "    \"project_name_1\": {{\n", "        \"Task-1\": {{\n", "            \"description\": \"Brief description of the task\",\n", "            \"priority\": \"high/medium/low\",\n", "            \"assigned_to\": \"Person responsible\",\n", "            \"current_status\": \"Status of the task (e.g., completed, in progress, pending)\"\n", "        }},\n", "        \"Task-2\": {{\n", "            \"description\": \"Brief description of the task\",\n", "            \"priority\": \"high/medium/low\",\n", "            \"assigned_to\": \"Person responsible\",\n", "            \"current_status\": \"Status of the task (e.g., completed, in progress, pending)\"\n", "        }}\n", "    }},\n", "    \"project_name_2\": {{\n", "        \"Task-1\": {{\n", "            \"description\": \"Brief description of the task\",\n", "            \"priority\": \"high/medium/low\",\n", "            \"assigned_to\": \"Person responsible\",\n", "            \"current_status\": \"Status of the task (e.g., completed, in progress, pending)\"\n", "        }}\n", "    }}\n", "}}\n", "\n", "Follow this structure and ensure each project's tasks are accurately represented with the appropriate fields. 
Keep the output concise and relevant to the project information discussed in the text.\n", "\"\"\"" ] }, { "cell_type": "code", "execution_count": 14, "metadata": {}, "outputs": [], "source": [ "final_prompt= prompt.format(content=content)" ] }, { "cell_type": "code", "execution_count": 15, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "```json\n", "{\n", " \"Bonnie Plant Project\": {\n", " \"Task-1\": {\n", " \"description\": \"Coordinate with Nikate to send out email to Bonnie plans\",\n", " \"priority\": \"medium\",\n", " \"assigned_to\": \"Nikate\",\n", " \"current_status\": \"completed\"\n", " }\n", " },\n", " \"RAG Article and Blog Project\": {\n", " \"Task-1\": {\n", " \"description\": \"Add content to the RAG article and seek feedback\",\n", " \"priority\": \"high\",\n", " \"assigned_to\": \"Shahid S\",\n", " \"current_status\": \"in progress\"\n", " },\n", " \"Task-2\": {\n", " \"description\": \"Review RAG documentation and provide feedback\",\n", " \"priority\": \"high\",\n", " \"assigned_to\": \"Vivek\",\n", " \"current_status\": \"pending\"\n", " }\n", " },\n", " \"G Copilot Case Study\": {\n", " \"Task-1\": {\n", " \"description\": \"Develop a case study outline for G Copilot\",\n", " \"priority\": \"medium\",\n", " \"assigned_to\": \"Shahid S\",\n", " \"current_status\": \"in progress\"\n", " },\n", " \"Task-2\": {\n", " \"description\": \"Provide access credentials for G Copilot team\",\n", " \"priority\": \"medium\",\n", " \"assigned_to\": \"Shahid S\",\n", " \"current_status\": \"in progress\"\n", " }\n", " },\n", " \"Washington Government Project\": {\n", " \"Task-1\": {\n", " \"description\": \"Request summary from the team for submission\",\n", " \"priority\": \"high\",\n", " \"assigned_to\": \"Shahid S\",\n", " \"current_status\": \"pending\"\n", " }\n", " },\n", " \"Internal Tool Development\": {\n", " \"Task-1\": {\n", " \"description\": \"Organize thoughts and provide a high-level presentation on internal tool ideas\",\n", " \"priority\": \"medium\",\n", " \"assigned_to\": \"Shahid S\",\n", " \"current_status\": \"pending\"\n", " }\n", " },\n", " \"Website Innovation\": {\n", " \"Task-1\": {\n", " \"description\": \"Complete website development with integrated features\",\n", " \"priority\": \"medium\",\n", " \"assigned_to\": \"Jaspreet\",\n", " \"current_status\": \"in progress\"\n", " }\n", " }\n", "}\n", "```" ] } ], "source": [ "stream = client.chat.completions.create(\n", " model=\"gpt-4o\",\n", " messages=[{\"role\": \"user\", \"content\": final_prompt}],\n", " stream=True,\n", ")\n", "for chunk in stream:\n", " if chunk.choices[0].delta.content is not None:\n", " print(chunk.choices[0].delta.content, end=\"\")" ] }, { "cell_type": "code", "execution_count": 1, "metadata": {}, "outputs": [], "source": [ "## PHase-3" ] }, { "cell_type": "code", "execution_count": 3, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Defaulting to user installation because normal site-packages is not writeable\n", "Collecting pymongo\n", " Downloading pymongo-4.10.1-cp39-cp39-macosx_11_0_arm64.whl (781 kB)\n", "\u001b[K |████████████████████████████████| 781 kB 1.1 MB/s eta 0:00:01\n", "\u001b[?25hCollecting dnspython<3.0.0,>=1.16.0\n", " Downloading dnspython-2.7.0-py3-none-any.whl (313 kB)\n", "\u001b[K |████████████████████████████████| 313 kB 3.4 MB/s eta 0:00:01\n", "\u001b[?25hInstalling collected packages: dnspython, pymongo\n", "Successfully installed dnspython-2.7.0 pymongo-4.10.1\n", 
"\u001b[33mWARNING: You are using pip version 21.2.4; however, version 24.3.1 is available.\n", "You should consider upgrading via the '/Library/Developer/CommandLineTools/usr/bin/python3 -m pip install --upgrade pip' command.\u001b[0m\n", "Note: you may need to restart the kernel to use updated packages.\n" ] } ], "source": [ "%pip install pymongo" ] }, { "cell_type": "code", "execution_count": 6, "metadata": {}, "outputs": [], "source": [ "from pymongo import MongoClient\n", "\n", "def get_mongo_client():\n", " \"\"\"Connect to the MongoDB Atlas cluster.\"\"\"\n", " connection_string = \"mongodb+srv://shahid:Protondev%40456@cluster0.ruurd.mongodb.net/\"\n", " client = MongoClient(connection_string)\n", " return client\n", "\n", "def get_database():\n", " \"\"\"Connect to the task_management database.\"\"\"\n", " client = get_mongo_client()\n", " db = client[\"task_management\"]\n", " return db\n" ] }, { "cell_type": "code", "execution_count": 7, "metadata": {}, "outputs": [], "source": [ "def test_connection():\n", " db = get_database()\n", " print(\"Connected to MongoDB:\", db.list_collection_names())" ] }, { "cell_type": "code", "execution_count": 8, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Connected to MongoDB: ['weekly_tasks']\n" ] } ], "source": [ "test_connection()" ] }, { "cell_type": "code", "execution_count": 9, "metadata": {}, "outputs": [], "source": [ "def insert_weekly_task_data(json_data, week_identifier):\n", " \"\"\"\n", " Insert JSON data into the weekly_tasks collection.\n", "\n", " Args:\n", " json_data (dict): JSON object containing task data.\n", " week_identifier (str): A unique identifier for the week (e.g., \"Week_1\").\n", " \"\"\"\n", " db = get_database()\n", " collection = db[\"weekly_tasks\"]\n", " \n", " # Check if a document for the given week already exists\n", " existing_document = collection.find_one({\"week\": week_identifier})\n", " if existing_document:\n", " print(f\"Document for {week_identifier} already exists. 
Skipping insert.\")\n", " return\n", "\n", " # Insert the document if it doesn't already exist\n", " document = {\n", " \"week\": week_identifier,\n", " \"tasks\": json_data\n", " }\n", " result = collection.insert_one(document)\n", " print(f\"Inserted document with ID: {result.inserted_id}\")" ] }, { "cell_type": "code", "execution_count": 17, "metadata": {}, "outputs": [], "source": [ "json_data= '''{\n", " \"Bonnie Plant Project\": {\n", " \"Task-1\": {\n", " \"description\": \"Coordinate with Nikate to send out email to Bonnie plans\",\n", " \"priority\": \"medium\",\n", " \"assigned_to\": \"Nikate\",\n", " \"current_status\": \"completed\"\n", " }\n", " },\n", " \"RAG Article and Blog Project\": {\n", " \"Task-1\": {\n", " \"description\": \"Add content to the RAG article and seek feedback\",\n", " \"priority\": \"high\",\n", " \"assigned_to\": \"Shahid S\",\n", " \"current_status\": \"in progress\"\n", " },\n", " \"Task-2\": {\n", " \"description\": \"Review RAG documentation and provide feedback\",\n", " \"priority\": \"high\",\n", " \"assigned_to\": \"Vivek\",\n", " \"current_status\": \"pending\"\n", " }\n", " },\n", " \"G Copilot Case Study\": {\n", " \"Task-1\": {\n", " \"description\": \"Develop a case study outline for G Copilot\",\n", " \"priority\": \"medium\",\n", " \"assigned_to\": \"Shahid S\",\n", " \"current_status\": \"in progress\"\n", " },\n", " \"Task-2\": {\n", " \"description\": \"Provide access credentials for G Copilot team\",\n", " \"priority\": \"medium\",\n", " \"assigned_to\": \"Shahid S\",\n", " \"current_status\": \"in progress\"\n", " }\n", " },\n", " \"Washington Government Project\": {\n", " \"Task-1\": {\n", " \"description\": \"Request summary from the team for submission\",\n", " \"priority\": \"high\",\n", " \"assigned_to\": \"Shahid S\",\n", " \"current_status\": \"pending\"\n", " }\n", " },\n", " \"Internal Tool Development\": {\n", " \"Task-1\": {\n", " \"description\": \"Organize thoughts and provide a high-level presentation on internal tool ideas\",\n", " \"priority\": \"medium\",\n", " \"assigned_to\": \"Shahid S\",\n", " \"current_status\": \"pending\"\n", " }\n", " },\n", " \"Website Innovation\": {\n", " \"Task-1\": {\n", " \"description\": \"Complete website development with integrated features\",\n", " \"priority\": \"medium\",\n", " \"assigned_to\": \"Jaspreet\",\n", " \"current_status\": \"in progress\"\n", " }\n", " }\n", "}'''" ] }, { "cell_type": "code", "execution_count": 18, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Inserted document with ID: 673a4885353ff570ac1d3779\n" ] } ], "source": [ "week_identifier= \"Week_1\"\n", "insert_weekly_task_data(json_data, week_identifier)" ] }, { "cell_type": "code", "execution_count": 19, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Defaulting to user installation because normal site-packages is not writeable\n", "Collecting paperswithcode-client\n", " Downloading paperswithcode_client-0.3.1-py3-none-any.whl (24 kB)\n", "Collecting tea-console==0.0.6\n", " Downloading tea_console-0.0.6-py3-none-any.whl (12 kB)\n", "Collecting typer==0.3.2\n", " Downloading typer-0.3.2-py3-none-any.whl (21 kB)\n", "Collecting tea-client==0.0.7\n", " Downloading tea_client-0.0.7-py3-none-any.whl (11 kB)\n", "Collecting tea~=0.1.2\n", " Downloading tea-0.1.7-py3-none-any.whl (41 kB)\n", "\u001b[K |████████████████████████████████| 41 kB 1.0 MB/s eta 0:00:01\n", "\u001b[?25hCollecting pydantic~=1.6.1\n", " Downloading 
pydantic-1.6.2-py36.py37.py38-none-any.whl (99 kB)\n", "\u001b[K |████████████████████████████████| 99 kB 2.1 MB/s eta 0:00:011\n", "\u001b[?25hCollecting httpx[http2]~=0.14.2\n", " Downloading httpx-0.14.3-py3-none-any.whl (62 kB)\n", "\u001b[K |████████████████████████████████| 62 kB 2.8 MB/s eta 0:00:01\n", "\u001b[?25hCollecting tzlocal~=2.1\n", " Downloading tzlocal-2.1-py2.py3-none-any.whl (16 kB)\n", "Collecting rich~=9.11.0\n", " Downloading rich-9.11.1-py3-none-any.whl (195 kB)\n", "\u001b[K |████████████████████████████████| 195 kB 7.1 MB/s eta 0:00:01\n", "\u001b[?25hCollecting pytz~=2021.1\n", " Downloading pytz-2021.3-py2.py3-none-any.whl (503 kB)\n", "\u001b[K |████████████████████████████████| 503 kB 4.8 MB/s eta 0:00:01\n", "\u001b[?25hCollecting click<7.2.0,>=7.1.1\n", " Downloading click-7.1.2-py2.py3-none-any.whl (82 kB)\n", "\u001b[K |████████████████████████████████| 82 kB 2.0 MB/s eta 0:00:011\n", "\u001b[?25hCollecting rfc3986[idna2008]<2,>=1.3\n", " Downloading rfc3986-1.5.0-py2.py3-none-any.whl (31 kB)\n", "Collecting chardet==3.*\n", " Downloading chardet-3.0.4-py2.py3-none-any.whl (133 kB)\n", "\u001b[K |████████████████████████████████| 133 kB 4.2 MB/s eta 0:00:01\n", "\u001b[?25hRequirement already satisfied: certifi in /Users/sk4467/Library/Python/3.9/lib/python/site-packages (from httpx[http2]~=0.14.2->tea-client==0.0.7->paperswithcode-client) (2024.8.30)\n", "Collecting httpcore==0.10.*\n", " Downloading httpcore-0.10.2-py3-none-any.whl (48 kB)\n", "\u001b[K |████████████████████████████████| 48 kB 2.2 MB/s eta 0:00:011\n", "\u001b[?25hRequirement already satisfied: sniffio in /Users/sk4467/Library/Python/3.9/lib/python/site-packages (from httpx[http2]~=0.14.2->tea-client==0.0.7->paperswithcode-client) (1.3.1)\n", "Collecting h2==3.*\n", " Downloading h2-3.2.0-py2.py3-none-any.whl (65 kB)\n", "\u001b[K |████████████████████████████████| 65 kB 2.9 MB/s eta 0:00:011\n", "\u001b[?25hCollecting hyperframe<6,>=5.2.0\n", " Downloading hyperframe-5.2.0-py2.py3-none-any.whl (12 kB)\n", "Collecting hpack<4,>=3.0\n", " Downloading hpack-3.0.0-py2.py3-none-any.whl (38 kB)\n", "Collecting h11<0.10,>=0.8\n", " Downloading h11-0.9.0-py2.py3-none-any.whl (53 kB)\n", "\u001b[K |████████████████████████████████| 53 kB 2.2 MB/s eta 0:00:01\n", "\u001b[?25hRequirement already satisfied: idna in /Users/sk4467/Library/Python/3.9/lib/python/site-packages (from rfc3986[idna2008]<2,>=1.3->httpx[http2]~=0.14.2->tea-client==0.0.7->paperswithcode-client) (3.10)\n", "Collecting colorama<0.5.0,>=0.4.0\n", " Downloading colorama-0.4.6-py2.py3-none-any.whl (25 kB)\n", "Requirement already satisfied: pygments<3.0.0,>=2.6.0 in /Users/sk4467/Library/Python/3.9/lib/python/site-packages (from rich~=9.11.0->tea-console==0.0.6->paperswithcode-client) (2.18.0)\n", "Collecting typing-extensions<4.0.0,>=3.7.4\n", " Downloading typing_extensions-3.10.0.2-py3-none-any.whl (26 kB)\n", "Collecting commonmark<0.10.0,>=0.9.0\n", " Downloading commonmark-0.9.1-py2.py3-none-any.whl (51 kB)\n", "\u001b[K |████████████████████████████████| 51 kB 2.2 MB/s eta 0:00:01\n", "\u001b[?25hCollecting tea~=0.1.2\n", " Downloading tea-0.1.6-py3-none-any.whl (41 kB)\n", "\u001b[K |████████████████████████████████| 41 kB 1.2 MB/s eta 0:00:01\n", "\u001b[?25h Downloading tea-0.1.5-py3-none-any.whl (41 kB)\n", "\u001b[K |████████████████████████████████| 41 kB 427 kB/s eta 0:00:01\n", "\u001b[?25h Downloading tea-0.1.4-py3-none-any.whl (41 kB)\n", "\u001b[K |████████████████████████████████| 41 kB 617 kB/s eta 0:00:01\n", 
"\u001b[?25hCollecting psutil~=5.8.0\n", " Downloading psutil-5.8.0.tar.gz (470 kB)\n", "\u001b[K |████████████████████████████████| 470 kB 12.9 MB/s eta 0:00:01\n", "\u001b[?25hBuilding wheels for collected packages: psutil\n", " Building wheel for psutil (setup.py) ... \u001b[?25ldone\n", "\u001b[?25h Created wheel for psutil: filename=psutil-5.8.0-cp39-cp39-macosx_10_9_universal2.whl size=260525 sha256=f4aecba874d0f5983f0d73f276f033eb7f8a6f4fd7f070b2783d73818fb94eac\n", " Stored in directory: /Users/sk4467/Library/Caches/pip/wheels/ee/66/e6/aecfd75e0bd554fc1b4dd982e9088dbdc79d10c3601cf3d7f3\n", "Successfully built psutil\n", "Installing collected packages: rfc3986, h11, pytz, hyperframe, httpcore, hpack, chardet, tzlocal, typing-extensions, psutil, httpx, h2, commonmark, colorama, click, typer, tea, rich, pydantic, tea-console, tea-client, paperswithcode-client\n", " Attempting uninstall: h11\n", " Found existing installation: h11 0.14.0\n", " Uninstalling h11-0.14.0:\n", " Successfully uninstalled h11-0.14.0\n", " Attempting uninstall: httpcore\n", " Found existing installation: httpcore 1.0.6\n", " Uninstalling httpcore-1.0.6:\n", " Successfully uninstalled httpcore-1.0.6\n", " Attempting uninstall: typing-extensions\n", " Found existing installation: typing-extensions 4.12.2\n", " Uninstalling typing-extensions-4.12.2:\n", " Successfully uninstalled typing-extensions-4.12.2\n", " Attempting uninstall: psutil\n", " Found existing installation: psutil 6.1.0\n", " Uninstalling psutil-6.1.0:\n", " Successfully uninstalled psutil-6.1.0\n", " Attempting uninstall: httpx\n", " Found existing installation: httpx 0.27.2\n", " Uninstalling httpx-0.27.2:\n", " Successfully uninstalled httpx-0.27.2\n", " Attempting uninstall: pydantic\n", " Found existing installation: pydantic 2.9.2\n", " Uninstalling pydantic-2.9.2:\n", " Successfully uninstalled pydantic-2.9.2\n", "\u001b[31mERROR: pip's dependency resolver does not currently take into account all the packages that are installed. 
This behaviour is the source of the following dependency conflicts.\n", "pydantic-core 2.23.4 requires typing-extensions!=4.7.0,>=4.6.0, but you have typing-extensions 3.10.0.2 which is incompatible.\n", "openai 1.54.3 requires httpx<1,>=0.23.0, but you have httpx 0.14.3 which is incompatible.\n", "openai 1.54.3 requires pydantic<3,>=1.9.0, but you have pydantic 1.6.2 which is incompatible.\n", "openai 1.54.3 requires typing-extensions<5,>=4.11, but you have typing-extensions 3.10.0.2 which is incompatible.\n", "anyio 4.6.2.post1 requires typing-extensions>=4.1; python_version < \"3.11\", but you have typing-extensions 3.10.0.2 which is incompatible.\u001b[0m\n", "Successfully installed chardet-3.0.4 click-7.1.2 colorama-0.4.6 commonmark-0.9.1 h11-0.9.0 h2-3.2.0 hpack-3.0.0 httpcore-0.10.2 httpx-0.14.3 hyperframe-5.2.0 paperswithcode-client-0.3.1 psutil-5.8.0 pydantic-1.6.2 pytz-2021.3 rfc3986-1.5.0 rich-9.11.1 tea-0.1.4 tea-client-0.0.7 tea-console-0.0.6 typer-0.3.2 typing-extensions-3.10.0.2 tzlocal-2.1\n", "\u001b[33mWARNING: You are using pip version 21.2.4; however, version 24.3.1 is available.\n", "You should consider upgrading via the '/Library/Developer/CommandLineTools/usr/bin/python3 -m pip install --upgrade pip' command.\u001b[0m\n", "Note: you may need to restart the kernel to use updated packages.\n" ] } ], "source": [ "%pip install paperswithcode-client" ] }, { "cell_type": "code", "execution_count": 1, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Defaulting to user installation because normal site-packages is not writeable\n", "Collecting notion-client\n", " Downloading notion_client-2.2.1-py2.py3-none-any.whl (13 kB)\n", "Collecting httpx>=0.15.0\n", " Using cached httpx-0.27.2-py3-none-any.whl (76 kB)\n", "Requirement already satisfied: anyio in /Users/sk4467/Library/Python/3.9/lib/python/site-packages (from httpx>=0.15.0->notion-client) (4.6.2.post1)\n", "Requirement already satisfied: sniffio in /Users/sk4467/Library/Python/3.9/lib/python/site-packages (from httpx>=0.15.0->notion-client) (1.3.1)\n", "Collecting httpcore==1.*\n", " Downloading httpcore-1.0.7-py3-none-any.whl (78 kB)\n", "\u001b[K |████████████████████████████████| 78 kB 1.3 MB/s eta 0:00:01\n", "\u001b[?25hRequirement already satisfied: certifi in /Users/sk4467/Library/Python/3.9/lib/python/site-packages (from httpx>=0.15.0->notion-client) (2024.8.30)\n", "Requirement already satisfied: idna in /Users/sk4467/Library/Python/3.9/lib/python/site-packages (from httpx>=0.15.0->notion-client) (3.10)\n", "Collecting h11<0.15,>=0.13\n", " Using cached h11-0.14.0-py3-none-any.whl (58 kB)\n", "Collecting typing-extensions>=4.1\n", " Using cached typing_extensions-4.12.2-py3-none-any.whl (37 kB)\n", "Requirement already satisfied: exceptiongroup>=1.0.2 in /Users/sk4467/Library/Python/3.9/lib/python/site-packages (from anyio->httpx>=0.15.0->notion-client) (1.2.2)\n", "Installing collected packages: typing-extensions, h11, httpcore, httpx, notion-client\n", " Attempting uninstall: typing-extensions\n", " Found existing installation: typing-extensions 3.10.0.2\n", " Uninstalling typing-extensions-3.10.0.2:\n", " Successfully uninstalled typing-extensions-3.10.0.2\n", " Attempting uninstall: h11\n", " Found existing installation: h11 0.9.0\n", " Uninstalling h11-0.9.0:\n", " Successfully uninstalled h11-0.9.0\n", " Attempting uninstall: httpcore\n", " Found existing installation: httpcore 0.10.2\n", " Uninstalling httpcore-0.10.2:\n", " Successfully uninstalled httpcore-0.10.2\n", 
" Attempting uninstall: httpx\n", " Found existing installation: httpx 0.14.3\n", " Uninstalling httpx-0.14.3:\n", " Successfully uninstalled httpx-0.14.3\n", "\u001b[31mERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.\n", "tea-client 0.0.7 requires httpx[http2]~=0.14.2, but you have httpx 0.27.2 which is incompatible.\n", "rich 9.11.1 requires typing-extensions<4.0.0,>=3.7.4, but you have typing-extensions 4.12.2 which is incompatible.\n", "openai 1.54.3 requires pydantic<3,>=1.9.0, but you have pydantic 1.6.2 which is incompatible.\u001b[0m\n", "Successfully installed h11-0.14.0 httpcore-1.0.7 httpx-0.27.2 notion-client-2.2.1 typing-extensions-4.12.2\n", "\u001b[33mWARNING: You are using pip version 21.2.4; however, version 24.3.1 is available.\n", "You should consider upgrading via the '/Library/Developer/CommandLineTools/usr/bin/python3 -m pip install --upgrade pip' command.\u001b[0m\n", "Note: you may need to restart the kernel to use updated packages.\n" ] } ], "source": [ "%pip install notion-client" ] }, { "cell_type": "code", "execution_count": 24, "metadata": {}, "outputs": [], "source": [ "from notion_client import Client\n", "\n", "# Initialize Notion client with your integration token\n", "notion = Client(auth=\"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\")\n", "\n", "# Create a new database or append rows to an existing one\n", "parent_page_id = \"148b2f92b9948099a854e8b21a0640a3\" # Replace with your parent page ID\n", "\n", "# Define properties of your new database\n", "database_properties = {\n", " \"Name\": {\"title\": {}},\n", " \"Age\": {\"number\": {}},\n", " \"Role\": {\"rich_text\": {}},\n", "}\n", "\n", "# Create a new database\n", "database = notion.databases.create(\n", " parent={\"type\": \"page_id\", \"page_id\": parent_page_id},\n", " title=[{\"type\": \"text\", \"text\": {\"content\": \"My JSON Table\"}}],\n", " properties=database_properties,\n", ")\n", "\n", "database_id = database[\"id\"]\n", "\n", "# Add rows to the database\n", "json_data = [\n", " {\"Name\": \"Alice\", \"Age\": 25, \"Role\": \"Engineer\"},\n", " {\"Name\": \"Bob\", \"Age\": 30, \"Role\": \"Designer\"}\n", "]\n", "\n", "for row in json_data:\n", " notion.pages.create(\n", " parent={\"database_id\": database_id},\n", " properties={\n", " \"Name\": {\"title\": [{\"type\": \"text\", \"text\": {\"content\": row[\"Name\"]}}]},\n", " \"Age\": {\"number\": row[\"Age\"]},\n", " \"Role\": {\"rich_text\": [{\"type\": \"text\", \"text\": {\"content\": row[\"Role\"]}}]},\n", " }\n", " )\n" ] }, { "cell_type": "code", "execution_count": 41, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Database created with ID: 14db2f92-b994-81fb-9132-f4e4cb46ac13\n" ] } ], "source": [ "from notion_client import Client\n", "\n", "# Initialize Notion client\n", "notion = Client(auth=\"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\")\n", "\n", "# Page ID where the database will be created\n", "parent_page_id = \"148b2f92b9948099a854e8b21a0640a3\"\n", "\n", "# Define the database schema\n", "database_schema = {\n", " \"parent\": {\"type\": \"page_id\", \"page_id\": parent_page_id},\n", " \"title\": [{\"type\": \"text\", \"text\": {\"content\": \"Task Dashboard\"}}],\n", " \"properties\": {\n", " \"Project Name\": {\"title\": {}},\n", " \"Task ID\": {\"rich_text\": {}},\n", " \"Description\": {\"rich_text\": {}},\n", " \"Priority\": {\"select\": 
{\"options\": [\n", " {\"name\": \"high\", \"color\": \"red\"},\n", " {\"name\": \"medium\", \"color\": \"yellow\"},\n", " {\"name\": \"low\", \"color\": \"green\"}\n", " ]}},\n", " \"Assigned To\": {\"rich_text\": {}},\n", " \"Current Status\": {\"select\": {\"options\": [\n", " {\"name\": \"completed\", \"color\": \"blue\"},\n", " {\"name\": \"in progress\", \"color\": \"yellow\"},\n", " {\"name\": \"pending\", \"color\": \"orange\"}\n", " ]}},\n", " \"Created At\": {\"date\": {}}\n", " }\n", "}\n", "\n", "# Create the database\n", "response = notion.databases.create(**database_schema)\n", "\n", "# Print the database ID\n", "print(\"Database created with ID:\", response[\"id\"])\n" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "14db2f92-b994-8140-bee6-d4540d75c374\n", "\n", "14db2f92-b994-81fb-9132-f4e4cb46ac13" ] }, { "cell_type": "code", "execution_count": 51, "metadata": {}, "outputs": [], "source": [ "from pymongo import DESCENDING\n", "mongo_client = MongoClient(\"mongodb+srv://shahid:Protondev%40456@cluster0.ruurd.mongodb.net/\") # Replace with your MongoDB URI\n", "db = mongo_client[\"task_management\"]\n", "employee_project_collection = db[\"employee_project\"]\n", "def fetch_latest_task_entry():\n", " \"\"\"\n", " Fetch the most recent entry from MongoDB.\n", " Returns:\n", " dict: The latest task entry as a dictionary.\n", " \"\"\"\n", " latest_entry = employee_project_collection.find_one(sort=[(\"created_at\", DESCENDING)])\n", " if latest_entry:\n", " return latest_entry\n", " else:\n", " raise ValueError(\"No entries found in MongoDB.\")" ] }, { "cell_type": "code", "execution_count": 52, "metadata": {}, "outputs": [], "source": [ "latest_entry= fetch_latest_task_entry()" ] }, { "cell_type": "code", "execution_count": 49, "metadata": {}, "outputs": [], "source": [ "notion = Client(auth=\"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\")\n", "parent_page_id = \"148b2f92b9948099a854e8b21a0640a3\" \n", "notion_database_id = \"14db2f92-b994-81fb-9132-f4e4cb46ac13\"\n", "from datetime import datetime\n", "def push_to_notion(latest_entry):\n", " \"\"\"\n", " Push tasks from the latest entry to the Notion database.\n", " Args:\n", " latest_entry (dict): The most recent task data from MongoDB.\n", " \"\"\"\n", " # Extract the tasks from the JSON\n", " tasks = latest_entry.get(\"consolidated_final_task\", {})\n", " created_at = latest_entry.get(\"created_at\", None)\n", "\n", " # Iterate over projects and their tasks\n", " for project_name, task_list in tasks.items():\n", " for task_id, task_details in task_list.items():\n", " # Map MongoDB fields to Notion properties\n", " notion_task = {\n", " \"parent\": {\"database_id\": notion_database_id},\n", " \"properties\": {\n", " \"Project Name\": {\"title\": [{\"type\": \"text\", \"text\": {\"content\": project_name}}]},\n", " \"Task ID\": {\"rich_text\": [{\"type\": \"text\", \"text\": {\"content\": task_id}}]},\n", " \"Description\": {\"rich_text\": [{\"type\": \"text\", \"text\": {\"content\": task_details.get(\"description\", \"\")}}]},\n", " \"Priority\": {\"select\": {\"name\": task_details.get(\"priority\", \"low\")}},\n", " \"Assigned To\": {\"rich_text\": [{\"type\": \"text\", \"text\": {\"content\": task_details.get(\"assigned_to\", \"\")}}]}, # Updated to rich_text\n", " \"Current Status\": {\"select\": {\"name\": task_details.get(\"current_status\", \"pending\")}},\n", " \"Created At\": {\"date\": {\"start\": created_at.isoformat() if created_at else datetime.utcnow().isoformat()}}\n", " }\n", " }\n", "\n", 
" # Push each task to Notion\n", " try:\n", " response = notion.pages.create(**notion_task)\n", " print(f\"Task pushed to Notion: {response['id']}\")\n", " except Exception as e:\n", " print(f\"Failed to push task {task_id} to Notion: {e}\")" ] }, { "cell_type": "code", "execution_count": 53, "metadata": {}, "outputs": [], "source": [ "def push_to_notion(latest_entry):\n", " \"\"\"\n", " Push tasks from the latest entry to the Notion database.\n", " Args:\n", " latest_entry (dict): The most recent task data from MongoDB.\n", " \"\"\"\n", " # Extract the tasks from the JSON\n", " tasks = latest_entry.get(\"consolidated_final_task\", {})\n", " created_at = latest_entry.get(\"created_at\", None)\n", "\n", " # Step 1: Clear existing tasks in Notion database\n", " \n", " try:\n", " # Query all pages in the Notion database (this will fetch the existing tasks)\n", " notion_database = notion.databases.query(database_id=notion_database_id)\n", " \n", " # Loop through the database pages and delete them\n", " for page in notion_database['results']:\n", " notion.pages.update(page_id=page['id'], archived=True)\n", " print(\"Old tasks removed from Notion successfully.\")\n", " except Exception as e:\n", " print(f\"Failed to clear tasks in Notion: {e}\")\n", "\n", "# Step 2: Push new tasks to Notion\n", " try:\n", " # Iterate over projects and their tasks\n", " for project_name, task_list in tasks.items():\n", " for task_id, task_details in task_list.items():\n", " # Map MongoDB fields to Notion properties\n", " notion_task = {\n", " \"parent\": {\"database_id\": notion_database_id},\n", " \"properties\": {\n", " \"Project Name\": {\"title\": [{\"type\": \"text\", \"text\": {\"content\": project_name}}]},\n", " \"Task ID\": {\"rich_text\": [{\"type\": \"text\", \"text\": {\"content\": task_id}}]},\n", " \"Description\": {\"rich_text\": [{\"type\": \"text\", \"text\": {\"content\": task_details.get(\"description\", \"\")}}]},\n", " \"Priority\": {\"select\": {\"name\": task_details.get(\"priority\", \"low\")}},\n", " \"Assigned To\": {\"rich_text\": [{\"type\": \"text\", \"text\": {\"content\": task_details.get(\"assigned_to\", \"\")}}]}, # Updated to rich_text\n", " \"Current Status\": {\"select\": {\"name\": task_details.get(\"current_status\", \"pending\")}},\n", " \"Created At\": {\"date\": {\"start\": created_at.isoformat() if created_at else datetime.utcnow().isoformat()}}\n", " }\n", " }\n", "\n", " # Push each task to Notion\n", " response = notion.pages.create(**notion_task)\n", " print(f\"Task pushed to Notion: {response['id']}\")\n", " except Exception as e:\n", " print(f\"Failed to push tasks to Notion: {e}\")\n" ] }, { "cell_type": "code", "execution_count": 54, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Failed to clear tasks in Notion: 'PagesEndpoint' object has no attribute 'delete'\n", "Task pushed to Notion: 14db2f92-b994-81e4-b057-f312d91ce256\n", "Task pushed to Notion: 14db2f92-b994-8130-a8a8-d6c5933a5ddf\n", "Task pushed to Notion: 14db2f92-b994-816e-8816-d17e67a74c93\n", "Task pushed to Notion: 14db2f92-b994-812e-b042-ea13beed124b\n", "Task pushed to Notion: 14db2f92-b994-8110-bf20-ff01d43b2e57\n", "Task pushed to Notion: 14db2f92-b994-8187-8fee-ee2477162d77\n", "Task pushed to Notion: 14db2f92-b994-81df-b592-d5f838200d56\n", "Task pushed to Notion: 14db2f92-b994-811b-8fb5-d2221492bb52\n", "Task pushed to Notion: 14db2f92-b994-819d-b72c-cb380ffaa08c\n", "Task pushed to Notion: 14db2f92-b994-81f5-be9a-ead09cd9eac5\n", "Task pushed to Notion: 
14db2f92-b994-81bd-8e24-f3a789c0ad37\n", "Task pushed to Notion: 14db2f92-b994-816d-a458-ec4af0da65c7\n", "Task pushed to Notion: 14db2f92-b994-813d-bbbd-dd17295e98bd\n", "Task pushed to Notion: 14db2f92-b994-81c0-b7e5-f60003ca6d8b\n", "Task pushed to Notion: 14db2f92-b994-8155-af5d-d4e674bd73fe\n", "Task pushed to Notion: 14db2f92-b994-8102-899e-dcd990824e7f\n", "Task pushed to Notion: 14db2f92-b994-81a0-9e10-ebb6e3578d4a\n", "Task pushed to Notion: 14db2f92-b994-8114-a190-d8f23b824d6e\n" ] } ], "source": [ "push_to_notion(latest_entry)" ] }, { "cell_type": "code", "execution_count": 30, "metadata": {}, "outputs": [ { "ename": "AttributeError", "evalue": "'function' object has no attribute 'get'", "output_type": "error", "traceback": [ "\u001b[0;31m---------------------------------------------------------------------------\u001b[0m", "\u001b[0;31mAttributeError\u001b[0m Traceback (most recent call last)", "Cell \u001b[0;32mIn[30], line 1\u001b[0m\n\u001b[0;32m----> 1\u001b[0m tasks \u001b[38;5;241m=\u001b[39m \u001b[43mfinal_taks\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mget\u001b[49m(\u001b[38;5;124m\"\u001b[39m\u001b[38;5;124mconsolidated_final_task\u001b[39m\u001b[38;5;124m\"\u001b[39m, {})\n\u001b[1;32m 2\u001b[0m created_at \u001b[38;5;241m=\u001b[39m final_taks\u001b[38;5;241m.\u001b[39mget(\u001b[38;5;124m\"\u001b[39m\u001b[38;5;124mcreated_at\u001b[39m\u001b[38;5;124m\"\u001b[39m, \u001b[38;5;28;01mNone\u001b[39;00m)\n", "\u001b[0;31mAttributeError\u001b[0m: 'function' object has no attribute 'get'" ] } ], "source": [ "tasks = final_taks.get(\"consolidated_final_task\", {})\n", "created_at = final_taks.get(\"created_at\", None)" ] }, { "cell_type": "code", "execution_count": 29, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "" ] }, "execution_count": 29, "metadata": {}, "output_type": "execute_result" } ], "source": [ "final_taks" ] }, { "cell_type": "code", "execution_count": 7, "metadata": {}, "outputs": [], "source": [ "from pymongo import DESCENDING\n", "def fetch_recent_two_entries():\n", " \"\"\"\n", " Fetch the two most recent entries from the weekly_tasks collection\n", " based on the created_at timestamp.\n", " \n", " Returns:\n", " list: A list of the two most recent documents from the collection.\n", " \"\"\"\n", " db = get_database()\n", " collection = db[\"weekly_tasks\"]\n", "\n", " # Query to fetch the two most recent entries\n", " recent_entries = list(\n", " collection.find().sort(\"created_at\", DESCENDING).limit(2)\n", " )\n", " return recent_entries" ] }, { "cell_type": "code", "execution_count": 9, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "{'_id': ObjectId('6746fead3d4e2084b16de53d'), 'week': '2024_Week_48', 'unique_id': '79f4c789-31d9-493b-948b-1a714935de86', 'tasks': {'Bonnie Plans': {'Task-1': {'description': 'Vivek mentioned that everything needed from Bonnie Plans is completed', 'priority': 'high', 'assigned_to': 'Nikate', 'current_status': 'completed'}}, 'RAG Article and Blog': {'Task-1': {'description': 'Add content to the RAG article and blog, and seek feedback', 'priority': 'high', 'assigned_to': 'Shahid S', 'current_status': 'in progress'}, 'Task-2': {'description': 'Review RAG documentation and provide feedback', 'priority': 'high', 'assigned_to': 'Vivek', 'current_status': 'pending'}}, 'G Copilot Case Study': {'Task-1': {'description': 'Prepare G Copilot case study based on templates', 'priority': 'medium', 'assigned_to': 'Shahid S', 'current_status': 'in progress'}, 'Task-2': 
{'description': 'Provide feedback on G Copilot case study', 'priority': 'medium', 'assigned_to': 'Vivek', 'current_status': 'pending'}}, 'Internal Tool': {'Task-1': {'description': 'Conceptualize internal tool and present high-level ideas', 'priority': 'low', 'assigned_to': 'Shahid S', 'current_status': 'pending'}}, \"Jaspreet's Projects\": {'Task-1': {'description': \"Access Github repo and review Jaspreet's code\", 'priority': 'medium', 'assigned_to': 'Shahid S', 'current_status': 'pending'}}, 'Washington Government Project': {'Task-1': {'description': 'Write a high-level case study for the Washington Government project', 'priority': 'high', 'assigned_to': 'Shahid S', 'current_status': 'pending'}}, 'Marketing and Website Development': {'Task-1': {'description': 'Scope the requirements for the new website development', 'priority': 'medium', 'assigned_to': 'Shahid S', 'current_status': 'pending'}}, 'BFSI and AI Use Cases': {'Task-1': {'description': 'Discuss BFSI use cases and explore edge cases in AI implementations', 'priority': 'medium', 'assigned_to': 'Shahid S', 'current_status': 'in progress'}}}, 'created_at': datetime.datetime(2024, 11, 27, 16, 42, 45, 178000)}\n", "{'_id': ObjectId('6746feaa3d4e2084b16de53b'), 'week': '2024_Week_48', 'unique_id': '2cf272c8-be42-423c-bba2-75f0202499b7', 'tasks': {'Bonnie Plans': {'Task-1': {'description': 'Vivek mentioned that everything needed from Bonnie Plans is completed', 'priority': 'high', 'assigned_to': 'Nikate', 'current_status': 'completed'}}, 'RAG Article and Blog': {'Task-1': {'description': 'Add content to the RAG article and blog, and seek feedback', 'priority': 'high', 'assigned_to': 'Shahid S', 'current_status': 'in progress'}, 'Task-2': {'description': 'Review RAG documentation and provide feedback', 'priority': 'high', 'assigned_to': 'Vivek', 'current_status': 'pending'}}, 'G Copilot Case Study': {'Task-1': {'description': 'Prepare G Copilot case study based on templates', 'priority': 'medium', 'assigned_to': 'Shahid S', 'current_status': 'in progress'}, 'Task-2': {'description': 'Provide feedback on G Copilot case study', 'priority': 'medium', 'assigned_to': 'Vivek', 'current_status': 'pending'}}, 'Internal Tool': {'Task-1': {'description': 'Conceptualize internal tool and present high-level ideas', 'priority': 'low', 'assigned_to': 'Shahid S', 'current_status': 'pending'}}, \"Jaspreet's Projects\": {'Task-1': {'description': \"Access Github repo and review Jaspreet's code\", 'priority': 'medium', 'assigned_to': 'Shahid S', 'current_status': 'pending'}}, 'Washington Government Project': {'Task-1': {'description': 'Write a high-level case study for the Washington Government project', 'priority': 'high', 'assigned_to': 'Shahid S', 'current_status': 'pending'}}, 'Marketing and Website Development': {'Task-1': {'description': 'Scope the requirements for the new website development', 'priority': 'medium', 'assigned_to': 'Shahid S', 'current_status': 'pending'}}, 'BFSI and AI Use Cases': {'Task-1': {'description': 'Discuss BFSI use cases and explore edge cases in AI implementations', 'priority': 'medium', 'assigned_to': 'Shahid S', 'current_status': 'in progress'}}}, 'created_at': datetime.datetime(2024, 11, 27, 16, 42, 42, 919000)}\n" ] } ], "source": [ "recent_tasks = fetch_recent_two_entries()\n", "for task in recent_tasks:\n", " print(task)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "#####################" ] }, { "cell_type": "code", "execution_count": 11, "metadata": {}, "outputs": [], "source": [ "from pymongo 
import DESCENDING\n", "db = get_database()\n", "collection = db[\"weekly_tasks\"]\n", "\n", "# Query to fetch the two most recent entries\n", "recent_entries = list(\n", " collection.find().sort(\"created_at\", DESCENDING).limit(2)\n", ")\n", "# Extract task data from the entries\n", "old_task_data = recent_entries[1][\"tasks\"] # Older entry\n", "new_task_data = recent_entries[0][\"tasks\"] " ] }, { "cell_type": "code", "execution_count": 21, "metadata": {}, "outputs": [], "source": [ "from openai import OpenAI\n", "def compare_task_data(old_task_data, new_task_data):\n", " \"\"\"\n", " Send old and new task data to the LLM for comparison.\n", "\n", " Args:\n", " old_task_data (dict): JSON data for the older tasks.\n", " new_task_data (dict): JSON data for the newer tasks.\n", "\n", " Returns:\n", " dict: Consolidated JSON with updates and new tasks.\n", " \"\"\"\n", " # Prepare the prompt\n", " prompt = f\"\"\"\n", "\n", " Given the following two sets of task JSON data, compare them and:\n", "\n", " 1. Identify projects and tasks present in the second JSON but not in the first. \n", " - If two projects have different names but are contextually similar (e.g., due to spelling differences or tasks), treat them as the same project and merge their tasks.\n", "\n", " 2. For tasks that exist in both JSONs within the same project:\n", " - Compare the following fields:\n", " - \"description\"\n", " - \"priority\"\n", " - \"assigned_to\"\n", " - \"current_status\"\n", " - If any changes are detected in these fields, update the task details in the output.\n", "\n", " 3. If a project or task in the second JSON contains new tasks or subtasks not present in the first JSON:\n", " - Add those tasks or subtasks to the corresponding project in the output.\n", "\n", " 4. 
Ensure the final JSON structure meets the following conditions:\n", "        - Each project appears only once in the JSON.\n", "        - All tasks are uniquely represented under their respective projects.\n", "        - Updates to tasks (e.g., changes in \"priority\", \"assigned_to\", or \"current_status\") are applied.\n", "        - Tasks or subtasks are not duplicated across the output.\n", "\n", "    FIRST TASK DATA:\n", "    '''\n", "    {old_task_data}\n", "    '''\n", "\n", "    SECOND TASK DATA:\n", "    '''\n", "    {new_task_data}\n", "    '''\n", "\n", "    Expected Output:\n", "    A single consolidated JSON structure where:\n", "        - Projects are uniquely represented and merged based on contextual similarity.\n", "        - Each project contains all relevant tasks, including updates and newly added ones.\n", "        - All tasks follow this structure:\n", "\n", "    Return a single consolidated JSON structure with:\n", "    {{\n", "        \"project_name_1\": {{\n", "            \"Task-1\": {{\n", "                \"description\": \"Brief description of the task\",\n", "                \"priority\": \"high/medium/low\",\n", "                \"assigned_to\": \"Person responsible\",\n", "                \"current_status\": \"Status of the task (e.g., completed, in progress, pending)\"\n", "            }},\n", "            \"Task-2\": {{\n", "                \"description\": \"Brief description of the task\",\n", "                \"priority\": \"high/medium/low\",\n", "                \"assigned_to\": \"Person responsible\",\n", "                \"current_status\": \"Status of the task (e.g., completed, in progress, pending)\"\n", "            }}\n", "        }},\n", "        \"project_name_2\": {{\n", "            \"Task-1\": {{\n", "                \"description\": \"Brief description of the task\",\n", "                \"priority\": \"high/medium/low\",\n", "                \"assigned_to\": \"Person responsible\",\n", "                \"current_status\": \"Status of the task (e.g., completed, in progress, pending)\"\n", "            }}\n", "        }}\n", "    }}\n", "    \"\"\"\n", "\n", "\n", "    import os  # local import so this cell does not depend on earlier cells having run\n", "\n", "    # Read the API key from the environment instead of hard-coding the secret\n", "    client = OpenAI(api_key=os.getenv(\"OPENAI_API_KEY\"))\n", "\n", "    stream = client.chat.completions.create(\n", "        model=\"gpt-4o\",\n", "        messages=[{\"role\": \"user\", \"content\": prompt}],\n", "        # stream=True,\n", "    )\n", "    raw_response = stream.choices[0].message.content\n", "    # final_response= extract_json_from_raw_response(raw_response)\n", "    return raw_response" ] }, { "cell_type": "code", "execution_count": 22, "metadata": {}, "outputs": [], "source": [ "consolidated_json = compare_task_data(old_task_data, new_task_data)" ] }, { "cell_type": "code", "execution_count": 23, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Here is the consolidated JSON structure after comparing and merging the two sets of task data:\n", "\n", "```json\n", "{\n", " \"Bonnie Plans\": {\n", " \"Task-1\": {\n", " \"description\": \"Complete the Bony Plants project.\",\n", " \"priority\": \"high\",\n", " \"assigned_to\": \"Shahid S\",\n", " \"current_status\": \"completed\"\n", " }\n", " },\n", " \"RAG Article and Blog\": {\n", " \"Task-1\": {\n", " \"description\": \"Add content to the RAG article and blog, and seek feedback\",\n", " \"priority\": \"high\",\n", " \"assigned_to\": \"Shahid S\",\n", " \"current_status\": \"in progress\"\n", " },\n", " \"Task-2\": {\n", " \"description\": \"Review RAG documentation and provide feedback\",\n", " \"priority\": \"high\",\n", " \"assigned_to\": \"Vivek\",\n", " \"current_status\": \"pending\"\n", " }\n", " },\n", " \"G Copilot Case Study\": {\n", " \"Task-1\": {\n", " \"description\": \"Draft the initial case study document for G 
Copilot.\",\n", " \"priority\": \"medium\",\n", " \"assigned_to\": \"Shahid S\",\n", " \"current_status\": \"pending\"\n", " },\n", " \"Task-2\": {\n", " \"description\": \"Provide feedback on G Copilot case study\",\n", " \"priority\": \"medium\",\n", " \"assigned_to\": \"Vivek\",\n", " \"current_status\": \"pending\"\n", " }\n", " },\n", " \"Internal Tool\": {\n", " \"Task-1\": {\n", " \"description\": \"Conceptualize internal tool and present high-level ideas\",\n", " \"priority\": \"low\",\n", " \"assigned_to\": \"Shahid S\",\n", " \"current_status\": \"pending\"\n", " }\n", " },\n", " \"Jaspreet's Projects\": {\n", " \"Task-1\": {\n", " \"description\": \"Access Github repo and review Jaspreet's code\",\n", " \"priority\": \"medium\",\n", " \"assigned_to\": \"Shahid S\",\n", " \"current_status\": \"pending\"\n", " }\n", " },\n", " \"Washington Government Project\": {\n", " \"Task-1\": {\n", " \"description\": \"Submit the project for review to the Applore team.\",\n", " \"priority\": \"high\",\n", " \"assigned_to\": \"Shahid S\",\n", " \"current_status\": \"in progress\"\n", " },\n", " \"Task-2\": {\n", " \"description\": \"Ensure reminders are sent daily to follow up on the review.\",\n", " \"priority\": \"high\",\n", " \"assigned_to\": \"Shahid S\",\n", " \"current_status\": \"pending\"\n", " }\n", " },\n", " \"Marketing and Website Development\": {\n", " \"Task-1\": {\n", " \"description\": \"Create a high-level roadmap and plan milestones for the project.\",\n", " \"priority\": \"medium\",\n", " \"assigned_to\": \"Shahid S\",\n", " \"current_status\": \"in progress\"\n", " }\n", " },\n", " \"BFSI and AI Use Cases\": {\n", " \"Task-1\": {\n", " \"description\": \"Discuss BFSI use cases and explore edge cases in AI implementations\",\n", " \"priority\": \"medium\",\n", " \"assigned_to\": \"Shahid S\",\n", " \"current_status\": \"in progress\"\n", " }\n", " },\n", " \"S3 R3 Project\": {\n", " \"Task-1\": {\n", " \"description\": \"Discuss RTF S3 R3 alarm with the team once feedback is received.\",\n", " \"priority\": \"medium\",\n", " \"assigned_to\": \"Shahid S\",\n", " \"current_status\": \"pending\"\n", " }\n", " },\n", " \"Grant Engine and RAG Solution\": {\n", " \"Task-1\": {\n", " \"description\": \"Draft the use cases document for the Grant Engine and RAG Solution.\",\n", " \"priority\": \"medium\",\n", " \"assigned_to\": \"Shahid S\",\n", " \"current_status\": \"in progress\"\n", " }\n", " },\n", " \"Internal LLM Comparison Tool\": {\n", " \"Task-1\": {\n", " \"description\": \"Define metrics and process for comparing LLMs using business problem statements.\",\n", " \"priority\": \"medium\",\n", " \"assigned_to\": \"Shahid S\",\n", " \"current_status\": \"in progress\"\n", " }\n", " },\n", " \"Task Manager and Plugins\": {\n", " \"Task-1\": {\n", " \"description\": \"Develop a task manager prototype for better project management and communication.\",\n", " \"priority\": \"high\",\n", " \"assigned_to\": \"Shahid S\",\n", " \"current_status\": \"pending\"\n", " }\n", " }\n", "}\n", "```\n", "\n", "### Explanation\n", "1. **Project merging based on contextual similarity**:\n", " - \"Bonnie Plans\" and \"Bony Plants\" were treated as the same project based on spelling similarity.\n", "\n", "2. **Task merging and updating**:\n", " - Tasks were merged and updated based on field changes in descriptions, priority, assigned_to, and current_status.\n", "\n", "3. 
**Addition of new projects and tasks**:\n", " - New projects and tasks from the second JSON that were not present in the first JSON were added to the consolidated output.\n" ] } ], "source": [ "print(consolidated_json)" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [] } ], "metadata": { "kernelspec": { "display_name": "Python 3", "language": "python", "name": "python3" }, "language_info": { "codemirror_mode": { "name": "ipython", "version": 3 }, "file_extension": ".py", "mimetype": "text/x-python", "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", "version": "3.9.6" } }, "nbformat": 4, "nbformat_minor": 2 }