AI-Powered Document QA System using Webhook, Pinecone + OpenAI + n8n

Advanced

This is an automation workflow in the Internal Wiki / AI RAG domain, containing 30 nodes. It mainly uses nodes such as Webhook, GoogleDrive, ManualTrigger, Agent, and RespondToWebhook. It implements a document question-and-answer chatbot built on OpenAI GPT, the Pinecone vector database, and Google Drive.

Prerequisites
  • HTTP Webhook endpoint (generated automatically by n8n)
  • Google Drive API credentials
  • OpenAI API key
  • Pinecone API key (an index setup sketch follows this list)
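
The Pinecone nodes in this workflow reference an index named package1536 (dense, 1536 dimensions to match text-embedding-3-small, region us-east-1, according to the sticky notes in the JSON below). The following is a minimal sketch, assuming the Pinecone v3+ Python SDK, for creating or verifying that index before importing the workflow; the API key is read from an environment variable.

# Sketch: create or verify the "package1536" index used by the workflow's Pinecone nodes.
# Assumes the Pinecone Python SDK (pip install pinecone) and PINECONE_API_KEY in the environment.
import os

from pinecone import Pinecone, ServerlessSpec

pc = Pinecone(api_key=os.environ["PINECONE_API_KEY"])
index_name = "package1536"  # index name referenced throughout the workflow JSON

if index_name not in pc.list_indexes().names():
    # Dense index sized for OpenAI text-embedding-3-small (1536 dimensions),
    # in us-east-1 as noted in the workflow's sticky notes.
    pc.create_index(
        name=index_name,
        dimension=1536,
        metric="cosine",
        spec=ServerlessSpec(cloud="aws", region="us-east-1"),
    )

print(pc.describe_index(index_name))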
Export the workflow
Copy the following JSON configuration into n8n to import and use this workflow
{
  "id": "UVMlpwIIsDBBFclU",
  "meta": {
    "instanceId": "92e36925b2d06addd7a010605535ce53ac105737436355f7e52e2980c726ed3d",
    "templateCredsSetupCompleted": true
  },
  "name": "AI-Powered Document QA System using Webhook, Pinecone + OpenAI + n8n",
  "tags": [
    {
      "id": "Bv4R1pgV3YCnUGME",
      "name": "webhook",
      "createdAt": "2025-07-04T05:26:19.837Z",
      "updatedAt": "2025-07-04T05:26:19.837Z"
    },
    {
      "id": "lTpSGA7vnSvUGQs6",
      "name": "lovable",
      "createdAt": "2025-07-04T05:26:29.453Z",
      "updatedAt": "2025-07-04T05:26:29.453Z"
    },
    {
      "id": "oKGIn6U0wpeHShTN",
      "name": "working flow",
      "createdAt": "2025-06-02T06:27:44.762Z",
      "updatedAt": "2025-06-02T06:27:44.762Z"
    }
  ],
  "nodes": [
    {
      "id": "784badb8-0cf6-434d-9d5d-1670757b548b",
      "name": "Lors du clic sur 'Exécuter le workflow'",
      "type": "n8n-nodes-base.manualTrigger",
      "position": [
        -300,
        -40
      ],
      "parameters": {},
      "typeVersion": 1
    },
    {
      "id": "26b93e8c-0a72-4491-90fe-55b5f5da02a0",
      "name": "Drive Google",
      "type": "n8n-nodes-base.googleDrive",
      "position": [
        -80,
        -40
      ],
      "parameters": {
        "filter": {
          "folderId": {
            "__rl": true,
            "mode": "list",
            "value": "1NgITWoqBgLAVof9bxF0jIrVToQ9c919u",
            "cachedResultUrl": "https://drive.google.com/drive/folders/1NgITWoqBgLAVof9bxF0jIrVToQ9c919u",
            "cachedResultName": "contract document"
          }
        },
        "options": {},
        "resource": "fileFolder"
      },
      "credentials": {
        "googleDriveOAuth2Api": {
          "id": "RFbg76pQ49AUClT1",
          "name": "name"
        }
      },
      "typeVersion": 3
    },
    {
      "id": "21174f84-5f7b-45bc-944b-0f0a7c2ffd49",
      "name": "Drive Google1",
      "type": "n8n-nodes-base.googleDrive",
      "position": [
        140,
        -40
      ],
      "parameters": {
        "fileId": {
          "__rl": true,
          "mode": "id",
          "value": "={{ $json.id }}"
        },
        "options": {},
        "operation": "download"
      },
      "credentials": {
        "googleDriveOAuth2Api": {
          "id": "RFbg76pQ49AUClT1",
          "name": "name"
        }
      },
      "typeVersion": 3
    },
    {
      "id": "d84e6051-cc04-4f51-b9c3-0e69e2193571",
      "name": "Magasin vectoriel Pinecone",
      "type": "@n8n/n8n-nodes-langchain.vectorStorePinecone",
      "position": [
        360,
        -40
      ],
      "parameters": {
        "mode": "insert",
        "options": {},
        "pineconeIndex": {
          "__rl": true,
          "mode": "list",
          "value": "package1536",
          "cachedResultName": "package1536"
        }
      },
      "credentials": {
        "pineconeApi": {
          "id": "id",
          "name": "PineconeApi account 2"
        }
      },
      "typeVersion": 1.3
    },
    {
      "id": "3185a781-28af-4ee0-be7b-2183b80ce0e3",
      "name": "Embeddings OpenAI",
      "type": "@n8n/n8n-nodes-langchain.embeddingsOpenAi",
      "position": [
        300,
        160
      ],
      "parameters": {
        "options": {}
      },
      "credentials": {
        "openAiApi": {
          "id": "id",
          "name": "OpenAi account 5"
        }
      },
      "typeVersion": 1.2
    },
    {
      "id": "8eccc3bb-654f-4a92-8074-9d2418afae12",
      "name": "Chargeur de données par défaut",
      "type": "@n8n/n8n-nodes-langchain.documentDefaultDataLoader",
      "position": [
        500,
        180
      ],
      "parameters": {
        "options": {},
        "dataType": "binary",
        "textSplittingMode": "custom"
      },
      "typeVersion": 1.1
    },
    {
      "id": "9a6a4542-81f0-4fa6-b0fa-6fbfcf5fb3d3",
      "name": "Séparateur de texte récursif",
      "type": "@n8n/n8n-nodes-langchain.textSplitterRecursiveCharacterTextSplitter",
      "position": [
        600,
        400
      ],
      "parameters": {
        "options": {},
        "chunkOverlap": 100
      },
      "typeVersion": 1
    },
    {
      "id": "60485603-13aa-46c8-9824-011b75d368bd",
      "name": "Note autocollante",
      "type": "n8n-nodes-base.stickyNote",
      "position": [
        -420,
        -180
      ],
      "parameters": {
        "width": 1300,
        "height": 980,
        "content": "## Document Loading \n1. Connect to Google Drive folder to access Contract Agreement Documents\n2. Download and Vectorize the Data using Vector Embedding \n3. Store in Pinecone Database"
      },
      "typeVersion": 1
    },
    {
      "id": "349466bc-c0c7-4e4e-9e9c-78554a3123ae",
      "name": "Note autocollante1",
      "type": "n8n-nodes-base.stickyNote",
      "position": [
        -420,
        940
      ],
      "parameters": {
        "width": 1300,
        "height": 720,
        "content": "## Query Document via Chat (for testing)"
      },
      "typeVersion": 1
    },
    {
      "id": "id",
      "name": "À la réception d'un message de chat",
      "type": "@n8n/n8n-nodes-langchain.chatTrigger",
      "position": [
        -100,
        980
      ],
      "webhookId": "id",
      "parameters": {
        "options": {}
      },
      "typeVersion": 1.1
    },
    {
      "id": "4240e62e-0b44-4dbd-9cff-87a404a496bd",
      "name": "Agent IA",
      "type": "@n8n/n8n-nodes-langchain.agent",
      "position": [
        120,
        980
      ],
      "parameters": {
        "options": {
          "systemMessage": "*Role*\nYou are a highly experienced contracting, commercial and legal adviser who thoroughly understands the contract related to shipping, clearing and forwarding agreements and advise and reply to chat queries looking into the pinecone vector database and respond accordingly. \n\n**Instructions**\nyou will receive chat query to which you have to reply back in chat\nyou will only look for information in the pinecone vector databse\nyou will not create your own reply if you don't get the answer from the database\n\nNote:\nbe polite and professional in your response\ncan use emojis where it is appropriate\n"
        }
      },
      "typeVersion": 2
    },
    {
      "id": "34d9e834-3aba-4c80-8c4d-4206fcdbfac3",
      "name": "Modèle de chat OpenAI",
      "type": "@n8n/n8n-nodes-langchain.lmChatOpenAi",
      "position": [
        80,
        1200
      ],
      "parameters": {
        "model": {
          "__rl": true,
          "mode": "list",
          "value": "gpt-4.1-mini"
        },
        "options": {}
      },
      "credentials": {
        "openAiApi": {
          "id": "id",
          "name": "OpenAi account 5"
        }
      },
      "typeVersion": 1.2
    },
    {
      "id": "784924f6-d197-4666-9a05-e36020021ae2",
      "name": "Mémoire simple",
      "type": "@n8n/n8n-nodes-langchain.memoryBufferWindow",
      "position": [
        200,
        1200
      ],
      "parameters": {},
      "typeVersion": 1.3
    },
    {
      "id": "00b70c8d-5940-4eef-84c4-b87d69df3ab9",
      "name": "Répondre aux questions avec un magasin vectoriel",
      "type": "@n8n/n8n-nodes-langchain.toolVectorStore",
      "position": [
        380,
        1200
      ],
      "parameters": {
        "description": "When ever there is a query from chat, use this pinecone vector database to analyse and construct the response. "
      },
      "typeVersion": 1.1
    },
    {
      "id": "dfefbee7-5125-42da-b696-f343dc89573c",
      "name": "Magasin vectoriel Pinecone1",
      "type": "@n8n/n8n-nodes-langchain.vectorStorePinecone",
      "position": [
        180,
        1360
      ],
      "parameters": {
        "options": {},
        "pineconeIndex": {
          "__rl": true,
          "mode": "list",
          "value": "package1536",
          "cachedResultName": "package1536"
        }
      },
      "credentials": {
        "pineconeApi": {
          "id": "id",
          "name": "PineconeApi account 2"
        }
      },
      "typeVersion": 1.3
    },
    {
      "id": "8a0e2476-661e-4702-8563-ec0b12033884",
      "name": "Embeddings OpenAI1",
      "type": "@n8n/n8n-nodes-langchain.embeddingsOpenAi",
      "position": [
        200,
        1500
      ],
      "parameters": {
        "options": {}
      },
      "credentials": {
        "openAiApi": {
          "id": "SCKN5KUziIpM8NB7",
          "name": "OpenAi account 5"
        }
      },
      "typeVersion": 1.2
    },
    {
      "id": "31a4456c-4a35-4beb-9c4b-de49e460e492",
      "name": "Modèle de chat OpenAI1",
      "type": "@n8n/n8n-nodes-langchain.lmChatOpenAi",
      "position": [
        520,
        1420
      ],
      "parameters": {
        "model": {
          "__rl": true,
          "mode": "list",
          "value": "gpt-4.1-mini"
        },
        "options": {}
      },
      "credentials": {
        "openAiApi": {
          "id": "SCKN5KUziIpM8NB7",
          "name": "OpenAi account 5"
        }
      },
      "typeVersion": 1.2
    },
    {
      "id": "7aa47a91-19f9-4a0e-b1b2-5867cf4982ef",
      "name": "Note autocollante2",
      "type": "n8n-nodes-base.stickyNote",
      "position": [
        1660,
        -160
      ],
      "parameters": {
        "width": 1200,
        "height": 980,
        "content": "## Query document from a user interface connectied via Webhook\n"
      },
      "typeVersion": 1
    },
    {
      "id": "c9da6a17-a0aa-4d3c-844a-1c3785a956eb",
      "name": "Webhook",
      "type": "n8n-nodes-base.webhook",
      "position": [
        1900,
        0
      ],
      "webhookId": "12b44ee5-c43e-430c-a1d4-4fc5ff5e45c4",
      "parameters": {
        "path": "12b44ee5-c43e-430c-a1d4-4fc5ff5e45c4",
        "options": {},
        "httpMethod": "POST",
        "responseMode": "responseNode"
      },
      "typeVersion": 2
    },
    {
      "id": "b1e8830f-8cfe-40ef-b611-76e70cd9184b",
      "name": "Agent IA1",
      "type": "@n8n/n8n-nodes-langchain.agent",
      "position": [
        2120,
        0
      ],
      "parameters": {
        "text": "=the query: {{ $json.body.query }}",
        "options": {
          "systemMessage": "*Role*\nYou are a highly experienced contracting, commercial and legal adviser who thoroughly understands the contract related to shipping, clearing and forwarding agreements and advise and reply to chat queries looking into the pinecone vector database and respond accordingly. \n\n**Instructions**\nyou will receive chat query to which you have to reply back in chat\nyou will only look for information in the pinecone vector databse\nyou will not create your own reply if you don't get the answer from the database\n\nNote:\nbe polite and professional in your response\ncan use emojis where it is appropriate\n"
        },
        "promptType": "define"
      },
      "typeVersion": 2
    },
    {
      "id": "87db20d4-7a7c-48a6-a29a-2fd089f93a43",
      "name": "Modèle de chat OpenAI2",
      "type": "@n8n/n8n-nodes-langchain.lmChatOpenAi",
      "position": [
        2020,
        220
      ],
      "parameters": {
        "model": {
          "__rl": true,
          "mode": "list",
          "value": "gpt-4.1-mini"
        },
        "options": {}
      },
      "credentials": {
        "openAiApi": {
          "id": "id",
          "name": "OpenAi account 5"
        }
      },
      "typeVersion": 1.2
    },
    {
      "id": "2454b5ff-e53e-41c5-9844-f171d63ee2d4",
      "name": "Mémoire simple1",
      "type": "@n8n/n8n-nodes-langchain.memoryBufferWindow",
      "disabled": true,
      "position": [
        2180,
        220
      ],
      "parameters": {},
      "typeVersion": 1.3
    },
    {
      "id": "e33b7eff-0166-43b2-ab7e-5f53063164a9",
      "name": "Répondre aux questions avec un magasin vectoriel1",
      "type": "@n8n/n8n-nodes-langchain.toolVectorStore",
      "position": [
        2380,
        220
      ],
      "parameters": {
        "description": "When ever there is a query from chat, use this pinecone vector database to analyse and construct the response. "
      },
      "typeVersion": 1.1
    },
    {
      "id": "e223bcf1-7085-433a-a51d-708b0c36a2e4",
      "name": "Magasin vectoriel Pinecone2",
      "type": "@n8n/n8n-nodes-langchain.vectorStorePinecone",
      "position": [
        2180,
        380
      ],
      "parameters": {
        "options": {},
        "pineconeIndex": {
          "__rl": true,
          "mode": "list",
          "value": "package1536",
          "cachedResultName": "package1536"
        }
      },
      "credentials": {
        "pineconeApi": {
          "id": "HqCFDvnsq0D6wXpJ",
          "name": "PineconeApi account 2"
        }
      },
      "typeVersion": 1.3
    },
    {
      "id": "9e3f06a1-900b-427e-8775-dad8ddc1de80",
      "name": "Embeddings OpenAI2",
      "type": "@n8n/n8n-nodes-langchain.embeddingsOpenAi",
      "position": [
        2200,
        520
      ],
      "parameters": {
        "options": {}
      },
      "credentials": {
        "openAiApi": {
          "id": "id",
          "name": "OpenAi account 5"
        }
      },
      "typeVersion": 1.2
    },
    {
      "id": "df06efec-1f75-4309-923b-044e1c1991f3",
      "name": "Modèle de chat OpenAI3",
      "type": "@n8n/n8n-nodes-langchain.lmChatOpenAi",
      "position": [
        2520,
        440
      ],
      "parameters": {
        "model": {
          "__rl": true,
          "mode": "list",
          "value": "gpt-4.1-mini"
        },
        "options": {}
      },
      "credentials": {
        "openAiApi": {
          "id": "id",
          "name": "OpenAi account 5"
        }
      },
      "typeVersion": 1.2
    },
    {
      "id": "01b59805-abdd-49ff-a553-0dddf3ed1450",
      "name": "Répondre à Webhook",
      "type": "n8n-nodes-base.respondToWebhook",
      "position": [
        2480,
        0
      ],
      "parameters": {
        "options": {
          "responseKey": "={{ $json.output }}"
        }
      },
      "typeVersion": 1.4
    },
    {
      "id": "05fd0853-0ebd-4a99-9345-982c9e664e27",
      "name": "Note autocollante3",
      "type": "n8n-nodes-base.stickyNote",
      "position": [
        -1000,
        -180
      ],
      "parameters": {
        "color": 4,
        "width": 560,
        "height": 980,
        "content": "This project demonstrates how to build a Retrieval-Augmented Generation (RAG) system using n8n, which:\n🧾 Downloads any pdf file format documents from Google Drive\n📚 Converts them into vector embeddings using OpenAI\n🔍 Stores and searches them in Pinecone Vector DB\n💬 Allows natural language querying of contracts using AI Agents\n\n## Document Loading & RAG Setup\nThis flow automates:\nReading documents from a Google Drive folder\nVectorizing using text-embedding-3-small\nUploading vectors into Pinecone for later semantic search\n\n### 🧱 Workflow Structure\nA [Manual Trigger] --> B[Google Drive Search]\nB --> C[Google Drive Download]\nC --> D[Pinecone Vector Store]\nD --> E[Default Data Loader]\nE --> F[Recursive Character Text Splitter]\nE --> G[OpenAI Embedding]\n\n### 🪜 Steps\nManual Trigger: Kickstarts the workflow on demand for loading new documents.\nGoogle Drive Search & Download\nNode: Google Drive (Search: file/folder), Credentials required to access google drive folders and files\nDownloads PDF documents from the google drive\n\n#### Recursive Text Splitter to Break long documents into overlapping chunks\nSettings:\nChunk Size: 1000\nChunk Overlap: 100\n\n#### OpenAI Embedding\nModel: text-embedding-3-small\nUsed for creating document vectors\n\n#### Pinecone Vector Store\nIndex: package1536\nBatch Size: 200\nSettings:\nType: Dense\nRegion: us-east-1\nMode: Insert Documents\n\n\n"
      },
      "typeVersion": 1
    },
    {
      "id": "7f1cc5b2-104e-4571-a838-29c71c79bd08",
      "name": "Note autocollante4",
      "type": "n8n-nodes-base.stickyNote",
      "position": [
        -1000,
        940
      ],
      "parameters": {
        "color": 4,
        "width": 560,
        "height": 720,
        "content": "## Quyerying the Documetn via Chat \nThis flow enables chat-style querying of stored documents using OpenAI-powered agents with vector memory.\n\n### 🧱 Workflow Diagram\n  A[Webhook (chat message)] --> B[AI Agent]\n  B --> C[OpenAI Chat Model]\n  B --> D[Simple Memory]\n  B --> E[Answer with Vector Store]\n  E --> F[Pinecone Vector Store]\n  F --> G[Embeddings OpenAI]\n### 🪜 Components\nChat Trigger\nAI Agent Node\n\nHandles query flow using:\nChat Model: OpenAI GPT\nMemory: Simple Memory\nTool: Question Answer with Vector Store\nPinecone Vector Store\nConnected via same embedding index as Flow 1 Embeddings\nEnsures document chunks are retrievable using vector similarity\nResponse Node\nReturns final AI response to user via chat response\n\n"
      },
      "typeVersion": 1
    },
    {
      "id": "e11b8fbd-c24b-469f-a196-1e507a6d3e75",
      "name": "Note autocollante5",
      "type": "n8n-nodes-base.stickyNote",
      "position": [
        1080,
        -160
      ],
      "parameters": {
        "color": 4,
        "width": 560,
        "height": 980,
        "content": "## 🌐 Flow 3: UI-Based Query with webhook connecting to Lovable\nThis flow uses a web UI built using Lovable to query contracts directly from a form interface.\n\n### 📥 Webhook Setup for Lovable\nWebhook Node\nMethod: POST\nURL: your webhook url\nResponse: Using 'Respond to Webhook' Node\n\n### 🧱 Workflow Logic\n  A[Webhook (Lovable Form)] --> B[AI Agent]\n  B --> C[OpenAI Chat Model]\n  B --> D[Simple Memory]\n  B --> E[Answer with Vector Store]\n  E --> F[Pinecone Vector Store]\n  F --> G[Embeddings OpenAI]\n  B --> H[Respond to Webhook]\n\n### 💡 Lovable UI\nUsers can submit:\nFull Name\nEmail\nDepartment\nFreeform Query\n\nData is sent via webhook to n8n and responded with the answer from contract content.\n\n### 🔍 Use Cases\nContract Querying for Legal/HR teams\nProcurement & Vendor Agreement QA\nCustomer Support Automation (based on terms)\nRAG Systems for private document knowledge\n\n⚙️ Tools & Tech Stack\nComponent\tTool Used\nAI Embedding\tOpenAI text-embedding-3-small\nVector DB\tPinecone\nChunking\tRecursive Text Splitter\nAI Agent\tOpenAI GPT Chat\nAutomation\tn8n\nUI Integration\tLovable (form-based)\n\n\n\n"
      },
      "typeVersion": 1
    }
  ],
  "active": false,
  "pinData": {},
  "settings": {
    "executionOrder": "v1"
  },
  "versionId": "460c7740-a2d1-41f7-92d5-fc9113152663",
  "connections": {
    "c9da6a17-a0aa-4d3c-844a-1c3785a956eb": {
      "main": [
        [
          {
            "node": "b1e8830f-8cfe-40ef-b611-76e70cd9184b",
            "type": "main",
            "index": 0
          }
        ]
      ]
    },
    "b1e8830f-8cfe-40ef-b611-76e70cd9184b": {
      "main": [
        [
          {
            "node": "01b59805-abdd-49ff-a553-0dddf3ed1450",
            "type": "main",
            "index": 0
          }
        ]
      ]
    },
    "26b93e8c-0a72-4491-90fe-55b5f5da02a0": {
      "main": [
        [
          {
            "node": "21174f84-5f7b-45bc-944b-0f0a7c2ffd49",
            "type": "main",
            "index": 0
          }
        ]
      ]
    },
    "21174f84-5f7b-45bc-944b-0f0a7c2ffd49": {
      "main": [
        [
          {
            "node": "d84e6051-cc04-4f51-b9c3-0e69e2193571",
            "type": "main",
            "index": 0
          }
        ]
      ]
    },
    "784924f6-d197-4666-9a05-e36020021ae2": {
      "ai_memory": [
        [
          {
            "node": "4240e62e-0b44-4dbd-9cff-87a404a496bd",
            "type": "ai_memory",
            "index": 0
          }
        ]
      ]
    },
    "2454b5ff-e53e-41c5-9844-f171d63ee2d4": {
      "ai_memory": [
        [
          {
            "node": "b1e8830f-8cfe-40ef-b611-76e70cd9184b",
            "type": "ai_memory",
            "index": 0
          }
        ]
      ]
    },
    "3185a781-28af-4ee0-be7b-2183b80ce0e3": {
      "ai_embedding": [
        [
          {
            "node": "d84e6051-cc04-4f51-b9c3-0e69e2193571",
            "type": "ai_embedding",
            "index": 0
          }
        ]
      ]
    },
    "34d9e834-3aba-4c80-8c4d-4206fcdbfac3": {
      "ai_languageModel": [
        [
          {
            "node": "4240e62e-0b44-4dbd-9cff-87a404a496bd",
            "type": "ai_languageModel",
            "index": 0
          }
        ]
      ]
    },
    "8a0e2476-661e-4702-8563-ec0b12033884": {
      "ai_embedding": [
        [
          {
            "node": "dfefbee7-5125-42da-b696-f343dc89573c",
            "type": "ai_embedding",
            "index": 0
          }
        ]
      ]
    },
    "9e3f06a1-900b-427e-8775-dad8ddc1de80": {
      "ai_embedding": [
        [
          {
            "node": "e223bcf1-7085-433a-a51d-708b0c36a2e4",
            "type": "ai_embedding",
            "index": 0
          }
        ]
      ]
    },
    "31a4456c-4a35-4beb-9c4b-de49e460e492": {
      "ai_languageModel": [
        [
          {
            "node": "00b70c8d-5940-4eef-84c4-b87d69df3ab9",
            "type": "ai_languageModel",
            "index": 0
          }
        ]
      ]
    },
    "87db20d4-7a7c-48a6-a29a-2fd089f93a43": {
      "ai_languageModel": [
        [
          {
            "node": "b1e8830f-8cfe-40ef-b611-76e70cd9184b",
            "type": "ai_languageModel",
            "index": 0
          }
        ]
      ]
    },
    "df06efec-1f75-4309-923b-044e1c1991f3": {
      "ai_languageModel": [
        [
          {
            "node": "e33b7eff-0166-43b2-ab7e-5f53063164a9",
            "type": "ai_languageModel",
            "index": 0
          }
        ]
      ]
    },
    "8eccc3bb-654f-4a92-8074-9d2418afae12": {
      "ai_document": [
        [
          {
            "node": "d84e6051-cc04-4f51-b9c3-0e69e2193571",
            "type": "ai_document",
            "index": 0
          }
        ]
      ]
    },
    "dfefbee7-5125-42da-b696-f343dc89573c": {
      "ai_vectorStore": [
        [
          {
            "node": "00b70c8d-5940-4eef-84c4-b87d69df3ab9",
            "type": "ai_vectorStore",
            "index": 0
          }
        ]
      ]
    },
    "e223bcf1-7085-433a-a51d-708b0c36a2e4": {
      "ai_vectorStore": [
        [
          {
            "node": "e33b7eff-0166-43b2-ab7e-5f53063164a9",
            "type": "ai_vectorStore",
            "index": 0
          }
        ]
      ]
    },
    "id": {
      "main": [
        [
          {
            "node": "4240e62e-0b44-4dbd-9cff-87a404a496bd",
            "type": "main",
            "index": 0
          }
        ]
      ]
    },
    "9a6a4542-81f0-4fa6-b0fa-6fbfcf5fb3d3": {
      "ai_textSplitter": [
        [
          {
            "node": "8eccc3bb-654f-4a92-8074-9d2418afae12",
            "type": "ai_textSplitter",
            "index": 0
          }
        ]
      ]
    },
    "00b70c8d-5940-4eef-84c4-b87d69df3ab9": {
      "ai_tool": [
        [
          {
            "node": "4240e62e-0b44-4dbd-9cff-87a404a496bd",
            "type": "ai_tool",
            "index": 0
          }
        ]
      ]
    },
    "784badb8-0cf6-434d-9d5d-1670757b548b": {
      "main": [
        [
          {
            "node": "26b93e8c-0a72-4491-90fe-55b5f5da02a0",
            "type": "main",
            "index": 0
          }
        ]
      ]
    },
    "e33b7eff-0166-43b2-ab7e-5f53063164a9": {
      "ai_tool": [
        [
          {
            "node": "b1e8830f-8cfe-40ef-b611-76e70cd9184b",
            "type": "ai_tool",
            "index": 0
          }
        ]
      ]
    }
  }
}
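
The document-loading flow above (Google Drive download, recursive splitting with chunk size 1000 and overlap 100, text-embedding-3-small embeddings, insert into the package1536 index) can also be reproduced outside n8n to sanity-check the index. The sketch below is an approximation under those assumptions, using the OpenAI and Pinecone Python SDKs; contract.txt is a hypothetical local text export of a contract document, and chunk_text is a simplified stand-in for n8n's recursive character splitter.

# Sketch: index a document the way the loading flow does (chunk 1000 / overlap 100,
# text-embedding-3-small embeddings, upsert into the Pinecone "package1536" index).
import os

from openai import OpenAI
from pinecone import Pinecone


def chunk_text(text: str, size: int = 1000, overlap: int = 100) -> list[str]:
    """Naive stand-in for the workflow's Recursive Character Text Splitter."""
    chunks, start = [], 0
    while start < len(text):
        chunks.append(text[start:start + size])
        start += size - overlap
    return chunks


openai_client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])
index = Pinecone(api_key=os.environ["PINECONE_API_KEY"]).Index("package1536")

# Hypothetical local copy of a contract document already exported as plain text.
with open("contract.txt", encoding="utf-8") as f:
    chunks = chunk_text(f.read())

# Embed every chunk with the same model the workflow uses.
response = openai_client.embeddings.create(model="text-embedding-3-small", input=chunks)

# Upsert chunk vectors, keeping the raw text as metadata so the agent can retrieve it.
index.upsert(
    vectors=[
        {"id": f"contract-{i}", "values": item.embedding, "metadata": {"text": chunk}}
        for i, (item, chunk) in enumerate(zip(response.data, chunks))
    ]
)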
Frequently asked questions

How do I use this workflow?

Copy the JSON configuration code above, create a new workflow in your n8n instance, select "Import from JSON", paste the configuration, and adjust the credential settings as needed. Once imported and activated, the workflow's webhook endpoint can be called over HTTP, as sketched below.
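
The Lovable-facing Webhook node accepts a POST request whose JSON body carries the question in a query field (the AI Agent reads {{ $json.body.query }}), and the Respond to Webhook node returns the agent's answer. A minimal request sketch, assuming the Python requests library and a placeholder n8n host, is shown here.

# Sketch: query the imported workflow over HTTP. The host below is a placeholder;
# use the production URL shown on your own Webhook node after activating the workflow.
import requests

WEBHOOK_URL = "https://your-n8n-instance.example.com/webhook/12b44ee5-c43e-430c-a1d4-4fc5ff5e45c4"

resp = requests.post(
    WEBHOOK_URL,
    json={"query": "What is the notice period in the clearing and forwarding agreement?"},
    timeout=60,
)
resp.raise_for_status()
print(resp.json())  # response body produced by the Respond to Webhook node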

In which scenarios is this workflow suitable?

Advanced - Internal Wiki, AI RAG

Is it paid?

This workflow is entirely free and can be used directly. Please note that third-party services used in the workflow (such as the OpenAI API) may require payment on your part.

Workflow information
Difficulty level
Advanced
Number of nodes: 30
Categories: 2
Node types: 14
Difficulty description

Suitable for advanced users; these are complex workflows containing 16+ nodes

Author
Mohan Gopal

@mohan

B2B and B2C travel app consultant. Building AI agents for travel solutions.

External links
View on n8n.io
