LLM Input node
The LLM Input node passes documents, snippets, and uploaded data to other nodes within an LLM flow. Information entered through this node is stored in variables, and downstream nodes execute skills according to the specified actions.
Supported types
The LLM Input node supports the following three input types:
Document
You can load a pre-registered list of documents and use them as input values.
Selected documents are stored in variables and used by downstream nodes.
Snippet
You can input a specific block of text or part of structured data.
E.g.: customer reviews, conversation history, etc.
Upload
Users can upload files directly and use them as input sources.
Supported file types and size limits follow system policy.
🔄 How it works
The user selects one of Document, Snippet, or Upload in the Input node.
The selected data is saved to a variable.
That variable is passed to the nodes that follow and processed according to the specified action. E.g.: executing skills such as summarization, translation, or analysis
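The three steps above can be sketched in pseudocode. This is a minimal illustration only: the product exposes this behavior through its visual flow builder, and all function and variable names here (`llm_input_node`, `llm_execution_node`, `user_input`) are hypothetical, not part of any product API.

```python
def llm_input_node(input_type, payload, variables):
    """Steps 1-2: accept one of the three input types and save it to a variable."""
    assert input_type in ("document", "snippet", "upload")
    variables["user_input"] = {"type": input_type, "data": payload}
    return variables

def llm_execution_node(variables, action):
    """Step 3: a downstream node reads the variable and applies the specified action."""
    data = variables["user_input"]["data"]
    # e.g. action could be "summarize", "translate", or "analyze"
    return f"{action}({data})"

vars_ = llm_input_node("snippet", "customer review text", {})
result = llm_execution_node(vars_, "summarize")
```

The key design point is the variable store: the Input node never processes data itself; it only names the data so later nodes can act on it.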
✅ Use cases
A flow that summarizes contracts uploaded by customers
Creating meeting minutes based on selected snippet data
Loading a specific document list to automate classification tasks

1. Document Input
You can select documents uploaded to the knowledge base to specify the search scope. If no separate documents are selected, all uploaded documents are included in the search target.
Specify document scope: You can restrict the search scope by selecting specific documents.
Specify scope by hashtag: You can limit the search scope by setting hashtags and specific variables; if not set, the search targets all documents.
Allow user search ON/OFF:
ON: Users can search documents themselves, and you can enter the message to display when prompting for a query.
OFF: Users cannot search; the system automatically searches the designated documents.
Select document search method:
Search by document content
Search by document title
Both methods can be selected simultaneously
Variable storage: You can save the selected document list to a specific variable. Variables can be created in the project settings menu or selected from a dropdown menu.
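The scope rules above (specific documents, hashtags, or the full knowledge base when neither is set) can be sketched as a simple filter. This is an illustrative assumption about the behavior described, not the product's actual implementation; the `resolve_search_scope` function and the document fields are hypothetical.

```python
def resolve_search_scope(all_documents, selected_ids=None, hashtag=None):
    """Return the documents to search: restricted by explicit selection
    or by hashtag, or the full knowledge base when neither is set."""
    docs = all_documents
    if selected_ids:
        docs = [d for d in docs if d["id"] in selected_ids]
    if hashtag:
        docs = [d for d in docs if hashtag in d.get("hashtags", [])]
    return docs

library = [
    {"id": 1, "title": "Contract A", "hashtags": ["legal"]},
    {"id": 2, "title": "Meeting notes", "hashtags": ["internal"]},
]
full_scope = resolve_search_scope(library)            # no filter: all documents
legal_only = resolve_search_scope(library, hashtag="legal")
```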

2. Snippet Input
Snippets are short pieces of data retrieved from third-party services such as messengers, email, and calendars. For example, you can integrate with Google Calendar to search and utilize schedule information.
User query prompt: You can set a message that prompts the user to enter what to search for.
Specify search source:
You can search data from various third-party services such as Google Calendar, Slack, Gmail, etc.
Depending on the service, there may be data size and page limits.
Variable storage: Searched snippet data can be saved to a specific variable for use.
E.g.: an event searched from Google Calendar
After saving it to a variable, specify a particular action for the model in the LLM execution node.
Finally, let's look at the upload input type.

3. Upload Input
Users can upload documents directly and use them. When Upload is selected, the user input field is disabled and configured to allow only file uploads.
File upload prompt: You can enter a message requesting the user to upload a file.
Supported file types and sizes:
Up to 100MB
Supported file formats: .txt, .docx, .csv, .xls, .xlsx, .xlsm, .jpg, .jpeg, .png, .hwp, .hwpx, .pdf, .ppt, .pptx, .doc
File upload handling:
An error message is displayed if the file format is not supported or the size exceeds the limit.
When a file upload completes successfully, the chat window displays the message "Document has been uploaded."
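The upload policy above can be expressed as a short validation check. The limits and messages come from this document; the function name `validate_upload` and the exact error strings are illustrative assumptions, not the product's API.

```python
import os

# Formats and limit as stated in the docs above.
ALLOWED_EXTENSIONS = {
    ".txt", ".docx", ".csv", ".xls", ".xlsx", ".xlsm", ".jpg", ".jpeg",
    ".png", ".hwp", ".hwpx", ".pdf", ".ppt", ".pptx", ".doc",
}
MAX_SIZE_BYTES = 100 * 1024 * 1024  # 100MB

def validate_upload(filename, size_bytes):
    """Hypothetical sketch: reject unsupported formats or oversized files,
    otherwise return the success message shown in the chat window."""
    ext = os.path.splitext(filename)[1].lower()
    if ext not in ALLOWED_EXTENSIONS:
        return "Error: unsupported file format"
    if size_bytes > MAX_SIZE_BYTES:
        return "Error: file exceeds the size limit"
    return "Document has been uploaded."
```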
Variable storage: Uploaded files can be designated as variables and used dynamically in LLM prompts.
The LLM Input node is the starting point that makes it easy to compose and connect the data needed within a flow.
Data structuring and utilization: You can structure various forms of input data to compose flexible LLM flows.
Support for precise actions: Accurate actions can be performed based on the input data.
The LLM Input node is a key element that determines the performance and response quality of the entire flow. By entering and configuring data accurately, you can maximize the LLM's efficiency and the usefulness of its actions.