Summarization Chain
Condense large documents, chat logs, or articles into short, readable outputs while maintaining key context and intent.
The Summarization Chain component transforms long pieces of text into concise, meaningful summaries using customizable AI processing.

To configure this component, start by selecting an AI provider, for example, ChatGPT or Gemini.

Once you connect your account for the chosen provider, you can select the specific model you want to use for the summarization process.

Next, define your chunking strategy.
Specify the characters per chunk, such as 1000, which sets the size of each text segment, and the chunk overlap characters, for example, 200, which carries context across adjacent chunks.
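
To make these two settings concrete, here is a minimal sketch of character-based splitting with overlap. The function name `chunk_text` and its defaults are illustrative only and are not the component's internal implementation.

```python
def chunk_text(text: str, chunk_size: int = 1000, overlap: int = 200) -> list[str]:
    """Split text into fixed-size character chunks whose edges overlap."""
    if overlap >= chunk_size:
        raise ValueError("overlap must be smaller than chunk_size")
    step = chunk_size - overlap  # how far the window advances for each new chunk
    return [text[start:start + chunk_size] for start in range(0, len(text), step)]

# With chunk_size=1000 and overlap=200, a 2,500-character input produces
# chunks starting at characters 0, 800, 1600, and 2400.
```

The overlap means a sentence that straddles a chunk boundary still appears in full in at least one chunk, which helps the model keep its summary coherent.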

After setting these options, paste your input text in the provided field. Once done, click Save in the top-right corner to add the Summarization Chain step to your flow.

You can test this step by clicking the Test button in the top-right corner.
The summarized output will appear in the console, showing how your text was condensed by the selected model.

The summarized output can then be used as input for other steps in your flow, or you can connect it with other components to further process or store the summarized data.
You can also use this step to summarize text received dynamically from other sources, such as connectors, variables, or AI actions, before passing it to the next stage in your automation.
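
For readers who want to see what the step does conceptually, the sketch below applies a simple map-reduce pattern: summarize each chunk, then summarize the combined chunk summaries. It reuses the `chunk_text` helper sketched above and calls the OpenAI Python SDK with an illustrative model name as one possible provider; the component performs these calls for you, so this is only an explanatory sketch, not the platform's implementation.

```python
from openai import OpenAI  # one example provider; any connected model follows the same pattern

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

def summarize(text: str, instruction: str = "Summarize the following text concisely:") -> str:
    # One model call per piece of text; the model name below is illustrative.
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": f"{instruction}\n\n{text}"}],
    )
    return response.choices[0].message.content

def summarize_long_text(text: str) -> str:
    # Map step: summarize each chunk independently (chunk_text is sketched above).
    chunk_summaries = [summarize(chunk) for chunk in chunk_text(text)]
    # Reduce step: condense the per-chunk summaries into a single final summary.
    return summarize(
        "\n\n".join(chunk_summaries),
        instruction="Combine these partial summaries into one short summary:",
    )
```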