AI Actions, Variables in AI Scenarios, and Bulk AI Recognition – Nectain Release 52
At Nectain, we’re working intensively on our AI Center—the engine that powers intelligence across every workflow in our DMS. Our goal is to fuse the latest advances in AI with the flexible, builder-style configurability that has long been the hallmark of the Nectain platform.
Release #52 — August 28, 2025. This update introduces three major capabilities: AI Actions, multi-step Scenarios with variables, and Bulk File Recognition using LLMs.
These enhancements aren’t just incremental—they lay the foundation for a deeper, next-generation AI experience throughout Nectain.
AI Actions: powerful workflows in one click
Nectain can execute a wide range of complex, multi-step AI tasks. To make them feel native for users, we’ve packaged everything into a single button that triggers a tailored AI workflow for each document type.
How it works for users
- Open a document and you’ll see an AI Action button.
- Click it, and the AI performs the defined steps on that document—using its metadata automatically.
- No prompt engineering, no copy-pasting, no switching tabs or apps.
- Handle everyday multi-step document and data analysis tasks: Nectain instantly recognizes, extracts, validates, and routes information from invoices, patient forms, contracts, permits, and onboarding documents into the right workflows.
Under the hood (for admins)
- Create the AI Action by writing a scenario that defines what the AI should do (e.g., extract contract data → validate fields → summarize key terms).
- Add the button to the document type; clicking it will launch the AI Action.
- Map document metadata to the scenario so inputs/outputs are written directly to attributes or connected applications.
- Set availability rules to control which document types and states expose the action.
This gives admins full control: design the prompts, map attributes, and configure conditions to ensure accuracy, compliance, and scalability—while end users get a fast, one-click AI experience.
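To make the admin steps above concrete, here is a minimal sketch of what an AI Action definition and its availability check might look like. Every key, name, and rule here is an illustrative assumption for explanation; it is not Nectain's actual configuration schema or API.

```python
# Hypothetical AI Action definition: scenario steps, attribute mapping,
# and availability rules bundled behind one button. Illustrative only.
ai_action = {
    "name": "Summarize Contract",
    "document_type": "contract",
    "scenario": [  # ordered steps the AI performs on click
        {"step": "extract", "fields": ["party", "start_date", "total_value"]},
        {"step": "validate", "rule": "total_value > 0"},
        {"step": "summarize", "output_attribute": "key_terms_summary"},
    ],
    # availability rules: which document states expose the button
    "available_in_states": ["draft", "under_review"],
}

def action_visible(action: dict, doc_type: str, state: str) -> bool:
    """Return True if the one-click button should appear on this document."""
    return (
        action["document_type"] == doc_type
        and state in action["available_in_states"]
    )
```

With a definition like this, the end user only ever sees the button: `action_visible(ai_action, "contract", "draft")` decides whether it appears, and clicking it runs the scenario steps with the document's metadata already mapped in.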
Variables in AI Scenarios
What do we mean by complex AI workflows? They let users run non-trivial tasks that go far beyond a single prompt, and they can adapt dynamically to your business rules.

What it looks like to the user
You can open a document and click the AI Action. Nectain fetches the right data from documents and processes, runs multiple AI steps, and delivers one simple result (answer, summary, validation, or action).
A favourite case from one of our healthcare clients, a clinic:
“End of the month rolls around and you’re juggling policies in SharePoint, staffing logs in HRMS, and a pile of intake PDFs - copy-pasting while the clock screams. Now you can tap AI Action → Generate Compliance Report, and Nectain pulls everything together, runs checks, and produces an audit-ready summary with citations. Gaps are flagged, tasks are created with linked evidence, and you get a live run plus a full history for audit.”
Under the hood
An AI Scenario is an atomic, reusable prompt block (often multi-round) with parameters and variables. At Nectain, we treat an AI Scenario like a LEGO brick—something you create and configure in the process of building a BPMN workflow. Each scenario becomes part of the workflow structure, and together they deliver end-to-end automation.
In Nectain an AI scenario typically includes:
- Template prompts: system + user messages
- Variables: text/JSON/JSON Schema to parameterize steps
- File inputs: pass documents (including PDFs/AcroForms) to the model
- Tools/RAG: enrich queries with enterprise data and lookups
- Execution settings: model choice, max tokens, temperature/top-p
- Outputs & mapping: write extracted/validated data to document attributes
- Observability: live execution streaming and full request/response history
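The components listed above fit together roughly like this: template prompts plus variables plus execution settings become one concrete model request. This is a simplified sketch under assumed names, not Nectain's internal representation; the model name and settings are placeholders.

```python
# Illustrative anatomy of an AI Scenario: templated system/user prompts,
# variables to fill them, and execution settings. Names are assumptions.
from string import Template

scenario = {
    "system": Template("You extract data from $doc_kind documents."),
    "user": Template("Extract these fields as JSON: $fields"),
    "variables": {"doc_kind": "invoice", "fields": "vendor, total, due_date"},
    "execution": {"model": "gpt-4o", "max_tokens": 512, "temperature": 0.0},
}

def build_request(s: dict) -> dict:
    """Render template prompts with variables into a chat-style request."""
    v = s["variables"]
    return {
        **s["execution"],  # model choice, max tokens, temperature
        "messages": [
            {"role": "system", "content": s["system"].substitute(v)},
            {"role": "user", "content": s["user"].substitute(v)},
        ],
    }
```

Changing the variables re-parameterizes the whole scenario without touching the prompts, which is what makes a scenario reusable like a LEGO brick across workflows.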
The first version of AI Scenarios accepted predefined variables injected into system and user messages. Then we added support for AI Tools, enabling Retrieval-Augmented Generation (RAG) scenarios by enriching queries with enterprise data.
In the latest release, we introduced full variable support in AI processes, bringing a new level of flexibility to how scenarios are designed and executed:
- You can declare variables inside queries and define exactly where results should be stored.
- Variables can be reused in later steps of a multi-step scenario—even across long-running processes.
- Any step’s result can act as a variable, letting you chain logic without extra configuration.
- You can pass documents as file inputs to the model, making it easier than ever to query documents stored in the system.
- We’ve added support for PDFs that LLMs don’t handle out of the box, including PDFs with AcroForms.
Bulk Document Recognition with LLMs
One of the most common requests from our customers (across industries from healthcare to e-commerce) is to put batch intake on rails: no more file-by-file clicking when a pack of documents lands on your desk.
The big news: Nectain runs streamlined recognition in one pass—classifying, extracting, and validating—even for previously unseen document types with no prebuilt templates or training required.

What it looks like for users
Right out of the box, you can test high-volume document recognition—whether it’s hundreds of patient intake forms or supplier invoices for e-commerce. Nectain instantly classifies them into the correct document types, extracts the data, and applies validation—all without manual setup or training.
With this release, Nectain introduces two powerful modes of mass file recognition using LLMs.
Recognition without Preview
- Upload your files, and the system automatically creates documents as the result of recognition.
- Simply specify the expected document types; their attribute lists are passed to the LLM.
- The model analyses each file, determines its type, and creates a document with attributes already filled.
- A new validation log ensures accuracy (e.g., if an amount must not exceed 500, the system clears invalid values).
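The no-preview flow above can be sketched end to end: classify a file, fill its attributes, then apply validation rules that clear invalid values and record them in the validation log. Classification and extraction are stubbed here; in the product they are LLM-driven, and every name below is a hypothetical illustration.

```python
# Sketch of Recognition without Preview: classify, extract, validate.
# The 500 limit mirrors the example rule from the text above.
DOC_TYPES = {
    "invoice": ["vendor", "amount"],
    "intake_form": ["patient", "visit_date"],
}

def validate(doc: dict) -> dict:
    """Apply the example rule: an amount must not exceed 500."""
    if doc["type"] == "invoice" and (doc["attributes"].get("amount") or 0) > 500:
        doc["attributes"]["amount"] = None  # invalid value cleared
        doc["validation_log"].append("amount exceeded 500; value cleared")
    return doc

def recognize(file_data: dict) -> dict:
    """Stand-in for one file's LLM classification + attribute extraction."""
    doc = {
        "type": file_data["type"],        # model-determined in reality
        "attributes": file_data["fields"],
        "validation_log": [],
    }
    return validate(doc)
```

The key property is that a failed rule does not block intake: the document is still created, the suspect value is cleared, and the validation log tells a reviewer exactly what to check.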
Recognition with Preview
Nectain serves high-stakes domains—finance, healthcare, legal, and insurance—so we use a human-in-the-loop approach that blends AI with human oversight to maximize accuracy.
We continuously test multiple LLMs, but no model is perfect; human review resolves ambiguous cases, corrects misses, and enforces business rules, delivering reliable, audit-ready results across diverse document types.
That’s why we also offer Recognition with Preview: it runs the same pipeline, but files with a possible error are reviewed by a user before a document is created—ideal for added compliance checks or manual quality control.
That’s Release 52 in a nutshell—one-click AI Actions, variables-driven scenarios, and high-volume recognition to put document intake on rails.
If you want to see these features in action, reach out to our team for a quick demo and we’ll walk you through it with your real docs.
And we’re not stopping: right now we’re building new AI superpowers—like auto-creating document drafts from all the data you’ve already provided to Nectain.
Stay tuned!