Insight · A delivery story

Three workflows. 20 tasks. Six departments. The one that broke our customer was the task no one owned.

I want to tell you a story. We make labs for enterprise customers. A while back I sat down to map what actually happens between a customer's first call and a working lab on their users' screens. Three workflows. 20 separate tasks. Six teams inside Nuvepro, plus the customer's IT team and the cloud providers on the other side. Nearly half of those tasks were not in anyone's job description. One of them broke a customer last quarter. Here is what happened.

By Giridhar LV · Founder & CEO, Nuvepro · Author of The Agentic Enterprise · 11 min read

You have probably heard this pitch

A services firm shows up with a deck. They want to automate one of your workflows. Twelve weeks. A bot. A dashboard.

The middle slide says "automate your hire-to-retire workflow" or "automate your AR workflow" or "automate your support workflow." The before-and-after picture looks clean. There is a price. There is a timeline.

A year later, the bot is doing the easy part. The senior person who used to absorb the messy cases is still absorbing them. The handoffs that were not in the diagram are still happening over Slack and email. The pilot is technically a success and nobody noticed.

When I look at why this keeps happening, the answer is always the same. The pitch is for the workflow. The work is in the tasks underneath.

Let me show you what I mean, using our own operation.

The three workflows that move every customer through Nuvepro

We sell task intelligence. I figured we should run it on ourselves first.

Nuvepro builds AI-ready labs for enterprise customers. Sandbox labs for hands-on practice. Learning Solutions labs for guided projects and assessments. The work moves through HubSpot tickets and Freshdesk tickets, six teams inside the company, and the customer's IT team on the other side.

There are three workflows that every customer passes through.

  • Lab Feasibility Review. Before any deal closes. Six tasks across Sales, Sandbox, and Learning Solutions.
  • Lab Delivery. After the deal closes, until the lab is live. Eight tasks across Sales, Sandbox, Learning Solutions, Support & Delivery, and the cloud providers.
  • Lab Support. Every day after the lab is live. Six tasks across Support, Sandbox, Learning Solutions, Engineering, and the customer's IT team.

Twenty tasks. Six teams inside Nuvepro plus the customer's IT team and the cloud providers. 9 of the 20 tasks are hidden, meaning they do not appear in any job description we wrote. Work stops if they do not happen.

Here is the actual list, in order, with a label for each one.

Workflow 1: Lab Feasibility Review

Before any deal closes. Six tasks. Three teams. Two of the tasks are hidden.

A customer asks Sales for a lab. From the outside it looks like one conversation. From the inside it is six tasks across three teams. One of those tasks is the most important decision in the whole flow, and it is not in any job description.

Workflow 1 · Lab Feasibility Review · Before any deal closes · 6 tasks · Sales · Sandbox · Learning Solutions

  • 01 · Raise a HubSpot ticket from the customer ask. Sales. Automate.
  • 02 · Decide what kind of lab this is, then send it to Sandbox or Learning Solutions. Sales. Augment, Hidden. This decision shapes the next two weeks of work. Nobody is hired to do it.
  • 03 · Run feasibility (scope, tech check, resourcing, cost, draft response). Sandbox or Learning Solutions. Augment. One word. Five different jobs underneath.
  • 04 · Open a second ticket back to Sandbox if the lab needs sandbox work. Learning Solutions. Augment, Hidden. One delivery team becomes the customer of the other halfway through. Nobody has a job title for this.
  • 05 · Send a tentative cost back to Sales. Sandbox or Learning Solutions. Automate.
  • 06 · Close the deal with the customer. Sales. Human-Only. If the customer pushes back, we redo feasibility. The deal does not look closed from the inside.

The decision at task 2 is the one nobody talks about. The salesperson decides whether this is a Sandbox lab or a Learning Solutions lab. That decision picks the team that will scope the next two weeks. We only discovered the step existed by tracing tickets backward.

Task 4 is the one I find funny. When Learning Solutions takes a lab that needs a sandbox underneath, Learning Solutions opens a fresh ticket back to Sandbox. The org chart says these are two parallel teams. The ticket trail says one of them is sometimes the customer of the other. There is no role name for that. Nobody is hired to be the one team that buys from the other team. The work happens anyway.

Workflow 2: Lab Delivery

After the deal closes, until the lab is live. Eight tasks. Five teams, including the cloud providers.

The HubSpot ticket flips to order closed and now the work moves through five teams. Five of the eight tasks are hidden. One of them is the task that broke the customer in the section after this.

Workflow 2 · Lab Delivery · After the deal closes, until the lab is live · 8 tasks · Sales · Sandbox · Learning Solutions · Support & Delivery · Hyperscalers

  • 07 · Re-route the ticket once it flips to order-closed. Sandbox or Learning Solutions. Automate, Hidden. Same routing decision Sales made in step 2, made again here. Nobody noticed it lives in two places.
  • 08 · Build the lab (provision, configure, test, debug). Sandbox or Learning Solutions. Augment. Four jobs again, one label.
  • 09 · Save a reusable template from the built lab. Sandbox or Learning Solutions. Augment, Hidden. Saves time on the next twenty labs. Nobody is hired to do this.
  • 10 · Hand off to Support & Delivery. Sandbox or Learning Solutions. Automate.
  • 11 · Find out how many users, what the dates are, how long the lab runs, what the budget per user is. Support & Delivery. Augment, Hidden. We chase these from sales notes, sometimes the customer directly. Not in any job description.
  • 12 · Set up a new team inside the Nuvepro platform for this lab. Support & Delivery. Automate, Hidden. Sits between support and engineering. Nobody owns it.
  • 13 · Check cloud quotas. Raise a ticket with AWS or Azure if quotas need to go up. Support & Delivery + Hyperscalers. Augment, Hidden. If the hyperscaler does not lift the quota in time, the lab cannot launch.
  • 14 · Send onboarding emails, or notify the customer if their SSO is set up. Support & Delivery. Automate. Two different jobs depending on the customer setup.

Task 9, the templating step, is interesting. After the lab is built, somebody saves it as a reusable template. That work is not for this customer. It saves time on the next twenty. Nobody is hired to do it. Either the engineer who built the lab decides to do it, or it does not happen, or it happens three weeks later when somebody else has to rebuild the same thing from scratch.

Task 11 is the hinge for the whole delivery. Support & Delivery has to find out how many users will be on the lab, when it starts, when it ends, and what the budget per user is. Some of this came in through the sales conversation. Some of it did not. They check the sales notes. They ask sales. They sometimes ask the customer. There is no job description for this step. If it is missed, the lab gets built for the wrong shape, and the wrong shape is what the customer feels later.
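What an augmenting agent for task 11 could look like, as a rough sketch. The field names, regex patterns, and note format here are assumptions for illustration, not our production pipeline; the point is the shape of the step: extract what you can from the sales notes, and report what is missing so a human knows exactly what to chase.

```python
import re

# The four rollout details task 11 has to pin down.
# Field names are illustrative, not Nuvepro's actual schema.
REQUIRED = ["user_count", "start_date", "end_date", "budget_per_user"]

PATTERNS = {
    "user_count": r"(\d+)\s*(?:users|learners|seats)",
    "start_date": r"starts?\s+(?:on\s+)?(\d{4}-\d{2}-\d{2})",
    "end_date": r"ends?\s+(?:on\s+)?(\d{4}-\d{2}-\d{2})",
    "budget_per_user": r"\$(\d+(?:\.\d+)?)\s*(?:per user|/user)",
}

def extract_rollout_details(sales_notes: str) -> tuple[dict, list[str]]:
    """Pull what we can from free-text notes; report what is missing."""
    found = {}
    for field_name, pattern in PATTERNS.items():
        match = re.search(pattern, sales_notes, re.IGNORECASE)
        if match:
            found[field_name] = match.group(1)
    missing = [f for f in REQUIRED if f not in found]
    return found, missing

notes = "Cohort of 250 learners, starts on 2025-03-01, ends on 2025-04-15."
details, missing = extract_rollout_details(notes)
print(details)   # user count and both dates found
print(missing)   # budget_per_user missing -> triggers a human follow-up
```

The missing list is the part that earns its keep: instead of a lab built for the wrong shape, the gap surfaces on day one as a follow-up question.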

Task 13 leaves the company. If the lab needs more cloud capacity than the default account allows, somebody on the Support side opens a ticket with AWS or Azure. If the cloud provider does not lift the quota in time, the lab cannot launch. That is an external dependency, but the workflow diagram has it as a sub-step of "configure."

Workflow 3: Lab Support

Every day the lab is live. Six tasks. Four channels. Five teams.

Customers do not file issues through one queue. They file through Freshdesk. They WhatsApp the program manager. They email the partner manager. They get on a call with their account person. Whatever channel the issue lands in, support has to spot it, sort it into one of four buckets, and pull in the right team.

Workflow 3 · Lab Support · Every day after the lab is live · 6 tasks · Support & Delivery · Sandbox · Learning Solutions · Engineering · Customer IT

  • 15 · Watch four channels at once: Freshdesk, WhatsApp, email, customer calls. Support & Delivery. Automate, Hidden. Nobody is hired to watch four channels at once.
  • 16 · Sort the issue into one of four buckets. Support & Delivery. Augment, Hidden. The third hidden routing step in this story.
  • 17 · Bucket 1: lab is not accessible. Get the customer's IT team to open a firewall. Support & Delivery + Customer IT. Augment. We cannot fix this. The customer's IT team has to.
  • 18 · Bucket 2: lab is not working. Send it back to Sandbox or Learning Solutions. Support & Delivery. Automate.
  • 19 · Bucket 3: user is stuck. Set up a call and walk them through it. Support & Delivery. Human-Only. Different from a ticket. You have to teach someone live.
  • 20 · Bucket 4: platform problem. Hand it to Engineering with a way to reproduce it. Support & Delivery + Engineering. Augment.

Notice the pattern. Workflow 1 had a hidden routing step at task 2. Workflow 2 had it again at task 7. Workflow 3 has it at task 16. Three workflows. Three hidden routing steps. None of them is in any job description. All three of them decide where the work goes next.

And the four buckets are not four versions of the same job. Bucket 1 is firewall coordination with a customer's IT team. Bucket 2 is internal routing back to the lab team that built it. Bucket 3 is teaching a person live on a video call. Bucket 4 is bug triage with our own engineering team. Four very different things hiding under one label called "issue resolution." A services firm that promises to "automate support" cannot do all four with one product.
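A sketch of what a first pass at the task-16 sorter could look like. The keyword lists are invented for illustration, and a real version would sit behind a human reviewer; the design point is the last line, where the agent routes to a person instead of guessing on a miss.

```python
# Keyword-first triage for task 16. Bucket names mirror the four
# buckets above; the keywords themselves are illustrative.
BUCKETS = {
    "not_accessible": ["cannot reach", "firewall", "connection refused", "vpn"],
    "not_working":    ["error", "broken", "crash", "not loading"],
    "user_stuck":     ["how do i", "stuck", "confused", "where is"],
    "platform_bug":   ["console", "dashboard", "billing page", "platform"],
}

def triage(ticket_text: str) -> str:
    """Return the first bucket whose keywords match, else escalate."""
    text = ticket_text.lower()
    for bucket, keywords in BUCKETS.items():
        if any(k in text for k in keywords):
            return bucket
    return "needs_human_review"   # the agent never guesses on a miss

print(triage("Users behind the corporate firewall cannot reach the lab"))
# -> not_accessible
```

Keyword matching is deliberately dumb here; swap in whatever classifier you like, but keep the explicit escalation path, because the bucket decision picks which of four very different jobs happens next.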

The phone call

Last quarter, this is what happened.

A customer rolled out a lab to a global cohort. They told us how many users. They told us when it had to be live. They told us the budget per user. They did not tell us that a third of the cohort was based in India and a third in Singapore.

We provisioned everything in the US default region.

A week in, my phone rings. Customer's program lead.

"What's happening? My users are saying the experience is suffering. Pages are slow. The labs feel frozen. We have a session tomorrow."

I open Freshdesk. India users are stacking tickets. "Lab not loading." "Pages take five seconds." "Cannot run the exercise." Support has triaged them as Bucket 1, lab not accessible. They are looping in the customer's IT team. Firewalls get checked. None of it is the problem.

The problem was not a firewall. The problem was that somewhere upstream in our flow, nobody had a task that read "ask the customer where their users actually live, and provision regions accordingly."

Sales did not ask. User-geography is not a Sales qualification field. Support & Delivery did not check. The rollout-details task at workflow 2, task 11, does not include geography. Build did not branch into multi-region. Nobody had flagged regional spread upstream. The customer only found out we got it wrong when their users started complaining about lag.

The fix was small. We added two new tasks: a sales qualification question about user geography, and a build-step branch that auto-suggests multi-region provisioning when users span more than one region. Both are deployable to AI today. A simple agent reads the qualification answer and suggests the region split.
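A sketch of that build-step branch, under stated assumptions: the region table, the 20 percent threshold, and the shape of the qualification answer are all illustrative, not our actual configuration.

```python
# Suggest a region split from the sales-qualification geography answer.
# Region names and the threshold are assumptions for illustration.
DEFAULT_REGION = "us-east-1"

REGION_FOR = {                 # illustrative mapping, not a full table
    "US": "us-east-1",
    "India": "ap-south-1",
    "Singapore": "ap-southeast-1",
    "EU": "eu-west-1",
}

def suggest_regions(geo_distribution: dict[str, float],
                    threshold: float = 0.2) -> list[str]:
    """geo_distribution: share of users per geography, e.g. {'US': 0.34}.
    Any geography above the threshold gets its own region; with no
    qualifying geography, fall back to the single default region."""
    regions = {REGION_FOR.get(geo, DEFAULT_REGION)
               for geo, share in geo_distribution.items()
               if share >= threshold}
    return sorted(regions) or [DEFAULT_REGION]

print(suggest_regions({"US": 0.34, "India": 0.33, "Singapore": 0.33}))
# -> ['ap-south-1', 'ap-southeast-1', 'us-east-1']
```

Run against the cohort in the story, the agent flags three regions instead of silently defaulting to one, which is exactly the task that did not exist before the phone call.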

The fix was not visible from the workflow diagram. The diagram says "configure lab." That is one box. The tasks underneath it are several. One of them did not exist until a customer call forced it into existence.

The shape of the work, summed up

20 tasks. Six teams. Nearly half of the tasks are not in any job description.

  • 8 · Automate · Ready for an AI agent today.
  • 10 · Augment · Person plus AI. The judgment work that is faster with a copilot.
  • 2 · Human-Only · Live teaching. Closing a deal. Stays human.
  • 9 · Hidden · Nobody is hired to do these. Work stops without them. (Hidden is a flag on top of the other three labels, which is why the counts add past 20.)

The number that surprised me when I drew this up was 9. Nearly half of the tasks that move a customer through Nuvepro are not in anyone's job description. They run on the senior people who absorb whatever the system cannot label. A services proposal pricing "automate the workflow" would not have priced for any of them.
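The tally is reproducible from a plain task ledger. Here is the full 20-task list from this article as data, with the labels and hidden flags exactly as tagged in the three workflow lists:

```python
from collections import Counter

# (task id, label, hidden) for all 20 tasks, as tagged in this article.
TASKS = [
    (1, "automate", False),   (2, "augment", True),    (3, "augment", False),
    (4, "augment", True),     (5, "automate", False),  (6, "human_only", False),
    (7, "automate", True),    (8, "augment", False),   (9, "augment", True),
    (10, "automate", False),  (11, "augment", True),   (12, "automate", True),
    (13, "augment", True),    (14, "automate", False), (15, "automate", True),
    (16, "augment", True),    (17, "augment", False),  (18, "automate", False),
    (19, "human_only", False),(20, "augment", False),
]

labels = Counter(label for _, label, _ in TASKS)
hidden = sum(1 for _, _, h in TASKS if h)
print(labels)  # Counter({'augment': 10, 'automate': 8, 'human_only': 2})
print(hidden)  # 9
```

This is the whole audit artifact in miniature: once every task carries a label and a hidden flag, the deployment plan and the surprising numbers both fall out of a ten-line script.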

So what does this mean for an AI deployment?

The decision is different for every task. That is the whole point.

Look at what a quick scan of those 20 tasks tells me.

Task 1, raise a HubSpot ticket from a customer ask, can be an agent. Task 2, decide what kind of lab this is, can be a suggestion from an AI plus a final human read. Task 6, close the deal with the customer, stays human, because that is a relationship. Task 19, teach a stuck user through a lab exercise on a video call, stays human, because that is teaching.

None of these decisions are visible at the workflow level. If somebody walks in and asks "can you automate the lab feasibility workflow?" the only honest answer is "parts of it, depending on which part." If they ask "can you automate task 2, the lab-type decision in Sales?" the answer is much sharper. Yes, with a human review on the close calls, here is what the agent looks like, here is when it ships.
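Here is roughly what "yes, with a human review on the close calls" means in code. The keyword scoring below is a stand-in for whatever model actually makes the suggestion; the part that matters is the review band, which sends low-margin decisions to a person instead of shipping a guess.

```python
# Sketch of a task-2 routing agent. Signal lists are illustrative.
SANDBOX_SIGNALS = ["sandbox", "practice", "own environment", "explore"]
LEARNING_SIGNALS = ["guided", "assessment", "curriculum", "project"]

def route_lab_request(ask: str, review_band: float = 0.25) -> tuple[str, float]:
    """Suggest a lab type with a confidence score.
    Close calls (confidence under review_band) go to a human."""
    text = ask.lower()
    sandbox_hits = sum(k in text for k in SANDBOX_SIGNALS)
    learning_hits = sum(k in text for k in LEARNING_SIGNALS)
    total = sandbox_hits + learning_hits
    if total == 0:
        return ("human_review", 0.0)          # no signal at all
    confidence = abs(sandbox_hits - learning_hits) / total
    if confidence < review_band:
        return ("human_review", confidence)   # close call: a person decides
    suggestion = "Sandbox" if sandbox_hits > learning_hits else "Learning Solutions"
    return (suggestion, confidence)

print(route_lab_request("We want a guided project with an assessment at the end"))
# -> ('Learning Solutions', 1.0)
```

The same shape covers task 7, where the identical routing decision is made a second time after the order closes: one agent, two call sites, instead of two teams each re-deriving the rule.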

That is also why the workflow-level pitch keeps stalling. The services firm has to deliver an answer for all six support tasks at once. Watching four channels. Sorting into four buckets. Coordinating with the customer's IT team. Sending broken labs back to the teams that built them. Teaching a stuck user live. Triaging a platform bug. Six different things, and only some of them are even bot-shaped. One pilot does not ship against all of them.

The way out is the smaller move. Pick one task. Decide automate, augment, or human. Ship the agent for the automate one. Train the team for the augment one. Protect the human one. Move to the next task.

What we do for customers

The same thing I just did with our own workflows. Three steps.

Step 1

Map the work

Pull tasks from every source that knows them. Job descriptions, SOPs, ticket trails, real job postings, workflow frameworks, and structured conversations with the people doing the work. The documented and the lived, side by side.

Step 2

Label every task

Each task gets one of three labels. Automate. Augment. Human-only. Each with a reason. Each with hours saved per week and an annual impact at the role and workflow level.

Step 3

Redesign and ship

Build the AI agents for the automate tasks. Train the team for the augment ones. Keep the human ones. Ship the redesigned role or workflow live. Measure.

Try this on one of your own workflows

Pick one workflow somebody has been telling you to automate. Run a free task audit on it. You will see automate, augment, and human-only side by side, with hours saved per week and an annual impact at the task level.

What CXOs usually ask me at this point

Questions from COOs, CHROs, and CFOs who have been pitched workflow automation more times than they can count.

Why should I trust an example from your own company?

Because the workflows are real. I am not pulling a generic example from a database. We run Nuvepro Delivery and Support every day. The lab feasibility flow, the post-close build flow, the support flow. These are the operations that decide whether our customers see value or churn. When I say workflow-level AI decisions are at the wrong level, the evidence is our own ticket trail in HubSpot and Freshdesk. Outside-in case studies are easy to dismiss. The inside-out version is harder to argue with.

How is this different from process mining or workflow automation?

Process mining describes what your systems already record. Workflow automation builds bots that execute a defined sequence. Both treat the workflow as the unit. This article argues the unit is the task. The same workflow contains automate-ready tasks, augment-friendly tasks, and human-only tasks side by side. You cannot deploy AI to a workflow. You can deploy AI to a task. The decision happens one level down.

Why do workflow-level pilots stall?

Because the proposal is priced at the workflow level but AI works at the task level. A quote for 'automate the support workflow' usually covers the documented tasks: ticket creation, routing, status updates. The hidden routing step at task 16 in this story, the four-channel watching at task 15, the customer-IT coordination at task 17, the live teaching call at task 19. These are different jobs. One bot cannot do all four. The pilot stalls when the hidden tasks resurface and nobody priced for them.

What actually went wrong in the customer story?

A customer rolled out a lab to users in multiple countries and did not tell us. We provisioned all the labs in the US default region. India users hit slow page loads, opened Freshdesk tickets reporting lag, and we triaged them as Bucket 1 (lab not accessible). The cause was not a firewall. It was a missing upstream task. Nobody had asked the customer where their users actually live, and nobody had decided to provision in multiple regions during build. That task does not appear in the Sales job description. It does not appear in the Support job description. It only existed in retrospect. Once we added it as an explicit task with an owner, the failure stopped happening.

What does working at the task level actually buy you?

It makes every AI-or-not decision concrete. Take task 11, the rollout-details task. At the workflow level you ask 'can we automate delivery?' and get hand-waving. At the task level you ask 'can AI pull user count, lab dates, duration, and budget per user out of sales notes and customer email?' That is a yes-or-no question. The answer is yes, with a fallback to a human follow-up message when fields are missing. That is something we can ship. It would not have been visible at the workflow level.

Do we have to throw away our workflow maps?

No. Keep the workflow maps. Add a task layer underneath them. Every step in your existing diagram is already pointing at one or more tasks. Decompose each step into the tasks it contains, classify each task as automate, augment, or human-only, and you have an AI deployment plan instead of an automation pitch. Nuvepro does this from job descriptions, SOPs, real job postings, APQC workflows, and conversations with the people doing the work. The workflow gives you scope. The tasks give you the decision.

How many tasks does a typical workflow hide?

More than the diagram suggests. The three workflows in this article contain 20 tasks across six teams. They are small workflows. Across the 7,000-plus workflows we have catalogued, the median sits between 30 and 50 distinct tasks once you decompose for AI deployment. The point is not the count. The point is that the count is a different number than the workflow-level diagram implies, and the AI-or-not decision is made at that count.

Why had nobody found these hidden tasks earlier?

Because that is what hidden tasks are. They surface only when you trace a ticket end to end, look at handoffs that have no owner, or watch a customer escalate something that does not match any documented label. We are using this article as the trigger to audit our own job descriptions and SOPs against the task list. The same exercise is what we run with customers. The first audit always surfaces tasks the JDs missed. The second surfaces fewer. By the third, the redesigned operating model is what you are looking at.