The £16 Million Tech Decision Nobody Is Tracking

Every enterprise has a system of record for finance. For customers. For code. For risk. 

None of them has one for the decisions that determine which technology enters their estate. 

The single most consequential act in enterprise technology – choosing what to build on, choosing what to trust with production data – happens inside spreadsheets, email threads, and steering committees that meet once a month. The output is a PowerPoint. The audit trail is a memory. 

This is the Decision Latency problem. And it costs more than most CIOs realise. 

What is Decision Latency?

Here’s what we see, repeatedly, in the programmes running on our platform. A business unit identifies a need. AI, usually. They find vendors. Plenty of them. Discovery takes a week, maybe two. Then the clock starts. 

Getting access to a vendor's technology in a secure, compliant environment: 6 – 12 weeks. Procurement review: 4 – 8 weeks. Security and architecture assessment: 3 – 6 weeks. 

And that's before anyone writes a line of evaluation code. The entire process – from "we think this vendor could work" to "we have evidence it does" – averages 7.3 months across the programmes we see. 

Seven months. To make a single tech decision. 

And you can multiply that. Most CIOs estimate their organisation runs 15 to 20 evaluations a year. The real number, once you count departmental initiatives, AI experiments, and proof-of-concept requests that never surface formally, is closer to 200 – 300. 

To understand Decision Latency, it’s useful to think of it not as one problem, but four, stacked on top of each other: 

Access Latency 

The time between identifying a vendor and being able to work with their technology in a secure environment. Enterprises can find vendors easily. But they can’t get to them. Infrastructure requests, data provisioning, compliance checks, environment setup – every one of these is a queue, and every queue adds weeks. 

Evaluation Latency 

The time between getting access and producing evidence. Most evaluations are opinion-driven, not evidence-driven. A senior architect spends three days with a vendor’s documentation and makes a recommendation. But genuine evaluation requires running real workloads against real data in a controlled environment. And most enterprises don’t have the infrastructure to do this. Production environments are locked down for good reason. Sandbox environments exist, but they’re often too generic – stripped of the data characteristics, the integration patterns, the edge cases that determine whether technology works in your context.  

Governance Latency 

Security reviews. Architecture boards. Procurement gates. Risk assessments. Each is perfectly legitimate. Each operates in sequence, with its own template. But the governance process was designed for a world where enterprises made a handful of major technology decisions per year. That world no longer exists. 

Commitment Latency 

The final mile. Where approved decisions sit in a queue behind other approved decisions, waiting for engineering capacity. This is where good evaluations die. By the time the budget appears, the business need may have morphed. When engineering capacity opens up, the team might be firefighting production issues. 

The vendor you evaluated six months ago has been leapfrogged by a competitor. Or the technology you approved still works, but the urgency that drove the evaluation has evaporated, and nobody remembers why this mattered. 

Stack all four, and you get the 7.3-month average. Stack them across 200 evaluations, and you get an enterprise that is structurally incapable of moving at the speed its strategy requires. 

Why Nobody Fixes It

The Decision Latency problem persists because nobody owns it. The CIO owns the technology strategy. Procurement owns the commercial process. Security owns the risk assessment. Architecture owns the standards. Engineering owns the delivery. Each function optimises its own stage. But nobody optimises the pipeline. 

This is a systems problem, not a people problem. The individuals running each stage are competent. The process connecting those stages is not. Compare it to any other enterprise function. Finance has ERP. Sales has CRM. Engineering has CI/CD. HR has HCM. 

Each of these exists because someone recognised that a critical enterprise function was being run on spreadsheets and emails and built infrastructure to fix it. Each got built because the pain was acute and measurable – revenue leakage in sales, compliance risk in finance, deployment failures in engineering. The ROI was clear. The budget case was self-explanatory. 

Technology evaluation never got that infrastructure because the cost was invisible. Spread across departments. Buried in existing headcount. Attributed to “due diligence” rather than structural inefficiency. Nobody could point to a line item and say: this is what Decision Latency costs us. So nobody built the system to fix it. 

The Numbers That Change Conversations

£180,000 to £220,000 per evaluation. 

That’s the fully loaded cost. Let me show you how it breaks down. 

Engineering time: 4 – 6 weeks of senior engineering and architecture effort at £1,200 – £1,500 per day. That's £24,000 – £45,000 right off the bat. 

Security review cycles: another 2 – 3 weeks across InfoSec, risk, and compliance teams – add £12,000 – £18,000.  

Procurement overhead: vendor negotiations, contract review, legal sign-off – £8,000 – £12,000. 

Infrastructure provisioning: standing up environments, data pipelines, access controls – £15,000 – £25,000 in cloud costs and DevOps time. 

Then there’s the invisible cost: the opportunity cost of delayed deployment. If the technology you’re evaluating could save £500,000 annually in operational costs, every month of delay costs £41,000. A 7-month evaluation cycle means you’ve spent £287,000 in foregone value before you’ve deployed anything. 

Most of this cost is invisible because it's spread across teams and buried in existing headcount. But it's real. Now apply it: if an enterprise runs 200 evaluations a year and 60% of them never result in a deployment, it is spending north of £16 million annually evaluating technology it will never use. 
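The arithmetic behind these figures is simple enough to sketch. The snippet below is a back-of-the-envelope model using midpoints of the ranges quoted above; the figures are the article's illustrative estimates, not measured data, and the variable names are our own.

```python
# Back-of-the-envelope model of the Decision Latency cost figures above.
# All inputs are the article's illustrative estimates (GBP), using the
# midpoints of the quoted ranges.

WORK_DAYS_PER_WEEK = 5

# Direct, per-evaluation costs
engineering = 5 * WORK_DAYS_PER_WEEK * 1_350  # 4-6 weeks at £1,200-£1,500/day
security    = 15_000                          # 2-3 weeks of InfoSec/risk/compliance
procurement = 10_000                          # negotiations, contracts, legal
infra       = 20_000                          # environments, pipelines, access controls

direct_cost = engineering + security + procurement + infra

# Opportunity cost of delay: £500k/year of deferred savings over a 7-month cycle
monthly_foregone = 500_000 / 12
delay_cost = 7 * monthly_foregone

# Annual waste: 200 evaluations a year, 60% never deployed,
# at a fully loaded cost of roughly £200,000 each
wasted_evaluations = 200 * 0.60
annual_waste = wasted_evaluations * 200_000

print(f"Direct cost per evaluation:        £{direct_cost:,.0f}")
print(f"Foregone value over 7 months:      £{delay_cost:,.0f}")
print(f"Annual spend on undeployed tech:   £{annual_waste:,.0f}")
```

Even with these midpoint inputs, the annual-waste figure lands well above the £16 million in the headline, which is why that number is a conservative floor rather than a ceiling.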

It’s time to recognise the pattern for what it is: an infrastructure gap. 

What Changes When You Fix It?

The organisations that move fastest on AI – and on whatever comes after it – won't be the ones with the best models or the biggest budgets. They'll be the ones who have built infrastructure for making technology decisions at speed, with evidence, under governance. 

That infrastructure doesn’t exist in most enterprises today. The process exists. The committees exist. The templates exist. But the infrastructure doesn’t. 

Fixing it means collapsing the four latency layers into a single platform: 

  • Instant access to vendor technology in a secure environment – this eliminates Access Latency entirely. No more 6 – 12 week waits for infrastructure provisioning. The environment is already there, production-like, compliant, and ready. 

  • Evidence-based evaluation against real workloads – this collapses Evaluation Latency from weeks to days. Run your actual data through their actual technology. Generate evidence, not opinion. 
  • Governance embedded into the workflow, not appended after it – this removes the sequential waiting that creates Governance Latency. Security reviews, architecture assessments, and procurement gates happen in parallel, inside the same system, with shared visibility. 
  • A system of record that captures every decision, every piece of evidence, every outcome – this attacks Commitment Latency by making decisions visible and persistent. When engineering capacity opens up, the evaluation is still there, still relevant, still ready. And the next evaluation starts smarter than the last. 

This is what we built NayaOne to do. Every enterprise evaluates technology. The question is whether they do it with infrastructure or with email. The ones still using email just don't know what it's costing them yet. 

Find out how NayaOne reduces a 7-month evaluation cycle to a few days – with evidence, not opinion. Book a guided walkthrough.
