14 October 2025

Why 50% of Tech Projects Fail & How to Fix It

Closing the Tech ROI Gap: Turning AI and Automation into Measurable Results for Australian Finance Leaders

Episode two of Finance Focus brings host Brendan Ritchie together with First Focus CEO Ross Sardi and Jared Morris, founder and CEO of Balden, to tackle a stubborn challenge: the gap between technology investment and real financial outcomes. With the vast majority of business initiatives now technology-led, Australian CFOs and finance leaders must clearly define how AI and automation translate into their P&L and balance sheet. This article distils the discussion into a practical, finance-grade approach that maps every tech initiative to adoption, leading indicators and, ultimately, ROI within a 12–18 month horizon.

Key takeaways

  • Define ROI up front: name the P&L or balance sheet line each initiative will move and set explicit targets before any spend.
  • Use a 12–18 month horizon: most technology projects should show tangible results inside a year, with full ROI emerging by 18 months.
  • Measure in the right order: quarter one is adoption, quarter two is leading indicators, quarter three onwards is financial outcomes.
  • Productivity ≠ efficiency: time saved only becomes ROI when it’s redeployed to revenue growth, reduced cost, or risk reduction.
  • Anchor to five drivers: efficiency, revenue growth, quality, productivity, and risk—pick your driver(s) and measure accordingly.
  • Pilot with purpose: roll out to defined cohorts, prove the use case (“bullets”), then scale the winners (“cannonballs”).
  • Tie AI to KPIs: if you deploy Copilot or similar, lift the team’s KPIs (retention, upsell, cycle time) so capacity converts to cash.
  • Pressure-test vendor claims: include integration, enablement, and support in the total cost; avoid the “700% ROI” mirage.
  • Balance innovation and governance: offer sanctioned tools to avoid shadow IT while protecting security and compliance in Australia.
  • Move the forecast now: if the project promises better retention or margin, update next year’s targets today and assign owners.

Watch the episode

Watch on YouTube

Why tech ROI is harder than ever

Digital transformation isn’t new, yet success rates still vary widely: roughly half of projects underdeliver against their original aims, not because the technology fails, but because the business outcome was never defined in financial terms. Generative AI has intensified the pattern. Teams feel pressure to “get involved” quickly, proofs of concept multiply, and enthusiasm becomes the plan. As Ross Sardi notes, that approach breaks the moment you ask: which cell in the forecast changes, and by how much?

The honest answer for many initiatives is “we’re not sure.” That’s precisely why finance must lead with a simple rule: no project proceeds without a mapped outcome, a measurement plan, and a timeline that fits within the rhythm of Australian budgeting cycles.

Set a realistic ROI horizon: 12–18 months

Technology moves quickly. If your business case requires a five-year wait to see a return, the ground will shift beneath you long before payback. Both Ross and Jared advocate for a 12–18 month window, with visible progress inside the first year. This doesn’t imply short-term thinking; it enforces modular delivery and faster feedback loops. If a project needs multiple phases, that’s fine—just ensure each phase has a result that is observable and useful within the year.

Importantly, this aligns with the practical realities of Australian organisations that plan around financial years, enterprise agreements, and local compliance requirements. You can plan long-term while still demanding short-cycle proof.

Quarterly ROI checkpoints that actually work

Many teams try to evaluate ROI at the finish line, long after the runway’s been burned. The smarter sequence looks like this:

  • Quarter 1 (0–90 days): Adoption — Track who is using the tool, how often, and for which workflows. Perfect outcomes are unnecessary at this stage. No adoption means no ROI later.
  • Quarter 2: Leading indicators — Watch cycle times, engagement rates, stage conversions, and throughput. These are the signals that the promised outcomes are plausible.
  • Quarter 3 onward: Financial outcomes — Measure churn reduction, average revenue per client (ARPC) uplift, lower overtime, reduced rework, fewer incidents, or deferred hiring. Tie outcomes to the driver selected.

This cadence protects investment and creates space to pivot early. If adoption stalls, fix enablement or process fit. If leading indicators don’t move, reassess the use case. By the time you reach financial targets, the evidence chain is clear.

The five ROI drivers to frame every business case

To avoid vague justifications, classify each initiative under one or more of these drivers:

  • Efficiency — deliver the same output for lower cost. Examples: less overtime, fewer manual steps, reduced contractor spend.
  • Revenue growth — more sales through higher win rates, faster cycle times, or larger deal sizes and attach rates.
  • Quality — improved customer experience, fewer defects, less rework, stronger retention and advocacy.
  • Productivity — more output per hour. Valuable, but it only becomes ROI once you redeploy the saved time.
  • Risk reduction — lower incident frequency or impact, improved compliance, stronger data governance.

This lens keeps the conversation commercial. It forces clear ownership and creates a measurement plan the board can trust.

Productivity is not efficiency (and why that matters)

Teams often celebrate “time saved” as a win. On its own, it isn’t. If your account managers save 20 minutes per meeting thanks to AI-generated summaries, you’ve increased productivity—and your licence costs. Unless you redeploy that time into activities that raise revenue or reduce cost, you haven’t created a financial return.

Finance’s role is to ask the next question: what will we do with the freed capacity? Will we increase client contact frequency to lift retention? Lift proposal throughput to grow ARPC? Cover more clients per rep without harming service quality? Or reduce overtime and avoid backfill? Pick the pathway and attach the KPIs, then raise the targets so the capacity converts to cash.

Case study: Copilot meeting summaries done properly

Consider Microsoft Copilot automating meeting notes for a sales or account management team. The naive case celebrates faster notes; the finance-grade case spells out the chain from time saved to dollars:

  • Capacity gain — 20 minutes saved per meeting × average meetings per rep × number of reps.
  • Redeployment — reallocate capacity to proactive outreach, QBR preparation, or follow-ups on expansion opportunities.
  • Leading indicators — higher QBR completion, shorter response times, more proposals sent, better stage conversion.
  • Financial outcomes — churn down by a defined percentage, ARPC up, or a headcount backfill avoided due to higher coverage.

To make this real, lift the KPIs immediately. If you don’t raise targets, the saved time will quietly be reabsorbed by meetings, admin, and context switching, and the ROI disappears.
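For finance teams that want to sanity-check that chain, here is a minimal sketch of the capacity-to-cash arithmetic. Every input (minutes saved, meeting volumes, licence cost, redeployment and conversion rates) is an illustrative placeholder, not a figure from the episode.

```python
# Illustrative capacity-to-cash model for an AI meeting-summary pilot.
# Every input below is a placeholder assumption; substitute your own baselines.

MINUTES_SAVED_PER_MEETING = 20        # assumed time saved per meeting
MEETINGS_PER_REP_PER_WEEK = 12        # assumed meeting load per account manager
REPS_IN_COHORT = 10                   # pilot cohort size
WEEKS_PER_YEAR = 46                   # working weeks after leave and public holidays
LICENCE_COST_PER_REP_PER_YEAR = 550   # assumed fully loaded licence cost (AUD)

# 1. Capacity gain: hours freed per year across the cohort.
hours_freed = (MINUTES_SAVED_PER_MEETING / 60
               * MEETINGS_PER_REP_PER_WEEK
               * REPS_IN_COHORT
               * WEEKS_PER_YEAR)

# 2. Redeployment: only a share of freed hours is redirected to retention and
#    expansion work (QBR prep, proactive outreach), and that work converts to
#    revenue or avoided cost at an assumed blended rate per hour.
REDEPLOYMENT_RATE = 0.6               # assumed share of freed time actually redeployed
VALUE_PER_REDEPLOYED_HOUR = 180       # assumed AUD impact per redeployed hour

redeployed_hours = hours_freed * REDEPLOYMENT_RATE
gross_benefit = redeployed_hours * VALUE_PER_REDEPLOYED_HOUR

# 3. Compare against the licence spend for the cohort.
total_cost = LICENCE_COST_PER_REP_PER_YEAR * REPS_IN_COHORT
net_benefit = gross_benefit - total_cost

print(f"Hours freed per year:      {hours_freed:,.0f}")
print(f"Hours actually redeployed: {redeployed_hours:,.0f}")
print(f"Gross benefit (AUD):       {gross_benefit:,.0f}")
print(f"Licence cost (AUD):        {total_cost:,.0f}")
print(f"Net benefit (AUD):         {net_benefit:,.0f}")
```

The exact numbers matter less than the shape of the model: set the redeployment rate to zero and the net benefit goes negative, which is the productivity-is-not-efficiency trap expressed in a single line.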

Design cohorts, not sprinklers

Randomly sprinkling licences across the business is a recipe for noise. Instead, choose a defined cohort with a single use case and clear KPIs. Pilot for 8–12 weeks, instrument the workflow, and collect the evidence. If the signals are strong, scale. If they aren’t, stop or pivot. This “bullets before cannonballs” approach respects budget, avoids change fatigue, and proves value before you expand.

Move the forecast now (or it doesn’t count)

When a project promises better retention, faster billing, or higher ARPC, reflect that in the forecast immediately. Move the target cell, assign an owner, and commit to a quarterly check-in with finance. If your plan is real, it should withstand the scrutiny of a live forecast. If it isn’t ready, pause and sharpen the business case until it is.

Australian realities: compliance, capacity, and culture

Local context matters. Australian businesses operate with specific compliance, privacy, and data residency expectations. Align your AI rollouts with identity and access controls, information protection, and records management from day one. On the capability side, factor in local skills constraints and hiring lead times when you model capacity. Culturally, most organisations don’t want tech initiatives to drive redundancies—so efficiency arguments often rely on attrition absorption, overtime reduction, or deferred hiring. Make that explicit in your ROI plan.

Avoiding the vendor ROI mirage

In a crowded AI market, it’s easy to be dazzled by headline claims. Jared’s advice is consistent: approach with caution. If someone promises “700% ROI,” ask what costs were excluded. The total cost of ownership includes licences, integration work, data mapping, workflow redesign, security reviews, change management, and ongoing support. Set your hurdle rate at no less than what your capital could earn in low-risk alternatives; if the real, fully costed return doesn’t clear that hurdle, the project doesn’t make the cut.

Practical checks you can apply this week:

  • Cost everything — include integration, enablement, and support effort, not just subscription fees.
  • Fit to process — confirm how the tool maps to your actual workflow and data model, not an idealised demo.
  • Downstream impact — ensure postings, reconciliations, and reporting flow cleanly into your general ledger.
  • Sensitivity — model conservative adoption and outcome rates; the case should still hold (see the sketch after this list).
  • References — prioritise real customer outcomes over vendor calculators or marketing anecdotes.
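To make the “cost everything” and “sensitivity” checks concrete, here is a minimal sketch that tests a fully costed return against a hurdle rate under deliberately conservative assumptions. The cost lines, claimed benefit, adoption and outcome rates, and hurdle rate are all illustrative placeholders, not figures from the episode or recommended values.

```python
# Illustrative total-cost-of-ownership and sensitivity check for a vendor claim.
# All figures are placeholder assumptions (AUD, year one); substitute your own.

costs = {
    "licences": 60_000,
    "integration_and_data_mapping": 35_000,
    "workflow_redesign": 15_000,
    "security_review": 8_000,
    "change_management_and_enablement": 20_000,
    "ongoing_support": 12_000,
}
total_cost = sum(costs.values())

# Vendor-claimed annual benefit versus a conservative case: the same claim,
# discounted for realistic adoption and outcome rates.
claimed_benefit = 420_000        # the headline "700% ROI" style number
conservative_adoption = 0.6      # assumed share of users who actually adopt
conservative_outcome = 0.5       # assumed share of the claimed benefit realised
conservative_benefit = claimed_benefit * conservative_adoption * conservative_outcome

hurdle_rate = 0.10               # assumed return available from low-risk alternatives

def roi(benefit: float, cost: float) -> float:
    """First-year ROI: net benefit as a fraction of fully loaded cost."""
    return (benefit - cost) / cost

print(f"Total cost of ownership:      {total_cost:,.0f}")
print(f"ROI on the claimed case:      {roi(claimed_benefit, total_cost):.0%}")
print(f"ROI on the conservative case: {roi(conservative_benefit, total_cost):.0%}")
print(f"Clears the {hurdle_rate:.0%} hurdle? {roi(conservative_benefit, total_cost) > hurdle_rate}")
```

Run the same arithmetic on any pitch: if the conservative case still clears your hurdle, the business case is worth taking further; if it only works on the vendor’s numbers, it doesn’t make the cut.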

The general ledger as the “central nervous system”

Jared frames the general ledger as the business’s central nervous system. Every upstream change—sales process, service delivery, or automation—eventually lands in the ledger. Designing AI initiatives with this in mind ensures data quality, posting accuracy, and faster month-end. It’s also the quickest way to prove value: when the ledger moves predictably, the board can see the impact and the team gains confidence to scale.

From idea to finance-grade business case

Turn a technology proposal into a decision-ready business case with this sequence:

  • 1) Select the driver — efficiency, revenue, quality, productivity, or risk.
  • 2) Define the outcome — specify the forecast cell you will move, by how much, and by when.
  • 3) Map leading indicators — adoption metrics and process signals that prove momentum.
  • 4) Design the pilot — cohort, duration, KPIs, and clear success criteria for scaling.
  • 5) Cost it properly — licences, integration, governance, enablement, and support.
  • 6) Redeploy capacity — document exactly how time saved converts to revenue or reduced cost.
  • 7) Governance — security, privacy, and Australian compliance embedded from day one.
  • 8) Update the forecast — move the targets now; review quarterly with finance.

Quarter-by-quarter example: lifting retention in an Australian services firm

Imagine an Australian managed services provider piloting Copilot meeting summaries and QBR support across a 10-person account management cohort:

  • Driver: quality and revenue (retention and expansion).
  • Target outcome: reduce churn from 10% to 7% and lift ARPC by 3% within 12 months.
  • Quarter 1: 95% licence adoption, QBR templates standardised, follow-up time reduced by 30%.
  • Quarter 2: QBR completion up 20 points, proposals per account up 15%, pipeline stage conversion up 5 points.
  • Quarter 3–4: churn improvement visible in cohorts, ARPC uplift in renewal cycles, overtime down 10% due to better prep.
  • Forecast: churn and ARPC cells moved now; owners named; quarterly finance reviews scheduled.

This is what “capacity to cash” looks like. It’s specific, measurable, and defensible.
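As a rough check, the sketch below converts the churn and ARPC targets above into an annual revenue movement. The client count and ARPC value are hypothetical, chosen only to show the order of magnitude the forecast cells should move.

```python
# Illustrative revenue impact of the pilot targets above.
# Client count and ARPC are hypothetical assumptions for the example.

CLIENTS = 200          # assumed managed-services client base
ARPC = 45_000          # assumed average revenue per client (AUD per year)

baseline_churn = 0.10
target_churn = 0.07
arpc_uplift = 0.03

# Revenue retained by cutting churn from 10% to 7%.
retained_clients = CLIENTS * (baseline_churn - target_churn)
retained_revenue = retained_clients * ARPC

# Revenue added by a 3% ARPC uplift across the clients who stay.
remaining_clients = CLIENTS * (1 - target_churn)
expansion_revenue = remaining_clients * ARPC * arpc_uplift

print(f"Clients retained vs baseline: {retained_clients:.0f}")
print(f"Retention revenue (AUD):      {retained_revenue:,.0f}")
print(f"Expansion revenue (AUD):      {expansion_revenue:,.0f}")
print(f"Total forecast movement:      {retained_revenue + expansion_revenue:,.0f}")
```

Swap in your own client base and ARPC, and the same two lines of arithmetic tell you whether the forecast movement justifies the fully costed pilot.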

When growth is flat

Not every business is in expansion mode. If you’re stable or consolidating, you can still generate ROI without job losses. Options include absorbing attrition without backfill, reducing overtime, and deferring planned hires while holding service quality. These efficiency outcomes are legitimate ROI—just state them plainly and measure them the same way you would a top-line uplift.

Practical KPI menu for finance-led AI pilots

  • Revenue: win rate, average deal size, attach rate, pipeline velocity, proposals per rep.
  • Quality/retention: churn rate, NPS/CSAT, QBR completion, time-to-follow-up, ticket rework rate.
  • Efficiency: cost per ticket, cost per invoice, overtime hours, contractor spend, days sales outstanding.
  • Productivity (leading): meetings per rep, emails answered same day, first-response time, tasks closed per week.
  • Risk: incident frequency, policy exceptions, unresolved vulnerabilities, audit findings.

Pick a small set that aligns to your driver, baseline them, and track weekly. Then let the quarterly rhythm tell the story.

Culture: keep the goal steady, flex the plan

Finance leadership sets the tone. When the outcome is clear and non-negotiable, the conversation shifts from “whether we’ll hit it” to “how we’ll hit it.” Weekly check-ins and quarterly reviews create a culture of small, confident adjustments rather than last-minute scrambles. That’s how ROI is realised: through consistent ownership, modest course corrections, and a live forecast everyone trusts.

From episode to execution: actions for this month

  • Pick one use case and one cohort; name the forecast cell it will move.
  • Commit to a 12–18 month horizon with quarterly checkpoints: adoption → leading indicators → financial results.
  • Document a capacity redeployment plan so productivity becomes revenue, margin, or lower churn.
  • Lift KPIs immediately for the pilot cohort; publish them to the team.
  • Cost the whole effort—licences, integration, enablement, and governance—before you start.
  • Align with Australian compliance and security requirements; use sanctioned platforms to avoid shadow IT.
  • Update the forecast now and assign named owners for the moved targets.

Final word

As Brendan Ritchie, Ross Sardi and Jared Morris emphasise, the ROI gap isn’t a technical issue—it’s a management choice. Define the financial outcome, measure in the right order, and make a plan to convert capacity into dollars. Do that, and your AI and automation investments will stop being experiments and start compounding into predictable, measurable results for the Australian market.
