You Need Automation. But You Don’t Always Need Agentic and You Almost Never Need Gen-AI!

In our last post we dove into how analytics must drive source-to-pay, because most of the source-to-pay process is straightforward (and has been capable of being automated for the last decade), non-strategic, and low-to-medium value, and should therefore be automated and touch-free.

Strategic Sourcing is an activity that should be focussed on high risk, high complexity, and/or high value categories and occasionally focussed on medium risk, medium complexity, and/or medium value categories where there is incomplete information or insufficient product/category history, atypical turbulence in the market, or highly particular requirements that just came into effect as a result of new regulations. That’s a minority of products/categories, not a majority.

Procurement should only be focussed on significant exceptions. And, with proper, modern systems offering proper e-document integration and exchange, most documents should arrive in standardized digital formats, and most of the processing should, thus, be fully automated. Most of what is non-standard will be PDFs in relatively standard layouts that LLMs can process to 95% accuracy, requiring only a few human verifications and field completions. The days of the 20-person invoice processing team should be long gone, as the tech, even for standardized PDFs, has been in production by the leading players for over 8 years. Invoice discrepancies can be auto-identified, suppliers auto-notified, suggested corrections auto-included, one-click acceptance emails/screens for the suppliers provided, and most contingencies accounted for. Only in the rare situations where suppliers refuse to accept a correction, invoices arrive in very non-standard or handwritten formats, payments don’t go through, etc. should a human need to get involved. In short, 95% to 99% of all documents and transactions that flow through Procurement should be 100% automated.
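As a minimal sketch of what this kind of rule-based discrepancy identification looks like (the field names and the 2% price tolerance are illustrative assumptions, not any vendor's actual implementation):

```python
# Minimal sketch of rule-based invoice discrepancy detection.
# Field names and the 2% price tolerance are illustrative assumptions.

def check_invoice(invoice, po, price_tolerance=0.02):
    """Compare an invoice line to its PO/receipt line; return a list of discrepancies."""
    issues = []
    # Quantity check: billed quantity should not exceed what was received
    if invoice["qty"] > po["qty_received"]:
        issues.append(f"billed qty {invoice['qty']} exceeds received {po['qty_received']}")
    # Price check: unit price must be within tolerance of the contracted rate
    allowed = po["unit_price"] * (1 + price_tolerance)
    if invoice["unit_price"] > allowed:
        issues.append(f"unit price {invoice['unit_price']:.2f} above contract rate "
                      f"{po['unit_price']:.2f} plus tolerance")
    return issues

invoice = {"qty": 100, "unit_price": 10.60}
po = {"qty_received": 100, "unit_price": 10.00}
print(check_invoice(invoice, po))  # flags the 6% price overage
```

An empty list means the invoice flows straight through; a non-empty list feeds the auto-notification with suggested corrections described above.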

But most of this doesn’t need experimental Agentic AI or Gen-AI. Classic RPA will do just fine. For most of the rest, Adaptive RPA, with a bit of Machine Learning / auto-suggestion based on human exception processing, will do the trick nicely. If you look closely at current-generation (A)RPA, Machine Learning, Optimization, and Predictive Analytics and walk through the full source-to-pay process, there is very little that can’t be automated without Gen-AI LLMs or experimental Agentic Systems. In Sourcing, there are many standard (seven-step) processes that can be completely automated based on data analysis, data-based risk assessments, goal definitions, and optimization. RFX (including e-Auctions) can be fully automated and, from the time you specify a product/category to source, everything through to the award can be automated (including the demand pull/calculation from other systems).

When it comes time to contract, if you have standard templates or a large clause library, the system can automatically create the contract from the template and RFP responses, integrate DocuSign, and auto-execute it. If you don’t, or if you have to use the supplier’s paper, then you might use an LLM to create a draft for human review and/or analyze the supplier’s paper for terms, pricing (to make sure it matches the bid), and potential risks, as well as suggested revisions, before you sign. Gen-AI/LLMs are unnecessary, but useful on a point basis if you don’t have a good historical equivalent of a solution like Coupa Exari or iCertis.

Supplier onboarding can be fully automated with RPA-powered dynamic workflows and third-party data ingestion, as can risk and compliance analysis — no modern Agentic solutions needed.

Then we get to automatic inventory monitoring and reorder-point-based re-orders, receipt creation from inventory integration, and invoice processing in e-Procurement, all of which have been around for at least a decade. Automated approvals subject to tolerances, rules, and pre-approvals — as well as predictive analytics on payments for new or one-time suppliers/orders or (slightly) out-of-tolerance invoices — can automate the entire invoice-to-pay process.
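The tolerance-and-rules approval gate at the heart of this can be sketched as follows (the thresholds, routing labels, and new-supplier rule are illustrative assumptions, not any specific vendor's logic):

```python
# Illustrative auto-approval gate: thresholds, routing labels, and the
# new-supplier rule are assumptions for the sketch, not vendor logic.

def route_invoice(invoice_total, po_total, supplier_is_new,
                  tolerance_pct=0.01, review_band_pct=0.05):
    """Decide whether an invoice auto-pays, gets a predictive-risk check,
    or is routed to a human for exception handling."""
    deviation = abs(invoice_total - po_total) / po_total
    if supplier_is_new:
        return "predictive-risk-check"   # new/one-time suppliers get extra scrutiny
    if deviation <= tolerance_pct:
        return "auto-pay"                # within tolerance: fully automated
    if deviation <= review_band_pct:
        return "predictive-risk-check"   # slightly out of tolerance
    return "human-review"                # significant exception

print(route_invoice(1005.0, 1000.0, supplier_is_new=False))  # → auto-pay
```

Everything that returns "auto-pay" or passes the predictive-risk check flows through untouched; only "human-review" results consume a person's time.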

We can get through the entire process on best-of-breed, classically oriented RPA tech with some machine learning that processes human decisions in exception management, alters or augments the rules (and guardrails), and auto-processes the same type of situation next time. We quickly get to 95%+ automated throughput for any task that should be mostly automated, and a top human employee with BoB (A)RPA solutions and some augmented intelligence packages for analytics and research becomes 10 to 20 times as productive as they would have been in the past.
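The "learn from human exception handling" loop described above can be as simple as promoting a resolution to an automatic rule once humans have applied it consistently enough times (the promotion threshold and the exception/resolution labels below are illustrative assumptions):

```python
# Sketch of learning from human exception handling: once a human resolves the
# same exception type the same way N times, auto-apply that resolution.
# The threshold and the exception/resolution labels are assumptions.
from collections import Counter, defaultdict

class ExceptionLearner:
    def __init__(self, promote_after=3):
        self.promote_after = promote_after
        self.history = defaultdict(Counter)   # exception type -> resolution counts
        self.auto_rules = {}                  # learned exception type -> resolution

    def record_human_decision(self, exception_type, resolution):
        """Log a human's resolution; promote it to an auto-rule if consistent."""
        self.history[exception_type][resolution] += 1
        if self.history[exception_type][resolution] >= self.promote_after:
            self.auto_rules[exception_type] = resolution

    def handle(self, exception_type):
        """Return the learned resolution, or None to route to a human."""
        return self.auto_rules.get(exception_type)
```

After three identical human decisions on, say, a hypothetical "freight-surcharge-mismatch" exception, `handle()` starts returning the learned resolution instead of routing to a person — exactly the rule-augmentation behaviour described above.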

That’s the real future of Procurement. Small, top-talent teams (mentoring small emerging top-talent teams) doing the work of teams five to ten times their size, doing it better, and delivering more value than anyone would have believed possible with best-of-breed tools. Not error-prone, hallucinatory, agentic systems that work well in demos and a few select categories, and go all over the place in reality (and then try to hide their mistakes like Nick Leeson [who single-handedly collapsed Barings Bank] until they do a modern equivalent of the 2005 J-Com trade and cost you hundreds of millions of dollars on your key billion dollar product line).

So while you need to modernize at all costs, you don’t need to go full Agentic on unproven solutions. Get 90% of the way on tech that has been proven, where you can control the automation level until you get comfortable with automation and learn where you can safely hand tightly boxed “decisions” to the machine (where well-defined calculations would determine your decision the majority of the time) and where you can’t. Otherwise, you’ll just end up being another member of the 94% AI failure camp. That’s not a statistic you want to be part of, especially given the cost of this tech today (and the increased cost tomorrow as energy grids start to break and the compute costs for modern AI tech go through the proverbial roof).

Roll Up The Space To … Lose!

Over the past decade, a number of the big PE firms in our space decided that a “roll up the space to win” strategy was the right approach and bought a large number, and in some cases dozens, of assets in the Procurement space globally. Vista, Main, KKR, Accel-KKR, and Thoma Bravo all followed this strategy in the hopes that with enough assets, they’d control enough of the space that controls the transactions to give them a long term home inside a significant number of major corporations.

It was a great plan, and a great play at the time (as it worked out well for them), but one that may backfire for anyone who is late to the party. The Age of AI, coupled with the realization that bit-pushing applications don’t cost very much anymore, means that these nine- and ten-figure plays are not going to maintain their market dominance, or the income streams that depend on seven-figure annual subscriptions, for much longer.

As per THE PROPHET‘s recent piece on An LP, an AI Builder, and a PE Advisor Walk into a Bar, the time for the traditional players is coming to an end — especially the mega-suites that thought they could charge 7-figure license fees until the end of time. Whether or not Agentic AI can fully replace them (it can’t, by the way), the price compression is changing the game. (Especially when you’ve been warned that Now is NOT a Great Time to Buy … a Mega-Suite.)

If you don’t have time to read the piece, THE PROPHET believes that Agentic AI is going to effectively boil the ocean and cook all of the traditional plays that charge high six and seven figures a year for relatively simple tasks — tasks these Agentic AI solutions can mostly automate for a tenth of the cost (until compute costs skyrocket, but, still, that’s significant downward price pressure now). If you don’t trust AI, even better, since applications built on modern stacks in the last 5 years, with the ability to wrap discrete tasks in micro-services and orchestrate them into dynamically configurable workflows that exactly match your needs, cost about a fifth of what these big plays do and do more for you. Either way, as has been indicated many times on this blog, unless your Source-to-Pay needs are in the top 10%, you probably don’t need to be paying more than 250K for your Source-to-Pay.

Now, no one can see the future with clarity, everything is in flux, and we could both be wrong. But all software (like hardware) depreciates with time, tech always advances faster than we would like to think, organizations in unregulated marketplaces under severe cost pressure are always looking for ways to cut costs, and providers running on old stacks that are hard to adapt and support aren’t going to be able to keep up or keep costs low enough. There’s going to be a massive shift, and any major players not in the public sector (where contracts tend to extend beyond our professional lifetimes due to the slow pace of change in government organizations) are at risk of being shifted out of business.

So, like THE PROPHET, when it comes to the continued dominance of traditional SaaS in a big PE portfolio, I’m not buying it either.

Maybe If Procurement Had Embraced Magic and Logic Decades Ago …

… it would not be in such dire straits today.

The Procurement Ledger recently ran an article on Agility with Purpose, featuring an interview with Jeanette Hübsch, a Global Procurement Leader and Senior Consultant with Proxima, who said that in Marketing Procurement, agility isn’t just a buzzword—it’s a necessity because the landscape shifts constantly with evolving consumer trends, digital innovation, and now AI-driven content creation. Procurement must move beyond being a cost controller; it needs to be a business enabler, a partner, and a source of innovation. And she’s right.

But she goes on to say that in indirect procurement, disruptions don’t always look like delayed shipments—they might be sudden campaign pivots, regulatory changes, or shifting budgets. Agility means being able to respond without slowing the business down, and to accomplish this her focus is on building an adaptable supplier ecosystem—trusted partners with modular contracts, alternative sourcing strategies, and the ability to flex with us. Strong relationships, clarity, and scenario planning make a big difference.

In other words, it seems that Marketing has understood what Procurement needs, and has needed, for at least the last two decades, yet in most organizations Marketing is still the sacred cow that Procurement is not allowed to touch (when Marketing Procurement should be designed around good strategies that come from good Marketing Provider Management, and Procurement should be allowed to shoot any sacred cow ready for pasture).

Thought leaders have been promoting the embrace of good procurement by Marketing, and marketing flexibility by Procurement, for at least two decades now (and we first started talking about it in our two-part piece on Magic and Logic [Part I and Part II] two decades ago), especially since Decideware (founded in 1999 in Australia), one of the leaders in Marketing Procurement Technology, was just starting to break into the North American market at that time.

And even if they were a little slow on the uptake, it should have been adopted when some thought leaders tried to make it vogue in the mid 2010s. A decade ago I penned a two-part piece on what to do if Marketing Mayhem Got You Down? Maybe it’s time to master the Marketing Way (Part 1 and Part 2) and ran a two-part series by Brian Seipel (of Source One, which was acquired by Corcentric) on why you should Ditch the Pepsi Blues, Already: Become a Marketing Procurement Asset (Part I and Part II). By then Decideware was taking marketing magic to a whole new level, but marketing procurement (and the best practices it could bring) was still being largely ignored, even when marketing needed procurement more than ever. (In fact, Marketing’s recognition of Procurement didn’t come until mid 2019, and within a year COVID had shut everything down.)

While marketing suppliers (i.e. advertising agencies) are often very different from custom manufactured part suppliers (i.e. factories), and the categories need to be managed differently because of that, in a world where supply chains are being broken daily by geopolitics, unrest, and natural disasters, both require agility, creativity, multiple relationships and multi-sourcing, risk monitoring, mitigation, and management, and the ability to react as needed. The underlying best practices required by both sides are similar in theory and the lessons each side can teach the other can make both sides stronger.

So, if you want to be a better Procurement Pro, think like a Marketer, and if you truly want to be an effective Marketer, include some Procurement thinking. (And read the full interview with Jeanette Hübsch.) Remember that value has a cost side as well as a revenue side and both parties MUST manage both.

Today’s Procurement Leaders Aren’t Enough for Tomorrow

Mr. Matthew Buckingham recently posted on LinkedIn that the strongest Procurement leaders today share three traits:

  • (commercial) curiosity — and an understanding of where value is
  • (constructive) courage — and the willingness to challenge the business
  • (crystal) clarity — and the ability to simplify complexity

These are all great, and necessary, skills, but not enough to survive tomorrow where supply chains break daily, technology is in flux, and your processes can’t adapt (fast enough).

In order to survive the simultaneous supply chain (due to unpredictable, and constantly escalating, geopolitical situations) and technology (due to the Agentic AI [Hype] wave) turmoil that is coming, tomorrow’s procurement leader is also going to need:

  • (colossal) creativity — to build a flexible supply chain that can change on a moment’s notice
  • (constant) crusading — to convince the C-Suite that traditional Procurement is dead

The organization is going to have to

  • dual/tri-source everything from at least two/three locales,
  • have contracts with primary and secondary couriers in each locale,
  • be aware of alternate ports / commercial air cargo carriers out of alternate airports for shipping (and have them on speed dial in case of need),
  • have potential back up suppliers (who came in second) in case of supplier failure,
  • have near real-time monitoring in place not just for communications, missed communications, missed milestone dates, and other indicative KPIs, but for events that are likely to impact a supplier’s performance and/or availability,
  • have pre-defined response plans for region, supplier, carrier, [air]port, etc. availability issues, and
  • the ability to reallocate and change plans literally overnight …
  • while treating long-term contracts (or at least long-term expectations of fulfillment) as a thing of the past … there is no guaranteed supply, or even price protection, if the supplier becomes unavailable or goes bankrupt

Proactively building a supply chain and supporting technology infrastructure capable of being reactive in real time is going to take a lot more creativity and crusading than what was ever needed before in Procurement.

Curiosity, Clarity, and Courage are just the baseline.

Find a leader who’s ready!

Next Generation Analytics NEEDS to Surface Root Cause Analysis …

… but relationship modelling alone is NOT going to get us there!

In another great article by Xavier Olivera of Hackett Spend Matters, he dives into the topic of how procurement analytics needs to work – from visibility to orientation because current procurement analytics offerings, while reasonably good and actionable at the process level compared to where they were a few years ago, are poor at helping users orient themselves when a specific goal or problem comes into focus.

He notes that when a procurement leader decides they want to improve X, the challenge is no longer visibility. It is knowing which analytics matter for that objective and which do not. But all the analytics platforms give them today is metrics; they don’t give them direction. Even if the user knows which metric to drill into first (because it is the highest, the lowest, or an outlier), all they can see is the data that contributed to that metric: for spend, the transactions; for a supplier rating, the Net Promoter Scores; for a process, the time in each step.

The users see the immediate “what”, but not the “why”. Why were the transactions high? Is this the market price, has the quantity gone up, or is the supplier charging above the agreed-upon rate? For a rating, is it because the performance wasn’t up to spec, the delivery is consistently late, or the service/interactions are very poor? For a process, they can see which step took too long (compared to average) only if they can dig down another level, and even then they can’t see why it took too long.

According to Xavier, in situations like these, analytics has to work differently. When a procurement leader wants to improve contract compliance, the starting point should not be a full review of all compliance metrics, benchmarks and dashboards. It should be a guided path that surfaces the specific reports, KPIs and comparisons most likely to explain the gap, given the organization’s operating context.

Which is a great start, but just surfacing the reports, KPIs, and comparisons that are statistically relevant or deviations from a norm doesn’t explain the gap; it just captures the gap. A KPI only becomes meaningful once it is examined in the right context, and it only becomes useful if there is enough data to allow the system to determine, with high statistical likelihood, the root cause and the actions that could address that root cause (and not just the symptom these systems surface today).

Xavier then tells us that the ability to orient analytics effectively depends on the data’s structure, which is partially right, but doesn’t quite capture the entire requirement. He goes on to state that Procurement outcomes do not arise from isolated transactions … they emerge over time from relationships, and that analytics is most effective when the underlying data model can express these relationships explicitly. Which is closer. But the reality is that this still isn’t enough for proper root cause analysis.

Relationship modelling is critical, because without relationships you can’t trace the end metric back to the source data; but just being able to identify the source data only tells you what is fundamentally wrong, not why or what you need to do about it.

That’s where analytics needs to get to.

If your steel category transactions are high, you can trace back to the contracts and check whether the rates are per contract, the shipping is per carrier quote, the tonnage is as expected, and the breakdown across steel categories is appropriate for your current product lines or construction products. If any rates or tonnage don’t add up, you know the issue is in the invoices — but you don’t know why they are being paid. Were the new rates not properly encoded? Were the tolerances within acceptable limits and the automatic OK-to-Pay issued despite the mismatch? Are category managers blindly overriding the system because the supplier was threatening late shipments if payments didn’t appear on time?

In Xavier’s example, if contract compliance is low, why? Is it just a few suppliers, or even a single supplier, across a category? If just a few suppliers, are they unaware of the contract because of personnel changeover? Did a new industry regulation adversely affect them? Was it actually the fault of a carrier or sub-tier supplier they had no control over? This is what you need to determine to ensure that compliance actually improves and stays improved.

In other words, you need more than the data, you need models that capture what the data element used in a KPI is, who or what creates the data in the first place (and how they create that data), what the data range and typical mean/median/mode values are, what positively or negatively impacts the data, and what can be done if a shift is desired in the data.
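To make this concrete, such a model can be sketched as a small data structure per KPI data element (all field values below — the ranges, drivers, and levers — are hypothetical examples, not a real product's schema):

```python
# Sketch of a knowledge-based model for a KPI data element: the value's
# meaning, its source, its typical range, its drivers, and its levers.
# All field values used below are hypothetical examples.
from dataclasses import dataclass, field

@dataclass
class DataElementModel:
    name: str
    source: str                    # who or what creates the data
    typical_range: tuple           # (low, high) expected values
    influenced_by: list = field(default_factory=list)  # upstream drivers
    levers: list = field(default_factory=list)         # actions that can shift it

    def explain_deviation(self, value):
        """Turn a raw value into a directional explanation with next steps."""
        low, high = self.typical_range
        if low <= value <= high:
            return f"{self.name} = {value} is within its typical range"
        direction = "above" if value > high else "below"
        return (f"{self.name} = {value} is {direction} its typical range; "
                f"check drivers {self.influenced_by} and consider levers {self.levers}")

compliance = DataElementModel(
    name="contract_compliance_rate",
    source="P2P transaction matching",
    typical_range=(0.85, 1.0),
    influenced_by=["supplier personnel turnover", "new regulations", "carrier failures"],
    levers=["supplier re-education", "contract renegotiation", "catalog updates"],
)
print(compliance.explain_deviation(0.62))
```

With this metadata attached, a deviation surfaces not just as a red number on a dashboard but with its likely drivers and candidate corrective actions alongside it.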

Without this intelligence baked into the model, even if the root data in the system can be uncovered, a user won’t understand what it means or where to start doing something about it. That’s where analytics needs to get to for analysts to be proactive instead of reactive.

And this is another area where the Busch-Lamoureux approach to Exact Purchasing will help. When you define your categories at a granular level appropriate to the quadrant of the pocket cube they occupy, you not only know what influences their cost, but also what influences their supply, what defines their quality, and what role third parties (that you may have to monitor) play. You have the foundations for doing real proactive analysis and identifying not only what “good” is but what is most likely contributing to a “not good” metric or data point, and what standard options exist to address, and try to improve, that data point (as you need to mitigate high-risk and manage highly complex categories at a detailed level).

In other words, the future is knowledge-based models that capture more than data points and calculations: they capture what the data points actually mean and what factors (represented by other data points) directly influence the data points you are analyzing.