
Demand Planning and Advanced Planning Systems: How to Run an RFx Process That Doesn't End in Regret

Written by:
Trace Insights
Publish Date:
Feb 2026
Topic Tag:
Technology

Ready to turn insight into action?

We help organisations transform ideas into measurable results with strategies that work in the real world. Let’s talk about how we can solve your most complex supply chain challenges.


Choosing an Advanced Planning System is one of those decisions that looks straightforward on the surface and turns out to be fiendishly difficult in practice. The vendor landscape is broad. The functionality overlaps are significant. Every platform claims AI-powered demand forecasting, scenario modelling, inventory optimisation and seamless integration with your ERP. The demos are polished. The reference customers are carefully curated.

And yet, a significant proportion of APS implementations either fail outright, deliver far less value than the business case projected, or land in a state where the organisation is technically "live" but planners are still running their actual decisions through spreadsheets because the system doesn't work the way they need it to.

The root cause, in most cases, isn't bad technology. It's a selection process that failed to answer the right questions. The RFx was structured as a technology procurement exercise — feature lists, architecture requirements, integration specifications, pricing schedules — when it should have been structured as a planning capability assessment: does this system, configured for our data, our products, our network and our planning process, actually produce better plans than what we're doing today?

That distinction — between buying a technology product and selecting a planning capability — is what separates organisations that get genuine value from their APS investment from those that spend 18 months implementing a system nobody trusts.

Why APS selection is different from other technology procurement

Before diving into how to structure the RFx process, it's worth understanding why demand planning and APS selection doesn't follow the same playbook as procuring an ERP, a WMS, a TMS or most other enterprise technology.

The problem you're solving is harder to define. When you select a warehouse management system, the functional requirements are relatively concrete: receive stock, put it away, pick it, pack it, ship it, count it. The processes are physical, observable and well-understood. When you select a demand planning or supply chain planning system, the requirements are inherently more abstract. You're trying to improve the quality of decisions about uncertain futures — how much to make, where to hold stock, when to order, how to allocate constrained supply. The "right answer" changes daily. The value is in the system's ability to help planners navigate ambiguity, not execute a defined process.

Vendor capabilities genuinely vary in ways that are hard to assess. The Gartner Magic Quadrant for Supply Chain Planning Solutions now includes more than 20 vendors. The Leaders — Kinaxis, Blue Yonder, o9 Solutions, Logility and others — all claim comprehensive functionality across demand planning, supply planning, inventory optimisation, S&OP and production scheduling. But underneath those claims, the actual capabilities differ substantially. Some platforms are genuinely strong in statistical and machine-learning forecasting but weaker in constraint-based supply planning. Others excel at interactive scenario modelling but struggle with the algorithmic optimisation that drives inventory decisions. Some are purpose-built for specific industries — retail, FMCG, process manufacturing — while others are horizontal platforms that require significant configuration to work well in any specific context.

A standard RFx feature checklist — "does the system support promotional uplift modelling? Yes/No" — doesn't surface these differences. Every vendor will tick the box. The question isn't whether the feature exists but whether it works well enough, in your context, with your data, to make materially better decisions than your current approach.

The implementation matters as much as the software. An APS is not a system you install and switch on. It's a system you configure, calibrate and continuously tune — and the quality of that work determines whether it produces plans worth following. The same platform can deliver transformative results in one organisation and be a $2 million shelf ornament in another, depending on how it was implemented, how the data was structured, how the planning processes were designed, and whether planners were genuinely brought along through the change.

This means the RFx process needs to evaluate not just the software but the implementation approach, the partner ecosystem, the support model and the vendor's track record in environments comparable to yours.

The six mistakes that derail most APS selection processes

Having supported organisations across FMCG and manufacturing, retail and consumer, and other sectors through APS selection and implementation, we see the same patterns of failure repeatedly. Understanding them is the first step to designing a process that avoids them.

Mistake one: starting with the vendor shortlist instead of the planning problem

Too many APS selection processes begin with a list of vendors and a request for information rather than a clear articulation of what the organisation actually needs from its planning capability. What are the specific planning decisions that need to improve? Where is forecast accuracy weakest and why? Which inventory positions are structurally wrong — too much in some places, not enough in others? Where are supply and demand decisions disconnected? What does the S&OP process need that it doesn't currently have?

Without clear answers to these questions — grounded in data, not assumptions — the RFx process has no anchor. It becomes a general-purpose technology evaluation rather than a targeted assessment of which system best addresses the organisation's specific planning challenges.

This upfront work — what we'd describe as planning maturity assessment and requirements definition — is the most important phase of the entire selection process. It determines everything that follows: the evaluation criteria, the weighting, the scenarios for demonstration, and ultimately the commercial justification for the investment.

Mistake two: evaluating against feature checklists

The traditional approach to technology RFx involves compiling a requirements matrix — sometimes running to hundreds of rows — with each requirement scored on a scale (e.g. "fully met, partially met, not met, requires customisation"). Vendors complete the matrix, an evaluation panel scores them, and the highest-scoring platform is identified.

This approach works reasonably well for transactional systems with well-defined functional requirements. It works poorly for APS, for two reasons. First, the requirements that matter most in planning — forecast accuracy, scenario modelling quality, user experience for planners, speed of exception management — are qualitative and contextual, not binary. Second, vendors are skilled at completing these matrices in ways that maximise their score without necessarily reflecting how the system will perform in your environment. A planning system might technically "support" promotional uplift modelling, but if it requires three months of custom configuration and a data science team to maintain, that's a very different proposition from one where it's a native, well-tested capability.

Mistake three: accepting canned demos

Every APS vendor has a demonstration environment loaded with sample data and pre-configured scenarios designed to make the system look spectacular. These demos are useful for understanding the user interface and general approach, but they tell you almost nothing about how the system will perform with your data, your products, your demand patterns and your network complexity.

The organisations that make the best APS selection decisions insist on scripted demonstrations using representative samples of their own data. This doesn't mean a full proof-of-concept at the RFx stage — that comes later — but it does mean providing shortlisted vendors with a defined dataset and a set of planning scenarios that reflect your real-world challenges, then evaluating how each system handles them. The difference in insight between a canned demo and a data-driven demonstration is enormous.

Mistake four: underweighting implementation and change

Most RFx evaluation frameworks allocate the majority of their weighting to software functionality and price. Implementation approach, partner capability, change management methodology and ongoing support model might get 10-15% of the total score. This is backwards for APS, where the implementation and change journey typically determines 60-70% of the eventual outcome.

A system with excellent algorithmic capabilities but a weak implementation partner, a rigid configuration approach, or no credible change management methodology is a worse bet than a system with good-enough algorithms backed by a team that understands your industry, your data challenges, and how to get planners actually using the system to make decisions.

Mistake five: treating cost as licensing cost

APS pricing models vary enormously — from per-user licensing to volume-based pricing to flat annual subscriptions with modular add-ons. But the licensing cost is typically 30-40% of the total cost of ownership. Implementation services, data engineering, integration development, testing, training, change management, and ongoing support and optimisation make up the rest.

RFx processes that evaluate cost on licensing alone — or that allow vendors to quote implementation at unrealistically low levels to win the deal — set the organisation up for budget overruns and scope compromises during implementation. The RFx should require vendors and implementation partners to provide a realistic total cost of ownership estimate across a defined horizon (typically five years), including all the components that actually drive spend.
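A simple way to keep TCO honest in the RFx template is to model it explicitly. The sketch below shows one way to do that in Python; every figure is a hypothetical placeholder, not a benchmark, and the cost categories are the illustrative ones named above.

```python
# Illustrative five-year total cost of ownership model for an APS investment.
# All dollar figures are invented placeholders, not benchmarks.

def five_year_tco(costs: dict) -> dict:
    """Sum one-off and recurring cost components over a five-year horizon."""
    one_off = (costs["implementation"] + costs["integration"]
               + costs["data_engineering"] + costs["training_and_change"])
    recurring = (costs["annual_licensing"] + costs["annual_support"]) * 5
    total = one_off + recurring
    return {
        "total": total,
        "licensing_share": costs["annual_licensing"] * 5 / total,
    }

example = five_year_tco({
    "annual_licensing": 300_000,    # assumed subscription fee
    "annual_support": 120_000,      # assumed support and optimisation retainer
    "implementation": 1_200_000,    # assumed services estimate
    "integration": 350_000,         # assumed ERP/data integration build
    "data_engineering": 250_000,    # assumed data preparation effort
    "training_and_change": 300_000, # assumed change management programme
})
# With these assumed inputs, licensing lands at roughly a third of total spend,
# consistent with the 30-40% range described above.
```

Requiring every vendor to populate the same itemised template makes low-balled implementation quotes visible before contract, not during delivery.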

Mistake six: ignoring the planning process and people dimension

An APS is a tool. Its value depends entirely on the planning process it supports and the people who use it. Yet most RFx processes focus almost exclusively on the tool and pay limited attention to whether the organisation is ready to use it effectively.

Questions about planning and operations process design — how will the S&OP process change? What will the planner's daily workflow look like? How will exceptions be managed? What governance will ensure the system stays calibrated? — are rarely part of the vendor evaluation. Neither are questions about strategic workforce planning — do we have the right planning roles? Do our planners have the analytical skills to use an advanced system effectively? What training and development is needed?

These aren't implementation details to be figured out later. They're fundamental to whether the investment will deliver returns, and they should inform the selection decision.

How to structure an APS RFx process that works

With those pitfalls in mind, here's a practical framework for running an APS selection process that actually surfaces the information needed to make a good decision.

Phase one: planning capability assessment (4-6 weeks)

Before engaging the market, invest in understanding your own planning capability — where it's strong, where it's weak, and where the biggest opportunities for improvement sit.

This means conducting a structured assessment of current planning processes: demand planning, supply planning, inventory management, S&OP, production scheduling. It means analysing forecast accuracy by product segment, customer and time horizon. It means quantifying the inventory opportunity — how much working capital is tied up that shouldn't be, and where service levels are being missed despite high stock levels. It means mapping the data landscape: what's available, what's reliable, what's missing, and what integration is needed.
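The forecast accuracy baseline described above can start very simply. The sketch below computes WAPE (weighted absolute percentage error) per segment from a list of (segment, actual, forecast) records; the segment names and numbers are invented for illustration, and a real baseline would also cut by customer and time horizon.

```python
# Sketch of a segment-level forecast accuracy baseline.
# WAPE = sum of absolute errors / sum of actuals, which is more robust for
# low-volume items than averaging per-row percentage errors.
from collections import defaultdict

def wape_by_segment(records):
    """Return WAPE per segment from (segment, actual, forecast) tuples."""
    err = defaultdict(float)
    vol = defaultdict(float)
    for segment, actual, forecast in records:
        err[segment] += abs(actual - forecast)
        vol[segment] += actual
    return {s: err[s] / vol[s] for s in err if vol[s] > 0}

# Hypothetical history: promotional lines versus stable base demand.
history = [
    ("promo_fmcg", 1200, 900), ("promo_fmcg", 800, 1100),
    ("stable_base", 500, 480), ("stable_base", 520, 510),
]
accuracy = wape_by_segment(history)
# In this invented example the promotional segment shows roughly 30% error
# while the stable base is under 5% — the kind of gap that should anchor
# the requirements definition.
```

Segments where error is concentrated tell you which forecasting capabilities actually matter in the RFx, rather than scoring every vendor on everything equally.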

The output of this phase is a planning capability baseline and a set of prioritised requirements — not a generic feature list, but a specific articulation of the planning decisions that need to improve, the data and analytical capabilities required to improve them, and the process and organisational changes that will be needed alongside the technology.

This is strategy and network design work as much as it is technology work. It ensures the RFx is anchored in business value rather than vendor marketing.

Phase two: market scan and shortlisting (2-3 weeks)

With clear requirements in hand, conduct a structured market scan to identify the vendors and implementation partners most likely to be a good fit. This should consider the vendor's strength in your specific industry (FMCG, manufacturing, retail, resources), the maturity of their demand planning and forecasting capabilities relative to your needs, their track record in the ANZ market (including local implementation partners and support), their architecture and integration approach relative to your ERP and data landscape, and their pricing model and how it scales with your business.

The goal is to shortlist three to four vendors for detailed evaluation — enough for genuine competitive tension, not so many that the evaluation becomes unmanageable. Gartner's Magic Quadrant and Critical Capabilities reports are useful inputs here, but they should be one data point among several, not the sole basis for shortlisting. As independent analysis has noted, the Magic Quadrant format tends to reward vendors with broad market presence and comprehensive feature claims without necessarily distinguishing between deep capability and surface-level functionality.

Phase three: RFx design and issue (3-4 weeks)

The RFx document itself should be designed to generate the specific information needed for a high-quality evaluation — not to demonstrate procurement rigour through volume of documentation.

The most effective APS RFx documents we've seen include several key components. First, a business context section that gives vendors genuine insight into the organisation's planning challenges, data environment and improvement priorities — because better-informed vendors produce better responses. Second, a focused set of functional requirements organised around planning capability areas (demand planning, supply planning, inventory optimisation, S&OP, analytics) rather than exhaustive feature checklists. Third, a set of planning scenarios that vendors will be asked to demonstrate during evaluation — specific, data-driven scenarios that reflect the organisation's real challenges, such as "forecast a product with high promotional variability," "optimise safety stocks across a three-tier distribution network," or "model the impact of a supply constraint on customer service." Fourth, clear requirements for implementation approach, team composition, methodology, timeline and change management. Fifth, a total cost of ownership template that requires vendors to itemise licensing, implementation, integration, data engineering, training, change management and ongoing support costs over a defined horizon.

The scenarios are critical. They're what transform the evaluation from a paper exercise into a practical test of capability. Providing a representative dataset to shortlisted vendors — anonymised if necessary — and requiring them to demonstrate their system against defined scenarios reveals more in a two-hour session than a 200-page written response ever will.

Phase four: evaluation (4-6 weeks)

The evaluation phase should combine three streams of assessment.

Written response evaluation against the RFx criteria, focused on approach, methodology, team capability and total cost rather than feature compliance. This provides the baseline comparative view.

Scripted demonstrations where each shortlisted vendor works through the defined planning scenarios using the provided dataset. These sessions should be attended by planners, not just IT and procurement, because planners are the ones who will recognise whether the system handles their real-world challenges effectively. Evaluation should focus on the quality of outputs (does the forecast look sensible? Does the inventory recommendation align with service targets?), the usability of the planner interface (can a planner understand and act on what the system produces?), and the transparency of the system's logic (can planners see why the system is recommending what it's recommending, and override it when they have better information?).

Reference checks and site visits with comparable organisations that have implemented the same platform. These should go beyond the vendor's curated reference list — ask for references in your industry, your geography, your scale, and at a similar stage of planning maturity. The questions that matter most are: how long did implementation actually take versus plan? What was the real total cost versus budget? How quickly did planners adopt the system? What worked well and what would they do differently?

The evaluation criteria should weight practical demonstration performance, implementation credibility and total cost of ownership at least as heavily as written response quality. A common weighting structure might allocate 30% to demonstrated planning capability, 25% to implementation approach and team, 20% to total cost of ownership, 15% to architecture, integration and scalability, and 10% to vendor viability and support model.
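The weighting structure above is straightforward to operationalise. The sketch below applies it to two hypothetical vendors; the per-criterion scores are invented, and the point is simply that a strong scripted demonstration outweighs a polished written response under this weighting.

```python
# Minimal weighted-scoring sketch using the example weighting described above.
# Criterion scores (0-10) for each vendor are invented for illustration.

WEIGHTS = {
    "demonstrated_planning_capability": 0.30,
    "implementation_approach_and_team": 0.25,
    "total_cost_of_ownership": 0.20,
    "architecture_integration_scalability": 0.15,
    "vendor_viability_and_support": 0.10,
}

def weighted_score(scores: dict) -> float:
    """Weighted sum of 0-10 criterion scores; every criterion must be scored."""
    assert set(scores) == set(WEIGHTS), "score every criterion"
    return sum(scores[c] * w for c, w in WEIGHTS.items())

vendor_a = weighted_score({
    "demonstrated_planning_capability": 9,   # strong scripted demo on our data
    "implementation_approach_and_team": 6,
    "total_cost_of_ownership": 7,
    "architecture_integration_scalability": 8,
    "vendor_viability_and_support": 7,
})
vendor_b = weighted_score({
    "demonstrated_planning_capability": 6,   # polished written response only
    "implementation_approach_and_team": 9,
    "total_cost_of_ownership": 8,
    "architecture_integration_scalability": 7,
    "vendor_viability_and_support": 7,
})
# With these invented scores, vendor A edges ahead because demonstrated
# capability carries the heaviest weight.
```

The exact weights should be agreed by the evaluation panel before responses arrive, so they can't be adjusted to rationalise a preferred outcome.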

Phase five: proof of concept (4-8 weeks, optional but recommended)

For high-value APS investments, a proof of concept with the preferred vendor — or in some cases, two finalists — provides a final layer of validation before committing. The PoC should use a defined subset of the organisation's actual data and test a specific set of planning capabilities in realistic conditions.

This isn't a full implementation. It's a structured test that answers the question: does this system, with our data, produce materially better planning outputs than our current approach? If the answer is yes — demonstrably, measurably — the business case for investment is validated. If the answer is equivocal, that's critical information to have before signing a multi-year contract.

Phase six: commercial negotiation and contracting (3-4 weeks)

With a preferred vendor identified and validated through demonstration and potentially PoC, the commercial negotiation phase should address the full scope of the relationship — not just licensing terms.

Key negotiation areas include licensing structure and scalability (what happens as user numbers or data volumes grow?), implementation pricing and risk sharing (fixed fee for defined scope versus time-and-materials), performance commitments (are there contractual commitments to system performance, forecast accuracy improvement, or implementation timeline?), data ownership and portability (if you leave, can you take your configuration, models and data?), support model and SLAs (response times, availability commitments, upgrade frequency), and contract term and exit provisions.

This is procurement work that benefits from experience with technology contracts specifically — understanding the commercial levers that matter in SaaS agreements, the risks that need to be allocated, and the governance mechanisms that protect the organisation's interests over a multi-year relationship.

The Australian and New Zealand context

For organisations operating in Australia and New Zealand, several characteristics of the local environment make APS selection particularly consequential.

Long inbound supply chains with variable lead times — particularly for imported products coming through congested port infrastructure — mean that demand planning accuracy and inventory optimisation have a disproportionate impact on both working capital and service levels. The buffer you need across a 12-week ocean lane from Asia is fundamentally different from what a European business needs across a 3-day road lane from a neighbouring country.
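The lead-time point can be made concrete with the standard safety stock approximation, safety stock = z x sigma x sqrt(lead time). The sketch below compares the two lanes described above; the demand figures are assumptions, and a real calculation would also include review-period and lead-time-variability terms.

```python
# Back-of-envelope illustration of how the safety stock buffer scales with
# lead time, using SS = z * sigma_d * sqrt(lead_time_days).
# Demand variability is an invented assumption.
import math

def safety_stock(z: float, daily_demand_sd: float, lead_time_days: float) -> float:
    """Simplified safety stock: demand variability only, constant lead time."""
    return z * daily_demand_sd * math.sqrt(lead_time_days)

Z_95 = 1.65      # approximate service factor for a 95% cycle service level
SIGMA = 40.0     # assumed standard deviation of daily demand (units)

ocean_lane = safety_stock(Z_95, SIGMA, 84)  # ~12-week ocean lane from Asia
road_lane = safety_stock(Z_95, SIGMA, 3)    # 3-day domestic road lane
# The 12-week lane needs sqrt(84/3), roughly 5.3x, the buffer of the
# 3-day lane for the same demand profile and service target.
```

This is why lead-time-aware inventory optimisation should be one of the scripted demonstration scenarios for ANZ importers, not an afterthought.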

Seasonal and promotional demand patterns in Australian markets — particularly in FMCG, retail and agriculture — create concentrated demand peaks that expose the limitations of basic forecasting approaches. An APS that handles promotional uplift well can be worth millions in reduced waste and improved on-shelf availability.

The relatively small scale of the ANZ market means that some global APS vendors have limited local presence — fewer implementation consultants, smaller support teams, longer response times. The RFx process should explicitly evaluate local capability, not just global credentials. An outstanding platform with no credible ANZ implementation partner is a risk, not an asset.

Working capital pressure in the current interest rate environment makes inventory optimisation — one of the core APS capabilities — directly material to the balance sheet. Every dollar of unnecessary inventory carries a real financing cost that didn't matter as much when rates were near zero.
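The financing cost of excess stock is simple arithmetic, which is exactly why it belongs in the business case. Both figures below are assumptions for illustration only.

```python
# Quick, purely illustrative arithmetic on the financing cost of excess stock.
excess_inventory = 8_000_000   # assumed working capital tied up unnecessarily ($)
cost_of_funds = 0.06           # assumed annual financing rate
annual_financing_cost = excess_inventory * cost_of_funds
# At an assumed 6% cost of funds, $8m of excess inventory costs about
# $480k a year to hold before storage, obsolescence and insurance.
```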

These factors reinforce why the RFx process needs to be tailored to ANZ operating conditions rather than run as a generic global technology evaluation. The scenarios used in evaluation should reflect Australian supply chain realities: long lanes, seasonal peaks, promotional volatility, concentrated retail customers and distributed geographic networks.

For more on how APS capabilities map to ANZ supply chain challenges, Trace has published a detailed guide on Advanced Planning Systems and how they transform supply chain planning in Australia and New Zealand.

Process readiness matters as much as system selection

One of the most important — and most frequently overlooked — aspects of APS selection is the question of process readiness. The best demand planning system in the world won't deliver value if the organisation doesn't have a functioning S&OP process, if planners don't have time to work with the system because they're buried in manual data manipulation, or if there's no governance rhythm for reviewing and acting on system outputs.

The RFx process should include an honest assessment of organisational readiness: do we have the planning processes, the roles, the data discipline and the executive sponsorship to implement and sustain an APS effectively? If the answer is "not yet," that doesn't mean abandoning the investment — but it does mean sequencing process improvement and capability building alongside (or ahead of) the technology implementation, and reflecting that reality in the implementation plan and timeline.

The organisations that get the most from their APS investment typically spend as much effort on project and change management — redesigning planning processes, upskilling planners, building governance mechanisms, securing stakeholder buy-in — as they do on configuring and deploying the technology. The RFx process should evaluate vendors and implementation partners on their ability to support this change dimension, not just their technical capability.

How Trace Consultants can help

At Trace Consultants, we help Australian and New Zealand organisations navigate APS selection and implementation from a position of genuine planning expertise — not just technology procurement capability.

We work across the full selection lifecycle:

Planning capability assessment. We assess your current demand planning, supply planning, inventory management and S&OP processes against best practice, quantify the improvement opportunity, and define the specific requirements that should drive your APS selection. This is planning and operations consulting grounded in practical supply chain experience.

Market scan and shortlisting. We help you navigate the vendor landscape with an informed, independent view — identifying which platforms are genuinely strong for your industry, your scale and your planning maturity, and which are better suited to different contexts.

RFx design and management. We design and manage the RFx process end to end — from document development through evaluation, demonstration management, reference checking and recommendation. Our approach emphasises scenario-based evaluation over feature checklists, ensuring the selection is based on demonstrated capability rather than written claims.

Commercial negotiation. We support procurement negotiations with APS vendors and implementation partners, bringing benchmarks and experience with technology contract structures to ensure the commercial terms protect your interests and reflect realistic expectations.

Implementation readiness. We help you prepare for implementation — designing the target planning processes, defining the organisational structure and roles, building the data readiness plan, and establishing the change management approach that will determine whether planners adopt the system or revert to spreadsheets.

Ongoing optimisation. Post-implementation, we support organisations in tuning and optimising their APS — recalibrating forecasting models, refining inventory parameters, improving S&OP process integration, and building internal capability to sustain and extend the system's value over time.

Our independence from any APS vendor means our recommendation is based entirely on which platform best fits your requirements, your data, your organisation and your budget — not on partnership arrangements or reseller margins. We've worked with organisations implementing platforms across the spectrum, from the major Leaders through to focused niche players, and our advice reflects that breadth of experience.

Getting this decision right

An APS investment is typically a multi-year, multi-million dollar commitment. The system you select will shape how your organisation plans demand, manages inventory, allocates supply and makes trade-off decisions for years to come. The planners who use it daily will either be empowered to make better decisions faster — or frustrated by a tool that doesn't fit their reality.

The RFx process is where you create the conditions for one of those outcomes or the other. A process that's rigorous about the right things — planning capability, practical demonstration, implementation credibility, total cost, change readiness — will consistently lead to better decisions than one that's rigorous about the wrong things: feature checklists, compliance documentation and licensing price.

If your organisation is considering an APS investment and you want to make sure the selection process is designed to deliver a system your planners will actually use, we'd welcome the conversation.
