Plan vs Actual: Retail (POS)

Outline: The Forecast vs Actual: Retail explore is intended to compare product-level weekly forecasts to actual retail sales (sell-out) at a granular level, likely the RMA/market or retailer level. It should be used alongside the Daasity Forecasting tool to track forecast accuracy and update future forecasts. Key aspects include forecast metrics, actual retail performance, variances, and relevant dimensions such as product and market.

What I Can Outline with Known Context:

  • It’s new (tagged 💭 new) and likely ties into a forecasting solution, whether an internal Daasity forecasting module or an external one.

  • Data likely included:

    • Weekly Forecasted Sales (units or dollars) for retail channels.

    • Weekly Actual Sales (from Unified Retail Sales or Retail Competitive data).

    • Variance calculations (difference or % error).

    • Dimensions: Product, perhaps Market (total vs. by retailer or channel), and possibly major category or brand rollups if the data is at the product level.

  • Use Cases:

    • Track how accurate your retail sales forecasts were per product or category.

    • Identify where actuals fell short of or exceeded the forecast, and by how much.

    • Feed those insights back into forecast adjustments (the description’s phrase “track and update your forecast from the bottom up” suggests this explore is used to inform forecast adjustments at a detailed level).

  • Likely measures:

    • Forecast Units, Actual Units, Variance Units, Variance % (see the calculation sketch after this list).

    • Possibly cumulative or YTD variance.

  • Team likely to use: Demand planning / Sales ops.
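
To make the likely measures concrete, here is a minimal sketch of how they could be derived, assuming a weekly table that joins forecast and actual units at the product/market grain. All column names (forecast_units, actual_units, etc.) are illustrative assumptions rather than confirmed fields from the explore, and variance % is computed relative to forecast here, which also needs confirmation.

```python
import pandas as pd

# Hypothetical weekly product x market table; the real field names in the
# explore are unknown, so everything here is an illustrative assumption.
df = pd.DataFrame({
    "week":           ["2024-W33", "2024-W34", "2024-W35", "2024-W36"],
    "product_sku":    ["X", "X", "X", "X"],
    "market":         ["Region Y"] * 4,
    "forecast_units": [90, 110, 100, 120],
    "actual_units":   [95, 100, 80, None],  # None = future week, no actuals yet
})

# Core measures: unit variance and variance % relative to forecast.
df["variance_units"] = df["actual_units"] - df["forecast_units"]
df["variance_pct"] = df["variance_units"] / df["forecast_units"] * 100

# Cumulative (YTD-style) variance per product/market; weeks with no actuals
# remain NaN and do not affect the running total.
df["cum_variance_units"] = (
    df.groupby(["product_sku", "market"])["variance_units"].cumsum()
)

print(df[["week", "forecast_units", "actual_units",
          "variance_units", "variance_pct", "cum_variance_units"]])
```

For week 35 in this toy data, variance_units comes out to -20 and variance_pct to -20%, which matches the example scenario raised in the questions below.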

What’s Unknown / Needs Input:

  1. Does this focus on retail sell-out only, or does it also include wholesale shipments? (The name says Retail, so likely sell-out.)

  2. Do forecasts come from an external system or a Daasity input (perhaps a Brand Supplied Data sheet for forecasts, or an integration with a planning tool)? The source of the forecast values needs clarification.

  3. Time horizon: how far out do forecasts extend, and how often are they updated? (The explore likely focuses on historical weeks.)

  4. Are there product hierarchies? (Forecasts are often done at the SKU or category level; the description says product level at the RMA level.)

  5. Should we mention anything about forecast adjustments or integration with workflows?

Questions for Daasity Team (Forecast vs Actual: Retail):

  • Forecast Data Source: Where do the forecast figures come from (Daasity’s forecast tool or imported from an external plan)? Understanding this helps explain how users might input/adjust forecasts.

  • Level of Detail: Are forecasts and actuals at the SKU level by retail market, or aggregated by category? The description says product level and RMA level, at a weekly grain.

  • Variance Calculation: Does the explore already calculate variance and error %, or does it just show the two series (forecast vs actual) for the user to compare?

  • Usage Guidance: After identifying variances in this explore, how do users update the forecast? (Perhaps outside the explore, but the docs could add context such as “then adjust your forecast in the forecasting tool accordingly.”)

  • Time Frame: Does it cover only past weeks, or also future weeks? Future weeks would show the forecast with no actuals yet (i.e., forecast only).

  • Integration with Plans: If there’s a “Forecast tool” in Daasity, should we mention it explicitly? (E.g., “This explore ties into Daasity’s Forecasting module – see that documentation.”)

  • Example Scenario: To ensure proper usage, confirm that a scenario like “for product X in Region Y, the forecast for week 35 was 100 units but actual sales were 80, a variance of -20%” is the kind of insight this explore provides (worked through after this list).

  • Data Refresh: Are actuals updated weekly while the forecast stays static? Or can the forecast change, with original vs. current forecast both captured? (This might be too deep; likely a static baseline forecast vs. actuals.)
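
As a quick sanity check of that example scenario (assuming, as in the sketch above, that variance % is measured relative to forecast):

$$\text{Variance \%} = \frac{\text{Actual} - \text{Forecast}}{\text{Forecast}} \times 100 = \frac{80 - 100}{100} \times 100 = -20\%$$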
