Our client is a privately held supermarket chain that operates 176 stores across 24 states in the Southeast, Midwest, Mid-Atlantic, and Northeast, with plans for continued expansion throughout the country.
Our client had implemented Snowflake on Azure and wanted to automate the demand forecasting process for all its stores by using machine learning tools and methods.
Forecasting had previously been a manual process conducted by the financial planning and analysis team in Excel, but an inconsistent schedule for updating the forecast led to inaccurate estimates and untrustworthy data. Without reliable forecasts, sales managers were forced to rely solely on historical sales data and outside research.
We helped identify and deploy the platforms, tools, methods, and algorithms needed to manage gigabytes of product data and support the client’s forecasting automation goal.
After carefully reviewing the client’s needs and challenges, we concluded that Amazon Forecast was the optimal solution. Ultimately, it was the only tool we evaluated that could fully support the client’s automation goal.
The client needed a user-friendly data science tool that aligned closely with its business needs. The models available through Amazon Forecast delivered business value quickly, through an interface the company’s analytics team could use without hiring specialized data scientists.
To orchestrate the automation process for 176 stores and more than 8,000 products, we used Amazon S3, Jupyter notebooks, Snowflake, and Amazon SageMaker. Historical data was pulled from Snowflake into S3 for storage, and Amazon Forecast was triggered to write its forecast output back to S3. From S3, the output was loaded back into Snowflake.
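As an illustrative sketch of the first half of that pipeline (account names, bucket paths, ARNs, and credentials below are hypothetical placeholders, not the client’s actual resources), historical sales data can be unloaded from Snowflake to S3 and then registered with Amazon Forecast using the snowflake-connector-python and boto3 libraries:

```python
import boto3
import snowflake.connector

# Unload historical sales data from Snowflake directly to an S3 location.
# Connection details and bucket names are placeholders.
conn = snowflake.connector.connect(
    account="myaccount", user="svc_forecast", password="...",
    warehouse="ANALYTICS_WH", database="SALES", schema="PUBLIC",
)
conn.cursor().execute("""
    COPY INTO 's3://demand-forecast-input/sales/'
    FROM (SELECT item_id, store_id, sale_date, units_sold FROM daily_sales)
    CREDENTIALS = (AWS_KEY_ID='...' AWS_SECRET_KEY='...')
    FILE_FORMAT = (TYPE = CSV)
    OVERWRITE = TRUE
""")

# Point Amazon Forecast at the exported files by starting a dataset import job.
forecast = boto3.client("forecast", region_name="us-east-1")
forecast.create_dataset_import_job(
    DatasetImportJobName="daily_sales_import",
    DatasetArn="arn:aws:forecast:us-east-1:123456789012:dataset/retail_sales",
    DataSource={
        "S3Config": {
            "Path": "s3://demand-forecast-input/sales/",
            "RoleArn": "arn:aws:iam::123456789012:role/ForecastS3Access",
        }
    },
    TimestampFormat="yyyy-MM-dd",
)
```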
All of this was scripted in Python, so the pipeline can be rerun as new data is loaded into Snowflake. Jupyter notebooks were used to manage the code, and SageMaker supported testing and validation.
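The return trip follows the same pattern. A minimal sketch of that step, again with hypothetical ARNs, buckets, and credentials, might export the generated forecast to S3 and load it into a Snowflake table:

```python
import boto3
import snowflake.connector

# Export the generated forecast from Amazon Forecast back to S3.
# The forecast ARN and bucket path are placeholders.
forecast = boto3.client("forecast", region_name="us-east-1")
forecast.create_forecast_export_job(
    ForecastExportJobName="demand_forecast_export",
    ForecastArn="arn:aws:forecast:us-east-1:123456789012:forecast/retail_demand",
    Destination={
        "S3Config": {
            "Path": "s3://demand-forecast-output/predictions/",
            "RoleArn": "arn:aws:iam::123456789012:role/ForecastS3Access",
        }
    },
)

# Once the export job completes, copy the prediction files into Snowflake
# so sales managers can query them alongside the rest of the sales data.
conn = snowflake.connector.connect(
    account="myaccount", user="svc_forecast", password="...",
    warehouse="ANALYTICS_WH", database="SALES", schema="PUBLIC",
)
conn.cursor().execute("""
    COPY INTO demand_forecast
    FROM 's3://demand-forecast-output/predictions/'
    CREDENTIALS = (AWS_KEY_ID='...' AWS_SECRET_KEY='...')
    FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
""")
```

Because both halves are plain Python, the same scripts can be scheduled or rerun whenever fresh sales data lands in Snowflake.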