
Assessing market demand using Neural Networks

Posted by Venugopala Rao Manneni on Jan 29, 2018 in Case Studies


The objective was to assess and forecast demand for a retail outlet using neural networks.


For the analysis, we used a Multi-Layer Perceptron (MLP), a class of neural network well suited to classifying and predicting patterns. The inherently non-linear structure of neural networks is particularly useful for capturing the complex underlying relationships found in many real-world problems. Neural networks are also among the more versatile forecasting methods: not only can they find non-linear structure in a problem, they can model linear processes as well.

Before a neural network can be used for forecasting, it must be trained. Training refers to the estimation of the network's connection weights. Although the goal is similar to that of linear regression, where we minimize the Sum of Squared Errors (SSE), training an ANN is more difficult and complicated because of the non-linear optimization involved. Various training algorithms have been developed in the literature; the most influential is the backpropagation algorithm of Werbos (1974) and Rumelhart et al. (1986). The basic idea of backpropagation is to use a gradient-descent approach to adjust the weights so that an overall error function, such as the SSE, is minimized.
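The idea can be sketched in a few lines: compute the SSE, derive its gradient with respect to each weight matrix via the chain rule, and step the weights downhill. This is a minimal illustration only; the data, layer sizes, and learning rate below are made-up assumptions, not the values used in the case study.

```python
import numpy as np

# Minimal sketch of backpropagation training: gradient descent on the
# Sum of Squared Errors (SSE) for a tiny one-hidden-layer network.

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))                   # 100 samples, 3 inputs
y = (X @ np.array([0.5, -0.2, 0.1])) ** 2       # a non-linear toy target

W1 = rng.normal(scale=0.1, size=(3, 2))         # input -> hidden weights
W2 = rng.normal(scale=0.1, size=(2, 1))         # hidden -> output weights
lr = 0.01
sse_history = []

for epoch in range(500):
    h = np.tanh(X @ W1)                         # hidden-unit activations
    pred = h @ W2                               # linear output unit
    err = pred - y[:, None]
    sse_history.append(float((err ** 2).sum()))
    # Gradients of the SSE with respect to each weight matrix (chain rule)
    grad_W2 = h.T @ (2 * err)
    grad_h = (2 * err) @ W2.T
    grad_W1 = X.T @ (grad_h * (1 - h ** 2))     # tanh'(a) = 1 - tanh(a)^2
    W2 -= lr * grad_W2 / len(X)
    W1 -= lr * grad_W1 / len(X)
```

Each pass propagates the error backwards through the output and hidden layers, which is where the algorithm gets its name.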

Before training the FFNN (Feed-Forward Neural Network) model, the independent variables are rescaled using the adjusted normalized method. The data is first partitioned into training, testing, and holdout samples in order to evaluate the model's accuracy. We then applied an MLP (FFNN) architecture with one input, one hidden, and one output layer, with the following specifications:

The input layer contains units such as lag-1 volume (yesterday's sales) and lag-7 volume (last week's sales), plus further units that capture seasonality as well as any promotional activity, etc. The hidden layer contains two units, and the output layer contains a single unit: volume.

The network information is summarized below:

The coefficient estimates that link the units in one layer to the units in the following layer are called weights; they can be used to predict future volumes through the model shown below:
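In a network of this shape, a forecast is produced by applying the estimated weights layer by layer: the output is f(x) = w2 · g(W1 x + b1) + b2, with g = tanh on the two hidden units. A minimal sketch follows; the weight values are made-up placeholders, not the fitted estimates from the case study.

```python
import numpy as np

# Placeholder weights for a 3-input, 2-hidden-unit, 1-output MLP
# (illustrative values only, not the fitted estimates).
W1 = np.array([[0.4, -0.3], [0.2, 0.5], [-0.1, 0.3]])  # input(3) -> hidden(2)
b1 = np.array([0.1, -0.2])
w2 = np.array([0.8, -0.6])                              # hidden(2) -> output(1)
b2 = 0.05

def predict_volume(x):
    hidden = np.tanh(x @ W1 + b1)      # two hidden-unit activations
    return float(hidden @ w2 + b2)     # single output unit: volume

# A new (already rescaled) input: lag-1 volume, lag-7 volume, promo flag.
x_new = np.array([0.2, -0.5, 0.7])
forecast = predict_volume(x_new)
```

The same forward pass, with the trained weights in place of the placeholders, is what turns yesterday's and last week's sales into tomorrow's forecast.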


The fitted FFNN model achieved around 95% prediction accuracy on both the training and test data sets, and can be used to predict future sales. This enabled the client to gauge demand.

Venugopala Rao Manneni

I hold a doctorate in statistics from Osmania University and have been working in the fields of data analysis and research for the last 14 years. My expertise is in data mining and machine learning, fields in which I have also published papers. I love to play cricket and badminton.
