Data analytics projects: From theory to practice

In my last blogpost, I described the initial important steps to ensure a successful outcome of a data analytics project:

First, close cooperation between the ideas provider (department) and the data scientists is an absolute must to achieve the defined project goal. Second, it’s necessary to verify the quality and quantity of the data before the data scientists get started.

In this post, I’d like to share some recommendations: How do data analytics projects work in practice? How can the Bosch prediction model be applied in use cases?

1. Was the goal of the data analytics project achieved?


A successful data analytics project can help you reduce costs and make your production more efficient.

Sometimes it is only at the end of a project that you realize the project has fallen short of, or completely missed, its goal. As you work towards attaining your defined project objectives, there are a number of points to consider. Here are some typical mistakes:

a. Inaccurate prediction model

At the end of the project, you may realize that the result (e.g. a prediction model) fails to meet the required accuracy or to deliver the hoped-for new insights.

Why might that be?

The first question you need to ask is whether the required accuracy of the model was defined at the start of the project. This is, of course, a basic prerequisite and should be considered right from the project planning phase. The aspects relating to data quality and quantity mentioned earlier can also lead to an inaccurate result because the data “doesn’t yield more information.”
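Agreeing on a concrete accuracy target up front makes the final judgment objective rather than a matter of opinion. As a minimal sketch (the threshold of 0.95 and all names are assumptions for illustration, not values from an actual project), the check at the end of the project can be as simple as:

```python
# Assumed target accuracy, agreed with the department in the planning phase.
REQUIRED_ACCURACY = 0.95

def accuracy(y_true, y_pred):
    """Fraction of predictions that match the ground truth."""
    correct = sum(t == p for t, p in zip(y_true, y_pred))
    return correct / len(y_true)

def meets_target(y_true, y_pred, required=REQUIRED_ACCURACY):
    """Objective go/no-go check against the pre-agreed target."""
    return accuracy(y_true, y_pred) >= required

# Illustrative evaluation data: 9 of 10 predictions are correct,
# so the model reaches 0.9 and misses the assumed 0.95 target.
y_true = [1, 0, 1, 1, 0, 1, 0, 0, 1, 1]
y_pred = [1, 0, 1, 1, 0, 1, 0, 0, 1, 0]
```

Without a number like `REQUIRED_ACCURACY` fixed in the planning phase, there is no defensible way to say whether the delivered model is "accurate enough."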

Thus, it becomes clear that the project implementation is not to blame for the less-than-satisfactory project outcome, but that pitfalls are present in the planning phase which significantly influence the result.

b. Pursuing an unsustainable use case for too long

Often, at the start of a project, all the stakeholders are euphoric. The technical and commercial objectives sound promising. “The project has to be a success!”

However, this sentence conceals a risk. Despite all the euphoria, it’s important to maintain a certain neutrality and skepticism regarding the (intermediate) results. Doggedly pursuing an unsustainable use case can mean you end up investing a lot of time and money in a project without achieving the hoped-for result.

It is therefore crucial to analyze intermediate results with a critical and open mind as regards the feasibility of achieving the project goal.

We strongly recommend you heed warning signs and do not pursue a project goal that is realistically unattainable simply because you “have to” get there!

Let me mention here the “fail fast” (or “change it”) mantra: it is better to recognize quickly that a goal is unattainable, and to adapt it, than to keep pursuing it and invest unnecessary resources without adding value or achieving a useful result.

We provide support in several phases; after each phase, we analyze the results achieved up to that point. This makes it possible to adapt the project goals or underlying data at various points in the process, making risks transparent and avoidable.
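The phased review described above can be sketched as a simple gate after each phase. This is a hypothetical illustration only: the phase names, the target of 0.95, and the 0.2 "gap too large" threshold are assumptions, not values from our actual process.

```python
# Assumed CRISP-DM-style phases and planning-phase target accuracy.
PHASES = ["data understanding", "data preparation", "modeling", "evaluation"]
TARGET_ACCURACY = 0.95

def review(phase, interim_accuracy, target=TARGET_ACCURACY):
    """Go/no-go decision after a phase, following the fail-fast idea."""
    if interim_accuracy >= target:
        return "go"                    # on track, continue as planned
    if target - interim_accuracy > 0.2:  # assumed threshold: gap too large
        return "stop"                  # fail fast, stop investing resources
    return "adapt"                     # re-scope the goal or revisit the data

# Example: the modeling phase delivers only 0.70, a gap of 0.25,
# so the review recommends stopping rather than pressing on.
decision = review("modeling", 0.70)
```

The point of the sketch is not the specific thresholds but that the decision rule exists before the phase is reviewed, so euphoria cannot override the intermediate results.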


2. Theory and practice – from Proof of Concept (PoC) to operational use case

Not everything that works under “laboratory conditions” turns out to be effective in practice. Unfortunately, this is the conclusion we sometimes have to draw at the end of a project, but we can only reach it by putting the model to the test.

The developed prediction model performed as required on the historical training data. Now it’s time to integrate it into the operational environment.

Disillusionment can set in even at this early stage. The prediction model is designed to run on a control system and make predictions in real time (in the millisecond range). However, this requirement wasn’t known when the model was developed. The algorithms are complex, as they have to meet high standards regarding accuracy, but the limited resources of the target hardware mean they cannot run there.
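A latency requirement like this can be made testable long before deployment by measuring the model's worst-case inference time against the budget. In this sketch, `predict` is a trivial placeholder for the real model and the 5 ms budget is an assumed value; on the resource-constrained control hardware, the same measurement would yield very different numbers than on a development machine.

```python
import time

LATENCY_BUDGET_MS = 5.0  # assumed real-time budget of the control system

def predict(features):
    # Placeholder for the deployed prediction model; the real model
    # on constrained hardware may be orders of magnitude slower.
    return sum(features) > 1.0

def worst_case_latency_ms(batch, runs=100):
    """Measure the slowest single prediction over repeated runs."""
    worst = 0.0
    for _ in range(runs):
        for features in batch:
            start = time.perf_counter()
            predict(features)
            elapsed_ms = (time.perf_counter() - start) * 1000.0
            worst = max(worst, elapsed_ms)
    return worst

batch = [[0.2, 0.9], [1.1, 0.3]]
within_budget = worst_case_latency_ms(batch) <= LATENCY_BUDGET_MS
```

Had such a check been part of the requirements from the start, the conflict between model complexity and target hardware would have surfaced in the planning phase rather than at integration time.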

So what initially appeared to be a great project result ultimately cannot be integrated into the real use case. The reason is once again shortcomings in the planning phase.

We conduct data analysis projects in line with the CRISP-DM standard – with one crucial addition: We place a particular emphasis on achieving an expert-level understanding of the customer’s problem. (See also question 3 of the blog post: How to start a data analytics project in manufacturing.)

To do this, during the Initial Insights phase, our data analytics engineers learn more about the customer’s production processes and the specific problem to be solved. They also ask a lot of questions to enable them to develop an in-depth understanding. This phase is extremely important for the project’s success because it lays the groundwork for establishing the correlations between the real processes, problems, and the data. You can’t find solutions in the digital world if you don’t understand the process and problem in the real world.



Best practices for the success of data analytics projects

There are many reasons why data analytics projects can fail. Often there isn’t one reason alone; instead, the problem lies in the sum of individual aspects.

Devote the necessary time and attention to the preparation and planning of your data analytics project. If you follow the main rules and best practices, you will be on the road to success with your project.


About the author

Denis Court

Denis Court has been working as a Manufacturing Analytics Engineer and Manufacturing Scout in the Product Group Industry & Logistics at Bosch Software Innovations since May 2016. Since joining Bosch in 2006, he has held a number of different positions at Bosch Rexroth and Bosch Software Innovations, including process development for automotive customers in close cooperation with everyone from the shop floor to management, as well as product and project management in IT and data analytics projects.