Mismatches in customer billing settings at a national utility company

The company has a few million customers with active contracts. They found that a small fraction of them had mismatches in their billing settings that prevented correct billing in the ERP system. In the worst cases, they had to write off all debts related to these errors because the affected customers were never billed.

Using SQL rules, they were able to find about 20K customer accounts with all sorts of mismatches. But they could not identify the actual state of each of these accounts, because there was no easy way to determine it from the data they had. Their data analysts would have had to investigate each of the identified accounts manually, which would have taken years of their time.
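
For illustration only, here is a minimal sketch of the kind of SQL consistency rule that can flag such mismatches; the table, columns and rule below are hypothetical and not the company's actual schema.

```python
import sqlite3

# Self-contained sketch: build a tiny in-memory table of billing settings and
# run one example consistency rule against it. All names here are hypothetical.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE billing_settings (
    account_id     TEXT PRIMARY KEY,
    contract_state TEXT,   -- e.g. 'active', 'terminated'
    billing_plan   TEXT,   -- NULL means no plan assigned
    meter_linked   INTEGER -- 1 if a meter is attached to the account
);
INSERT INTO billing_settings VALUES
    ('A001', 'active',     'standard', 1),  -- consistent
    ('A002', 'active',     NULL,       1),  -- active contract but no billing plan
    ('A003', 'terminated', 'standard', 0);  -- plan left behind after termination
""")

# Example rule: an active contract must have a billing plan and a linked meter.
mismatches = conn.execute("""
    SELECT account_id
    FROM billing_settings
    WHERE contract_state = 'active'
      AND (billing_plan IS NULL OR meter_linked = 0)
""").fetchall()

print([row[0] for row in mismatches])  # -> ['A002']
```

Rules like this one find accounts that break a single constraint, but they do not say which of the conflicting fields is wrong, which is why each flagged account still needed manual investigation.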

We spent about a week on an end-to-end, one-off analysis of their data, most of which went into a manual one-time extraction of the data from the ERP system. The algorithm's total runtime was just over an hour on a decent corporate laptop.

Our algorithm not only determined the status of every customer but also recommended what needed to be changed in each account record to make them all coherent. It also found 14K more accounts with problems that had gone unnoticed.

This one-off analysis convinced the company, and we set up automatic daily monitoring of all problems in account billing settings to track data quality KPIs such as issue lifetime, daily inflows and outflows, and the total number of open issues.
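
A minimal sketch of how such daily KPIs can be derived from the monitoring output, assuming the detection job emits one set of open issue IDs per day; the IDs and dates below are invented.

```python
from datetime import date

# Two hypothetical daily snapshots of open issue IDs produced by the detector.
snapshots = {
    date(2024, 1, 1): {"A002", "A003", "A007"},
    date(2024, 1, 2): {"A002", "A007", "A011"},  # A003 resolved, A011 appeared
}

days = sorted(snapshots)
first_seen = {issue: days[0] for issue in snapshots[days[0]]}

for prev_day, today in zip(days, days[1:]):
    previous, current = snapshots[prev_day], snapshots[today]
    inflow = current - previous        # issues that appeared today
    outflow = previous - current       # issues resolved since yesterday
    for issue in inflow:
        first_seen.setdefault(issue, today)
    lifetime = {i: (today - first_seen[i]).days for i in current}
    print(today, "open:", len(current), "in:", len(inflow),
          "out:", len(outflow), "lifetime (days):", lifetime)
```
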

Pricing leakage at a multi-billion-dollar global FMCG company

The company sells globally and has a very diverse pricing strategy that changes from country to country and region to region. Historically, they delegated broad powers to local markets to define their own pricing processes and to intervene manually in the pre-configured pricing process.

At one point, the global function decided to take more control over pricing back from the local markets. Because this was an immense internal political challenge, they chose to work with each market separately. To approach each market, they wanted evidence of inefficiencies in its local pricing practices.

The central team's problem was that the task ahead of them was enormous. They also did not want to involve local markets in their investigation, for obvious reasons.

At first this did not look like a data quality problem, and the company contacted us for another reason. But when they shared all their data problems in a workshop session, we realised it could be a good fit for our algorithm. The algorithm could not find inefficiencies in their existing pricing processes, but it could find all sorts of off-process transactions caused by manual interventions, failures or misconfigurations. They called the losses related to these issues pricing leakage.

It was a perfect test of the value-recommending feature of our algorithm, because the differences it highlighted made it possible to estimate the bottom-line impact of all the issues found. It was also a good example of a client with no process documentation, which is not a problem for the algorithm.
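
A minimal sketch of how highlighted price differences can be turned into a leakage estimate, assuming each flagged transaction carries an expected and an invoiced unit price; all figures below are invented for illustration.

```python
# Estimated leakage = sum over flagged transactions of
# (expected unit price - invoiced unit price) * quantity, counting only shortfalls.
flagged = [
    # (transaction_id, expected_unit_price, invoiced_unit_price, quantity)
    ("T-1001", 4.20, 3.90, 12_000),   # manual discount outside the process
    ("T-1002", 4.20, 4.20,  8_000),   # no gap, contributes nothing
    ("T-1003", 5.10, 4.50,  3_500),   # misconfigured regional price list
]

leakage = sum(
    max(expected - invoiced, 0.0) * qty
    for _, expected, invoiced, qty in flagged
)
print(f"Estimated pricing leakage: {leakage:,.2f}")  # -> 5,700.00
```
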

The analysis found that only a tiny fraction of the data had pricing issues. That was still a great success: we were looking for problems in the revenue of a multi-billion-dollar company, and our findings were still worth millions.

Unfortunately for us, in the middle of the project the company reorganised the team we were working with. As a result, we could not finish the work that had started so positively.