When Excel Fails
How we tamed a data monster with Big Data tools
Hardly any other buzzword has made itself as comfortable in recent years as Big Data. It not only shapes our everyday lives but has also found its way into political discourse, Hollywood films, and literature. Perhaps because Big Data prefers to play the villain in pop culture, the term increasingly carries threatening connotations. Strictly speaking, Big Data describes nothing more than an amount of data too large or too complex to be evaluated with conventional methods of data processing. Yet the buzzword has also come to encompass the technologies used to evaluate such data, as well as their impact on digital society. Either way, Big Data is here to stay.
At TALLENCE, we encounter the practice behind the word especially where data has to be evaluated flexibly and in real time. Because we are not bound to the table structures and indexes of classic databases, data sources with different formats and structures can be linked with little effort.
We put Big Data technology to work in one major project in particular. Our customer carried out a nationwide network switchover that generated data on a large scale: several tens of millions of lines from different systems, each with its own file format, had previously been read manually into Excel and combined into progress reports.
In the past, the data was first processed by a team of about three people and pre-aggregated in an Oracle database. The result was exported, imported into Excel, and processed further there. Excel, however, proved far from ideal: substantial programming was required whenever file formats changed or new evaluations were requested. The complexity of the data also meant that processes were often counted twice or not at all, making statements about the project status unreliable. High, cost-intensive effort thus stood against only partially satisfactory results – with such a large amount of data, classic Excel had simply reached its limits.
We came across this problem in the course of another project and proactively offered our customer a solution: an approach based on Big Data technologies was to put an end to the resource-consuming data chaos. Building on the Splunk platform, Tallence AG developed reader, processor, and writer components that read this special data, process it, and convert it into an understandable form.
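The reader/processor/writer split can be illustrated with a minimal sketch. The actual components are built on Splunk and handle far more formats; the file format, field names, and dedup key below are invented for illustration:

```python
import csv
import io

# Hypothetical reader: parses one source's file format into plain dicts.
# The semicolon-separated format and field names are illustrative only.
def read_semicolon_csv(text):
    reader = csv.DictReader(io.StringIO(text), delimiter=";")
    return list(reader)

# Processor: normalises records from different sources into one schema and
# deduplicates by a unique key, so no switchover line is counted twice.
def process(records, key="line_id"):
    seen = set()
    normalised = []
    for rec in records:
        rec_id = rec[key].strip()
        if rec_id in seen:
            continue  # skip duplicates that previously distorted the reports
        seen.add(rec_id)
        normalised.append({"id": rec_id, "status": rec["status"].lower()})
    return normalised

# Writer: emits an aggregate suitable for a progress report.
def write_summary(records):
    done = sum(1 for r in records if r["status"] == "done")
    return {"total": len(records), "done": done}

raw = "line_id;status\nA1;DONE\nA2;open\nA1;DONE\n"
summary = write_summary(process(read_semicolon_csv(raw)))
# summary == {"total": 2, "done": 1} -- the duplicate A1 row is counted once
```

Chaining small, single-purpose stages like this is what makes new file formats cheap to support: only the reader for that source changes, while processing and reporting stay untouched.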
The data is read continuously into the software we developed, so our customer can generate regular reports at any time. Among other things, the system makes it easy to produce management presentations with clear visualisations. The software thus automatically generates various management reports on a daily basis. These include:
- Detailed statements on the progress of the project
- Alarms in case of deviations from the planned target
- Warnings about technical and organisational problems, which the system recognises automatically by analysing the data
- A project forecast based on historical data
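Two of the report types above, the deviation alarm and the forecast, can be sketched in a few lines. The tolerance threshold, the naive linear extrapolation, and all numbers are assumptions for illustration, not the project's actual logic:

```python
# Alarm: flag progress that deviates from the planned target beyond a tolerance.
def check_deviation(actual, target, tolerance=0.05):
    """True if |actual - target| exceeds the tolerance fraction of target."""
    return abs(actual - target) > tolerance * target

# Forecast: estimate remaining days from the average daily progress rate.
# history holds the completed fraction per day, oldest first.
def forecast_completion(history):
    daily_rate = (history[-1] - history[0]) / (len(history) - 1)
    remaining = 1.0 - history[-1]
    return remaining / daily_rate

alarm = check_deviation(actual=0.42, target=0.50)          # True: behind plan
days_left = forecast_completion([0.30, 0.34, 0.38, 0.42])  # ~14.5 days at 4%/day
```

In practice the forecast would weigh recent progress more heavily and account for regional differences, but the principle is the same: derive the signal from the continuously ingested data instead of computing it by hand.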
In addition, the system is fully flexible and can generate reports on demand, for example limited to a certain region. Partial reports are also possible, for instance filtering for the progress of individual work steps or for certain technical characteristics of the switchover. If the requirements change, no major adjustment is needed – a new query to the system is sufficient. The reports are automatically supplemented with graphics that present the results clearly.
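An on-demand partial report boils down to filtering the normalised records before aggregating them, rather than re-programming the evaluation. A minimal sketch, with invented record fields (`region`, `step`, `done`):

```python
# Invented sample records standing in for the normalised switchover data.
records = [
    {"region": "north", "step": "install", "done": True},
    {"region": "north", "step": "test", "done": False},
    {"region": "south", "step": "install", "done": True},
]

def report(records, **filters):
    """Progress for the subset of records matching all given field filters."""
    subset = [r for r in records if all(r[k] == v for k, v in filters.items())]
    done = sum(r["done"] for r in subset)
    return {"matched": len(subset), "done": done}

north = report(records, region="north")      # {"matched": 2, "done": 1}
installs = report(records, step="install")   # {"matched": 2, "done": 2}
```

Because a new evaluation is just a new combination of filters, changing requirements cost a query rather than a development effort.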
Thanks to Big Data, our customer’s reporting has thus become not only more efficient and resource-saving at every step of the workflow, but also much clearer and more flexible.