The client is an enterprise-level telco company that analyzes huge data sets and provides direct Insights-as-a-Service consulting. Seeing the immense value of their work (and the natural limitations of a consulting business), they built a self-service Data Analytics SaaS to reach a wider audience.
Our task was to improve the existing SaaS and help the client capture a greater share of the market. The tool has the potential to become a key product in the client’s portfolio and to give a significant competitive edge over more traditionally focused companies.
The tested business model was a success, but…
They started with an internally developed MVP. The first version proved quite useful, and customers were willing to pay top dollar for it. Yet, as with most MVPs that try to scale up into a full-fledged product, the code was messy. That version couldn’t handle data at the standard the business required.
Features that work perfectly on a small scale can become obstacles on a larger one.
Moreover, the core of their strategy was to purchase large data sets and provide a whole range of insights, but the software couldn’t process them fast enough. The upload was clumsy. Scaling with that version would cost a fortune (and it still wouldn’t be efficient enough).
What they needed to achieve
- Speed up data upload and processing: A data business grows at the speed of its upload and processing pipelines. Receiving an insight earlier (or later) can make the difference between profitability and loss.
- Build a scalable infrastructure: Purchasing large data sets, integrating them with other data sets, and analyzing everything on demand requires robust mechanisms under the hood.
- Implement adequate data modeling and organization processes: The algorithms and models don’t do much if the data isn’t organized well.
What were the “real” challenges
The real challenge wasn’t implementing the technology or making something run more efficiently; that’s been part of our job since we founded the company. The daunting task is helping clients manage the changes that come with new data strategies.
Integrating a data solution that deviates from the “usual ways” requires many meetings and PowerPoints, as well as an intuitive UI/UX that the entire organization, even non-technical personnel, can adopt.
The real challenge was to prevent a huge opportunity loss: their business growth was stalling because the tech couldn’t scale up. Finding the middle ground between development time, in-house adoption, quality assurance, and serving the market with as few hiccups as possible may have been the biggest challenge of this project.
Technical solution
Tech stack: Microsoft Azure cloud, SQL Server, C#
We implemented a Data Lakehouse, a data-agnostic solution that runs in the cloud. It allowed the client’s team to store, handle, and access the data in one place, all while being able to scale (both up and down) as needed.
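The case study names C# and Azure but not the exact storage service, so the following is a minimal sketch, assuming Azure Data Lake Storage Gen2 and the Azure.Storage.Files.DataLake client library (the file system and path names are hypothetical). It illustrates what “accessing the data in one place” can look like from the tool’s side:

```csharp
using System;
using System.IO;
using System.Threading.Tasks;
using Azure.Storage.Files.DataLake; // NuGet package: Azure.Storage.Files.DataLake

class LakehouseReader
{
    static async Task Main()
    {
        // In practice the connection string comes from configuration, not code.
        var service = new DataLakeServiceClient(
            Environment.GetEnvironmentVariable("ADLS_CONNECTION_STRING"));

        // A single file system acts as the one place where all data sets live;
        // "lakehouse" and the path below are hypothetical names.
        var fileSystem = service.GetFileSystemClient("lakehouse");
        var file = fileSystem.GetFileClient("curated/insights/latest.csv");

        // Read the curated data set directly, with no intermediate copy.
        var download = await file.ReadAsync();
        using var reader = new StreamReader(download.Value.Content);
        Console.WriteLine(await reader.ReadToEndAsync());
    }
}
```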
Their tool extracted the data from the Lakehouse directly, resulting in:
- Improved data upload process: By designing and implementing a streamlined upload pipeline, the client could integrate new data sets without the usual hassle of collecting data from many sources manually (a minimal sketch of such an upload follows this list). This unblocked the project’s growth and freed up a great deal of the team’s time.
- Enhanced data modeling and organization: By revamping the data modeling and organization, we reduced the number of procedures and created an efficient data management process. As a result, the software could generate more insights, at a higher level.
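To make the upload bullet concrete: under the same assumptions as above (Azure Data Lake Storage Gen2 via the Azure.Storage.Files.DataLake package; all names hypothetical), a streamlined upload can amount to landing each purchased file in a partitioned “raw” zone and letting the Lakehouse take it from there. A minimal sketch:

```csharp
using System;
using System.IO;
using System.Threading.Tasks;
using Azure.Storage.Files.DataLake;

class DataSetUploader
{
    static async Task Main(string[] args)
    {
        var service = new DataLakeServiceClient(
            Environment.GetEnvironmentVariable("ADLS_CONNECTION_STRING"));
        var fileSystem = service.GetFileSystemClient("lakehouse");

        // Land each purchased file in a raw zone partitioned by source and date,
        // so new data sets slot into one structure instead of being gathered by hand.
        var destination = fileSystem.GetFileClient(
            $"raw/purchased/{DateTime.UtcNow:yyyy-MM-dd}/{Path.GetFileName(args[0])}");

        await using var content = File.OpenRead(args[0]);
        await destination.UploadAsync(content, overwrite: true);
    }
}
```

Keeping the raw zone append-only and partitioned by date is one common design choice; it lets downstream modeling jobs pick up only what changed instead of re-scanning everything.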
Change management solution
We advised the client on data strategy and implemented a tried-and-true data governance process. By the end of the project, the client had a sufficient internal knowledge base, allowing us to jump in as a plug-and-play service provider whenever needed, with minimal roadblocks.
The client now has a value system that covers the entire organization and can be applied across multiple departments.
Smart Tip
If a client has limited technical expertise in-house, it is essential to create a solution that prioritizes user-friendliness and ease of maintenance.
This means designing a solution that is intuitive and easy to use, even for individuals without a technical background or dedicated IT personnel.
This proves vital when presenting major infrastructural changes to the management board.
Last but not least, it’s important to address the scalability issues that can arise as the product or data set grows. Features and procedures that were effective on a smaller scale may become bottlenecks and hinder further growth.
Smart Fact
The client’s primary product was a final data set, including predictions and trends, sold to customers.