One of the key challenges faced by organizations deploying an enterprise-wide analytics solution is maintaining and upgrading the applications built on it. Most organizations follow an agile development methodology that entails frequent releases of new content as well as routine upgrades, patches, fixes, and security updates.
Depending on the complexity of the application, you need to invest a significant amount of time, energy, and manpower to ensure that none of the existing reports, dashboards, or underlying data is adversely impacted by any of these maintenance tasks. Any degradation in performance, or in the accuracy of the data in these applications, not only reflects poorly on the system administrators, but also erodes confidence in the analytics solution and ultimately hurts user adoption and business value throughout the organization.
Hence, it is critical for system administrators to ensure that the application and the data within it remain consistent and reliable for end users, irrespective of the ongoing maintenance tasks they have to perform on the system.
A typical testing methodology adopted by most organizations involves manually testing and “eyeballing” a subset of reports and data after major maintenance tasks such as patches and updates. Organizations with more resources may create custom test scripts and automate parts of the testing and QA process, along the lines of the sketch below.
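As a simple illustration of such a script, the following sketch compares a report exported to CSV from a baseline environment against the same report exported after a patch or upgrade, and records more than a bare pass/fail. The file paths and report name are hypothetical; the point is only to show how a basic regression check can be automated.

```python
# A minimal, product-agnostic sketch of an automated report regression check:
# compare a report exported to CSV from the current baseline against the same
# export taken after a patch or upgrade. File names below are hypothetical.
import csv
from pathlib import Path

def load_rows(path: Path) -> list[tuple[str, ...]]:
    """Read a CSV export and sort rows so that ordering changes alone
    are not flagged as regressions."""
    with path.open(newline="") as f:
        return sorted(tuple(row) for row in csv.reader(f))

def compare_report(baseline: Path, upgraded: Path) -> dict:
    """Return a detailed result record rather than a bare pass/fail."""
    old, new = load_rows(baseline), load_rows(upgraded)
    old_set, new_set = set(old), set(new)
    missing = [r for r in old if r not in new_set]  # rows lost after the upgrade
    added = [r for r in new if r not in old_set]    # rows that newly appeared
    return {
        "report": baseline.stem,
        "passed": not missing and not added,
        "rows_baseline": len(old),
        "rows_upgraded": len(new),
        "rows_missing": len(missing),
        "rows_added": len(added),
    }

if __name__ == "__main__":
    print(compare_report(Path("exports/baseline/sales_summary.csv"),
                         Path("exports/upgraded/sales_summary.csv")))
```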
Upgrades are typically more involved and require considerably more time and testing to ensure consistency. When your analytics application grows to thousands of users and tens of thousands of reports and dashboards, it is usually cost-prohibitive to test every single report for every user. Hence, automating this testing process is critical to the long-term success of an analytics application.
Here are five things to keep in mind when automating testing of analytics applications:
1. Real-world applications: Make sure that tests are run on as many real-world production applications as possible. Testing on just one or a handful of sample environments is not ideal and can lead to unforeseen issues when an update or upgrade is deployed. The applications on which tests are run need to be representative of the real-world applications your users or customers will be using.
2. Replica of live production system: Ensure that there is no impact on the actual live production system at the time of testing. To run a series of tests at any time of day, you need an isolated replica of the production system, running the same hardware and software and kept as close to production as possible. This way, as your users report new issues, you can analyze them and assess their impact by running tests in the separate environment, so system performance for users is not affected by the ongoing testing. Using a cloud platform makes it easier to quickly provision such a replicated environment for testing purposes (a sketch of this follows the list).
3. Platform approach to testing: Design the automated testing system as a platform for running a variety of tests, rather than creating disjointed automation scripts for different scenarios. The testing process also needs to evolve: whenever it fails to catch an issue, add a check for it so the gap is closed the next time. With a single platform, you can achieve economies of scale and optimize and share the testing infrastructure across multiple scenarios and applications (a plugin-style sketch of this idea follows the list).
4. Granularity of test execution data: Test results should not be reduced to a binary pass or fail. Irrespective of whether an application passes or fails a particular test, it is important to capture detailed statistics and information from every step of the testing process. This will help you identify and anticipate future issues and fine-tune the testing process (the same sketch below records per-step results rather than a single flag).
5. Analysis of test results: Depending on the complexity of the testing process, the analysis of test results can be a full-fledged analytics application in itself. Store the results in a format optimized for analysis (for example, in a data warehouse) so you can drill into them for further insight into application performance. This also lets you analyze historical test results and monitor performance over time (see the final sketch below).
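To make point 2 concrete, here is a minimal sketch of provisioning an isolated test replica. It assumes AWS and boto3 purely as an example (the article does not prescribe a cloud vendor), and every identifier (machine image, subnet, instance type) is a hypothetical placeholder standing in for an image cloned from your production system.

```python
# Hypothetical sketch for point 2: launch an isolated test copy of production
# from a "golden" machine image, sized to match the production server.
# AWS/boto3 is only one example of a cloud platform; all IDs are placeholders.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

response = ec2.run_instances(
    ImageId="ami-0example1234567890",      # image cloned from production
    InstanceType="m5.2xlarge",             # match production sizing
    MinCount=1,
    MaxCount=1,
    SubnetId="subnet-0example1234567890",  # isolated test network, no user traffic
    TagSpecifications=[{
        "ResourceType": "instance",
        "Tags": [{"Key": "purpose", "Value": "upgrade-testing"}],
    }],
)
print("Test replica started:", response["Instances"][0]["InstanceId"])
```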
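Points 3 and 4 can be illustrated with a small plugin-style runner: test scenarios register themselves with one shared platform, and every step records timing and context instead of a single pass/fail flag. The names and structure below are illustrative assumptions, not MicroStrategy's actual framework.

```python
# Illustrative sketch of a plugin-style test platform (points 3 and 4):
# scenarios register with one shared runner, and each step is recorded
# with timing and context instead of a bare boolean.
import time
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class StepResult:
    scenario: str
    step: str
    passed: bool
    duration_s: float
    details: dict = field(default_factory=dict)  # e.g. row counts, expected vs. actual

_SCENARIOS: dict[str, Callable[["Recorder"], None]] = {}

def scenario(name: str):
    """Decorator that registers a test scenario with the shared platform."""
    def register(fn: Callable[["Recorder"], None]):
        _SCENARIOS[name] = fn
        return fn
    return register

class Recorder:
    """Collects step-level results for one scenario run."""
    def __init__(self, scenario_name: str):
        self.scenario_name = scenario_name
        self.results: list[StepResult] = []

    def step(self, name: str, check: Callable[[], bool], **details):
        start = time.perf_counter()
        passed = check()
        self.results.append(StepResult(self.scenario_name, name, passed,
                                       time.perf_counter() - start, details))

def run_all() -> list[StepResult]:
    """Run every registered scenario and return all step-level results."""
    all_results: list[StepResult] = []
    for name, fn in _SCENARIOS.items():
        recorder = Recorder(name)
        fn(recorder)
        all_results.extend(recorder.results)
    return all_results

# Example scenario; real checks would export and compare actual reports.
@scenario("sales_dashboard_upgrade")
def sales_dashboard_upgrade(rec: Recorder):
    rec.step("dashboard_opens", check=lambda: True)
    rec.step("row_count_matches", check=lambda: 1024 == 1024,
             expected=1024, actual=1024)

if __name__ == "__main__":
    for result in run_all():
        print(result)
```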
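Finally, for point 5, the step-level records collected above can be written to a relational table and analyzed like any other dataset. SQLite is used below only as a stand-in for whatever data warehouse you actually run, and the schema is an assumption.

```python
# Sketch for point 5: persist step-level test results in a queryable table
# (SQLite here as a stand-in for a real data warehouse) so historical runs
# can be analyzed and trended like any other analytics dataset.
import json
import sqlite3
from datetime import datetime, timezone

def store_results(db_path: str, build: str, results) -> None:
    con = sqlite3.connect(db_path)
    con.execute("""
        CREATE TABLE IF NOT EXISTS test_step_results (
            run_ts     TEXT,    -- when the run was recorded (UTC ISO-8601)
            build      TEXT,    -- build or release being tested
            scenario   TEXT,    -- customer application / scenario name
            step       TEXT,    -- individual step within the scenario
            passed     INTEGER, -- 1 = pass, 0 = fail
            duration_s REAL,    -- how long the step took
            details    TEXT     -- JSON blob with step-specific statistics
        )""")
    now = datetime.now(timezone.utc).isoformat()
    con.executemany(
        "INSERT INTO test_step_results VALUES (?, ?, ?, ?, ?, ?, ?)",
        [(now, build, r.scenario, r.step, int(r.passed), r.duration_s,
          json.dumps(r.details)) for r in results])
    con.commit()
    con.close()

# Usage: store_results("test_history.db", build="2024.3", results=run_all())
```

A simple query over such a table (for example, pass rate and average step duration per scenario and build) already gives the historical, trend-level view described above.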
At MicroStrategy, we have created a single automated testing platform to quickly spin up new cloud-based environments for testing purposes and conduct a variety of tests on real customer applications.
We collect extremely granular data as part of the test executions and analyze the results using state-of-the-art dossiers on the MicroStrategy platform. We built a fully automated testing framework that provides a user-friendly GUI, with which any developer or system administrator can trigger a test scenario in just a few clicks.
We use this system extensively during our own development process. With every new build of our platform, our developers can run a series of tests on real-world analytics applications with metadata and data warehouses shared by our customers.
This enables us to quickly analyze the impact of changes in our code and helps us see an upgraded system from a customer’s perspective. Over the years we have grown this automated testing platform to run tests against more than 20 different customer applications, which has significantly improved our product quality and reliability with every new release.
Pradyut Bafna is an analytics and Business Intelligence executive at MicroStrategy.