
LUP Student Papers

LUND UNIVERSITY LIBRARIES

LoadSplunker - Dashboard

Szuter, Martin and Rees, Justin (2016)
Computer Science and Engineering (BSc)
Abstract
At the IT department of IKEA, test specialists responsible for software testing currently use a suite from Hewlett Packard called HP Application Lifecycle Management, which includes an analysis tool for test result evaluation. During this thesis work, a real-world demand for a replacement of the analysis tool became apparent: the tool's interface is regarded as unnecessarily complicated to use, yet the test reports it generates are too simplistic and aesthetically unappealing. Furthermore, these reports are unduly static and lack interactivity when amendments are needed further down the line. This often leads to misunderstandings between test specialists and stakeholders, creating a feedback loop in which a report is submitted, sent back for clarification, reworked and finally re-submitted. As a result, test reporting becomes a very time-consuming process, and test specialists often find themselves spending more time writing reports than actually performing tests. Employees find this frustrating as well as unrewarding, as the reports are perceived to be of lower quality than expected. The authors witnessed employees regularly copying and pasting test result data into Excel spreadsheets and bouncing manually written test reports back and forth between stakeholders for weeks at a time, with their hands tied during the downtime. Previous students from Lund University developed a substitute for the HP Analysis Tool, called “LoadSplunker”, for their thesis work. LoadSplunker utilises an analytics tool called “Splunk” and has been configured to extract the raw test result information used by the HP Analysis Tool and produce XML files containing the test results. However, the previous students did not develop a suitable dashboard in Splunk in which to present the test results. Additionally, the previous solution depended on Windows, and a requirement from IKEA was that LoadSplunker should run on Linux.
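The abstract does not reproduce the schema of the XML files LoadSplunker generates, so as a purely illustrative sketch, the snippet below parses a test-run file with an assumed structure (the `testrun`/`transaction` element and attribute names are hypothetical, not taken from the thesis) and computes per-transaction failure rates of the kind a dashboard might plot.

```python
import xml.etree.ElementTree as ET

# Hypothetical LoadSplunker-style result file. The real schema is not
# documented in this abstract, so all element and attribute names here
# are assumptions made for illustration only.
SAMPLE = """<testrun id="42" start="2016-03-01T10:00:00">
  <transaction name="login" avg="0.82" passed="118" failed="2"/>
  <transaction name="checkout" avg="1.95" passed="97" failed="23"/>
</testrun>"""

def summarise(xml_text):
    """Return (run id, {transaction name: failure rate}) for one test run."""
    root = ET.fromstring(xml_text)
    rates = {}
    for tx in root.iter("transaction"):
        passed = int(tx.get("passed"))
        failed = int(tx.get("failed"))
        rates[tx.get("name")] = failed / (passed + failed)
    return root.get("id"), rates

run_id, rates = summarise(SAMPLE)
print(run_id, rates)
```

Aggregations like this are what Splunk's own search language would normally compute once the XML is indexed; the sketch only shows that the raw data is straightforward to post-process.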
This thesis work reviewed and modified the previous solution to run on Linux. After conducting a survey and gathering requirements from IKEA, three configurable Splunk dashboards were implemented, with the aim of presenting the test results in a manner that is both easy to read and easy to understand. The three dashboards are each aimed at stakeholders with varying levels of technical proficiency and give an overview of selected test runs. Replacing the Analysis Tool with Splunk could eliminate almost all of the downtime associated with the test reporting process, and the authors consider this thesis assignment a successfully implemented proof of concept for LoadSplunker. While there is potential for test reports to be completely replaced with Splunk for recurring or regularly scheduled tests, test reports can probably never be fully automated for large projects or new test runs. For large projects, focus should instead shift towards creating individual dashboards in Splunk on a per-project basis.
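The thesis's actual dashboards are not reproduced on this page. As a rough illustration of what a configurable Splunk dashboard looks like, the following is a minimal sketch in Splunk's Simple XML; the index, sourcetype, and field names are assumptions, not values from the thesis.

```xml
<dashboard>
  <label>LoadSplunker - Test Run Overview (sketch)</label>
  <row>
    <panel>
      <title>Average response time per transaction</title>
      <chart>
        <search>
          <!-- index, sourcetype, and field names below are hypothetical -->
          <query>index=loadsplunker sourcetype=test_results
                 | stats avg(response_time) by transaction</query>
        </search>
        <option name="charting.chart">column</option>
      </chart>
    </panel>
  </row>
</dashboard>
```

A dashboard like this refreshes from the indexed data on every load, which is the property that removes the submit/clarify/rework loop described above: stakeholders inspect live results instead of waiting for a reworked static report.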
author: Szuter, Martin and Rees, Justin
year: 2016
type: M2 - Bachelor Degree
keywords: ikea, hp, alm, performance center, loadrunner, analysis, splunk, loadsplunker, dashboard
language: English
id: 8878491
date added to LUP: 2016-06-08 04:09:26
date last changed: 2018-10-18 10:33:01
@misc{8878491,
  author       = {{Szuter, Martin and Rees, Justin}},
  language     = {{eng}},
  note         = {{Student Paper}},
  title        = {{LoadSplunker - Dashboard}},
  year         = {{2016}},
}