The foundation of Cloud Computing is the sharing of computing resources that are dynamically allocated and released on demand with minimal management effort. Most of the time, computing resources such as processors, memory, and storage are allocated through commodity hardware virtualization, which distinguishes Cloud Computing from other technologies. One of the objectives of this technology is processing and storing very large amounts of data, also referred to as Big Data. Anomalies and defects found in Cloud platforms sometimes affect the performance of Big Data Applications, resulting in degradation of overall Cloud performance. One of the challenges in Big Data is analyzing the performance of Big Data Applications in order to determine the main factors that affect their quality. The results of such performance analysis are important because they help to detect the source of degradation in the applications as well as in the Cloud. Furthermore, these results can be used in future resource planning stages, when designing Service Level Agreements, or simply to improve the applications. This paper proposes a performance analysis model for Big Data Applications that integrates software quality concepts from ISO 25010. The main goal of this work is to fill the gap between the quantitative (numerical) representation of software engineering quality concepts and the measurement of the performance of Big Data Applications. To this end, we propose the use of statistical methods to establish relationships between performance measures extracted from Big Data Applications and Cloud Computing platforms on the one hand, and the software engineering quality concepts on the other.
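As a minimal illustration of the kind of statistical relationship the paper proposes to establish, and not the authors' actual method, the following Python sketch correlates a hypothetical platform-level measure (CPU wait percentage) with an application-level performance measure mapped to the ISO 25010 time-behaviour sub-characteristic (job duration). All variable names and data values are assumptions introduced for illustration.

```python
import numpy as np
from scipy import stats

# Hypothetical per-job measurements collected from a Big Data platform
# (e.g., parsed from job logs); the values below are illustrative only.
job_duration_s = np.array([412, 388, 455, 601, 397, 520, 433, 580])   # application performance measure
cpu_wait_pct = np.array([5.1, 4.7, 8.9, 15.2, 4.9, 11.3, 7.0, 14.1])  # platform-level factor

# Pearson correlation quantifies how strongly the platform-level factor
# relates to the application-level quality measure (time behaviour).
r, p_value = stats.pearsonr(cpu_wait_pct, job_duration_s)
print(f"Pearson r = {r:.3f}, p-value = {p_value:.4f}")

# A simple linear regression then estimates the expected performance
# degradation per unit increase in the platform factor.
slope, intercept, *_ = stats.linregress(cpu_wait_pct, job_duration_s)
print(f"Estimated duration increase per 1% CPU wait: {slope:.1f} s")
```

A strong, statistically significant correlation in such an analysis would flag the platform factor as a candidate source of application degradation, which is the type of input useful for resource planning or Service Level Agreement design.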