
PhD Defense: Performance Evaluation and Modelling of SaaS Web Services in the Cloud

  • Speaker: Abdallah Ali Zainelabden Abdallah Ibrahim

  • Location: Room MSA 4.530, Maison du Savoir, 2, avenue de l'Université, L-4365 Esch-sur-Alzette, Luxembourg

Members of the defense committee:

  • Prof. Dr Ulrich SORGER, University of Luxembourg, chairman
  • Prof. Dr El-Ghazali TALBI, Polytech Lille, vice-chairman
  • Prof. Dr Pascal BOUVRY, University of Luxembourg, supervisor
  • Dr Sebastien VARRETTE, University of Luxembourg, member
  • Dr Dzmitry KLIAZOVICH, ExaMotive S.A., member

Abstract:

This thesis studies the problem of evaluating and assuring the performance and quality of communications and services in cloud computing. The cloud computing paradigm has significantly changed the way of doing business: with cloud computing, companies and end-users can access the vast majority of services online through a virtualized environment. The three main services typically consumed by cloud users are Infrastructure-as-a-Service (IaaS), Platform-as-a-Service (PaaS) and Software-as-a-Service (SaaS). Cloud Services Providers (CSPs) deliver cloud services to cloud customers on a pay-per-use basis, while the quality of the provided services is defined using Service Level Agreements (SLAs). Unfortunately, there is no standard mechanism to verify automatically that the delivered services satisfy the signed SLA, which makes it difficult to measure the Quality of Service (QoS) accurately. In this context, this thesis aims at offering an automatic framework to evaluate the QoS and SLA compliance of Web Services (WSs) offered across several CSPs. Unlike other approaches, the framework aims at quantifying the performance and scalability of the delivered WSs fairly and in a stealthy way. Stealthiness refers to the capacity to evaluate a given cloud service through multiple workload patterns that are indistinguishable from regular user traffic from the provider's point of view. This work is motivated by recent scandals in the automotive sector, which demonstrated the capacity of solution providers to adapt the behaviour of their products when submitted to an evaluation campaign so as to improve the measured results. The framework defines a set of common performance metrics, handled by a set of agents within customized clients, for measuring the behaviour of cloud applications on top of a given CSP. Once modelled accurately, the agent behaviour can be dynamically adapted to hide the true nature of the framework client from the CSP (a minimal sketch of such a stealth agent is given after the contributions list below). In particular, the following contributions are proposed:

  • A novel framework of performance metrics for the communication systems of cloud computing SaaS.
  • The proposed framework evaluates and classifies, in a fair and stealthy way, the performance and scalability of the delivered WSs across multiple CSPs.
  • An analysis of the performance metrics for cloud SaaS Web Services (WSs), covering all the possible metrics that could be used to evaluate and monitor the behaviour of cloud applications.
  • Benchmarking of cloud SaaS applications and web services using reference benchmarking tools and frameworks.
  • Modelling of the SaaS WSs through a set of Gaussian models. These models help researchers generate data representing a CSP's behaviour under high load and under normal usage in just a couple of minutes, without running any experiments (see the sampling sketch below).
  • A novel optimization model to obfuscate the testing from the CSP and achieve stealthiness. The optimization process relies on meta-heuristic and machine-learning algorithms, namely a Genetic Algorithm and Gaussian Process Regression, respectively (a toy GA sketch appears below).
  • A virtual QoS aggregator and SLA checker that evaluates the QoS and SLA compliance of the WSs offered across the considered CSPs (see the SLA-checking sketch below).
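
To make the stealthiness idea from the abstract concrete, the following minimal Python sketch shows a measurement agent that hides its probes inside a regular-looking traffic pattern. Everything in it (the Poisson-arrival assumption, the function and parameter names) is illustrative, not the thesis's actual agent design.

```python
# Illustrative sketch: a measurement agent that blends probe requests into a
# regular-looking traffic pattern (Poisson arrivals), so that from the CSP's
# side the evaluation traffic resembles ordinary user activity.
# All names and parameters are hypothetical, not taken from the thesis.
import random
import time
import urllib.request

def stealth_probe(url, n_requests=50, mean_interarrival=2.0):
    """Issue n_requests GET requests with exponential inter-arrival times
    and record the observed response latencies (a common QoS metric)."""
    latencies = []
    for _ in range(n_requests):
        start = time.perf_counter()
        with urllib.request.urlopen(url) as resp:
            resp.read()
        latencies.append(time.perf_counter() - start)
        # Exponential gaps yield a Poisson arrival process, which mimics
        # the aggregate request stream of ordinary users.
        time.sleep(random.expovariate(1.0 / mean_interarrival))
    return latencies

if __name__ == "__main__":
    lat = stealth_probe("https://example.com/")
    print(f"mean latency: {sum(lat) / len(lat):.3f} s")
```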
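
The Gaussian-models contribution can be read as follows: once a CSP's response times under normal usage and under high load have each been fitted by a Gaussian, synthetic measurements can be drawn in seconds instead of re-running load experiments. A minimal sampling sketch, with invented parameter values:

```python
# Illustrative sketch of sampling from fitted Gaussian models of a CSP's
# response time. The (mean, stddev) pairs below are invented examples,
# not values reported in the thesis.
import random

MODELS = {
    "normal_usage": (120.0, 15.0),  # response time in ms: (mean, stddev)
    "high_load": (480.0, 90.0),
}

def synthesize(scenario, n_samples=1000):
    """Draw n_samples synthetic response times from the fitted Gaussian."""
    mu, sigma = MODELS[scenario]
    # Clamp at zero: a response time cannot be negative.
    return [max(0.0, random.gauss(mu, sigma)) for _ in range(n_samples)]

if __name__ == "__main__":
    for scenario in MODELS:
        data = synthesize(scenario)
        print(f"{scenario}: mean = {sum(data) / len(data):.1f} ms")
```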
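
As a rough illustration of the optimization contribution, the toy Genetic Algorithm below searches for a workload pattern whose per-slot request rate stays close to a typical user profile, which is the intuition behind obfuscating a test campaign. The encoding, fitness function, and target profile are hypothetical, and the Gaussian Process Regression part of the thesis's approach is omitted.

```python
# Toy GA sketch: evolve a workload pattern that deviates as little as
# possible from a "typical user" traffic profile. All values are invented.
import random

TARGET_PROFILE = [5, 8, 12, 9, 6]  # hypothetical requests/s per time slot

def fitness(pattern):
    # Smaller deviation from the typical profile = harder to distinguish.
    return -sum((p - t) ** 2 for p, t in zip(pattern, TARGET_PROFILE))

def mutate(pattern, rate=0.3):
    return [max(0, p + random.randint(-2, 2)) if random.random() < rate else p
            for p in pattern]

def crossover(a, b):
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]

def evolve(pop_size=30, generations=50):
    pop = [[random.randint(0, 20) for _ in TARGET_PROFILE]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)   # best individuals first
        parents = pop[: pop_size // 2]
        children = [mutate(crossover(random.choice(parents),
                                     random.choice(parents)))
                    for _ in range(pop_size - len(parents))]
        pop = parents + children
    return max(pop, key=fitness)

if __name__ == "__main__":
    print("stealthiest pattern found:", evolve())
```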
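
Finally, the virtual SLA checker can be sketched as a simple comparison of measured QoS values against agreed thresholds; the metric names and numbers below are hypothetical examples rather than the thesis's actual SLA schema.

```python
# Illustrative SLA-checking sketch: flag every clause of a (hypothetical)
# SLA that the measured QoS values violate.
SLA = {"latency_ms": 200.0, "availability": 0.999, "error_rate": 0.01}

def check_sla(measured, sla):
    """Return the list of SLA clauses violated by the measured QoS values."""
    violations = []
    for metric, threshold in sla.items():
        value = measured[metric]
        # Availability must stay above its threshold; the others below.
        ok = value >= threshold if metric == "availability" else value <= threshold
        if not ok:
            violations.append((metric, value, threshold))
    return violations

if __name__ == "__main__":
    measured = {"latency_ms": 240.0, "availability": 0.9995, "error_rate": 0.004}
    for metric, value, threshold in check_sla(measured, SLA):
        print(f"SLA violated: {metric} = {value} (agreed: {threshold})")
```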