Dissertation Details
An automated approach to create, manage and analyze large-scale experiments for elastic n-tier application in clouds
Author: Jayasinghe, Indika D.; Advisor: Pu, Calton; Committee: Liu, Ling; Omiecinski, Ed; Navathe, Shamkant B.; Ferreira, João E.; Pu, Calton
University: Georgia Institute of Technology
Department: Computer Science
Keywords: Automation; Benchmarking; Cloud; Experiments; Performance measurements; n-Tier Application
Others: https://smartech.gatech.edu/bitstream/1853/49098/1/JAYASINGHE-DISSERTATION-2013.pdf
United States | English
Source: SMARTech Repository
【 Abstract 】
Cloud computing has revolutionized the computing landscape by providing on-demand, pay-as-you-go access to elastically scalable resources. Many applications are now being migrated from on-premises data centers to public clouds; yet the transition to the cloud is not always straightforward and smooth. An application that performed well in an on-premises data center may not perform identically in a public cloud, because many variables, such as virtualization, can affect the application's performance. Collecting substantial performance data through experimental study can reveal the cloud's complexity, particularly as it relates to performance. However, conducting large-scale system experiments is especially challenging because of the practical difficulties that arise during experimental deployment, configuration, execution, and data processing. Despite these complexities, we argue that a promising way to address these challenges is to leverage automation to facilitate the exhaustive measurement of large-scale experiments.

Automation provides numerous benefits: it removes the error-prone and cumbersome involvement of human testers, reduces the burden of configuring and running large-scale experiments for distributed applications, and accelerates the process of reliable application testing. In our approach, we have automated three key activities of the experiment measurement process: create, manage, and analyze. In create, we prepare the platform and deploy and configure the applications. In manage, we initialize the application components (in a reproducible and verifiable order), execute workloads, collect resource monitoring and other performance data, and parse and upload the results to the data warehouse. In analyze, we process the collected data using various statistical and visualization techniques to understand and explain performance phenomena. A user provides only the experiment configuration file; the framework does everything else, and at the end the user simply receives the results.

We enable this automation through code generation. From an architectural viewpoint, our code generator adopts the compiler approach of multiple, serial transformative stages; the hallmarks of this approach are that the stages typically operate on an XML document serving as the intermediate representation, and that XSLT performs the code generation. Our automated approach to large-scale experiments has enabled cloud experiments to scale well beyond the limits of manual experimentation, and it has enabled us to identify non-trivial performance phenomena that would not have been observable otherwise.
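The abstract's description of the code generator (multiple serial stages operating on an XML intermediate representation, with XSLT emitting the final artifacts) can be made concrete with a small, hypothetical sketch. The schema below is illustrative only: the element names, attributes, and file names are assumptions, not the dissertation's actual configuration format.

A toy intermediate representation for one experiment (experiment.xml):

<?xml version="1.0"?>
<!-- Hypothetical experiment configuration; all names are illustrative. -->
<experiment name="sample-ntier">
  <tier name="db"  nodes="1" start="/etc/init.d/mysql start"/>
  <tier name="app" nodes="2" start="/opt/tomcat/bin/startup.sh"/>
  <tier name="web" nodes="1" start="apachectl start"/>
  <workload users="1000" rampUpSeconds="300"/>
</experiment>

A minimal XSLT stage (gen-start.xsl) that transforms this representation into a startup script; tiers are emitted in document order, standing in for the reproducible, verifiable start-up order the abstract describes:

<?xml version="1.0"?>
<xsl:stylesheet version="1.0"
                xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <xsl:output method="text"/>
  <!-- Emit one echo plus one start command per tier, in document order. -->
  <xsl:template match="/experiment">
    <xsl:text>#!/bin/sh&#10;</xsl:text>
    <xsl:for-each select="tier">
      <xsl:text>echo "starting tier </xsl:text>
      <xsl:value-of select="@name"/>
      <xsl:text> (</xsl:text>
      <xsl:value-of select="@nodes"/>
      <xsl:text> node(s))"&#10;</xsl:text>
      <xsl:value-of select="@start"/>
      <xsl:text>&#10;</xsl:text>
    </xsl:for-each>
  </xsl:template>
</xsl:stylesheet>

Running, for example, xsltproc gen-start.xsl experiment.xml > start.sh would produce a shell script that starts the database, application, and web tiers in order. In a full pipeline of the kind the abstract describes, further stages would consume the same XML document, for instance to generate deployment, workload-driver, and data-collection scripts.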
【 Preview 】
Attachment list
File: An automated approach to create, manage and analyze large-scale experiments for elastic n-tier application in clouds
Size: 6255KB
Format: PDF
Document metrics
Downloads: 11 | Views: 31