Tuesday, 12 December 2017

Poster Session

Analysing the Fidelity of Measurements Performed with Hardware Performance Counters

Authors:

Michael Kuperberg (Karlsruhe Institute of Technology)
Ralf Reussner (Karlsruhe Institute of Technology)

Abstract:

Performance evaluation requires accurate and dependable measurements of timing values. Such measurements are usually made using timer methods, but these methods are often too coarse-grained and too inaccurate. Hardware performance counters are therefore frequently accessed directly for fine-granular measurements because of their higher accuracy. However, on multicore computers, direct access to these counters can be misleading: cores can be paused or their affinity changed by the operating system, corrupting the counter values. The contribution of this paper is the demonstration of an additional, significant flaw arising from the direct use of hardware performance counters. We demonstrate that using JNI and assembler instructions to access the Timestamp Counter from Java applications can yield grossly wrong values, even in single-threaded scenarios.

DOI: 10.1145/1958746.1958804

Full text: PDF


Reusable QoS Specifications for Systematic Component-based Design

Authors:

Lucia Kapova (Karlsruhe Institute of Technology)

Abstract:

For successful and effective software development, the ability to predict the impact of design decisions in early development stages is crucial. Typically, to provide accurate predictions, the models have to include low-level details such as the design patterns used (e.g., concurrency design patterns) and the underlying middleware platform. These details influence Quality of Service (QoS) metrics and are thus essential for accurate prediction of extra-functional properties such as performance and reliability. Existing approaches do not consider the relation between actual implementations and the performance models used for prediction. Furthermore, they neglect the broad variety of implementations and middleware platforms, possible configurations, and varying usage scenarios. To allow more accurate performance predictions, we extend classical performance engineering by automated model refinements based on a library of reusable performance completions.

DOI: 10.1145/1958746.1958805

Full text: PDF


Benchmarking Database Design for Mixed OLTP and OLAP Workloads

Authors:

Anja Bog (Hasso Plattner Institute, University of Potsdam)
Kai Sachs (SAP AG)
Alexander Zeier (Hasso Plattner Institute, University of Potsdam)

Abstract:

Current database benchmarks focus either on online transaction processing (OLTP) or on online analytical processing (OLAP) systems. This traditional separation has to be reevaluated to reflect current trends in the design of database systems. We see a need for a realistic benchmark workload taking both aspects into account. Therefore, we define a mixed workload and illustrate ways to apply it to evaluate the influence of database design on system performance.

DOI: 10.1145/1958746.1958806

Full text: PDF


A New Approach to Introduce Aspects in Software Architecture

Authors:

Khider Hadjer (Saad Dahlab University)
Bennouar Djama (Saad Dahlab University)

Abstract:

Programming techniques and methodologies have evolved considerably throughout the history of computing, alongside software systems that tend to become ever more complex. Component-Based Software Development has proven its value in mastering this complexity and has become a critical success factor for software projects: it facilitates maintenance and evolution and enables the development of systems that are large in both size and complexity.

This style of programming promises reuse, but it is confronted with the problems of code scattering and tangling. Applying Aspect-Oriented Programming to software components makes it possible to address these problems: aspect-oriented programming manages such cross-cutting concerns in a modular way by separating them from the base code. As a new programming paradigm, it simplifies the writing of programs while making them more modular and easier to evolve.

Today, aspects and software components are two very promising paradigms that support reuse and simplify software development. To date, however, the combined use of these two paradigms remains a barely explored field of research: no component model explicitly supports aspects, and several questions remain open, among them: How can aspects be represented within software components? How should interactions and overlaps between aspects be managed?

In this paper we present 3ADL, an extension of the IASA component model defined in the LRDSI laboratory that supports Aspect-Oriented Programming. The extension equips the IASA approach with aspect components and aspect ports.

The objective of this work is to make the IASA component model support the concept of aspect in its full dimension: once this concept is supported, an architect can define his own aspect components and instantiate them in the control part of a component.

DOI: 10.1145/1958746.1958807

Full text: PDF


Performance Cockpit: Systematic Measurements and Analyses

Authors:

Dennis Westermann (SAP Research)
Jens Happe (SAP Research)

Abstract:

Measurement-based performance evaluations are heavily used in practice to test system behavior under load, identify resource bottlenecks, or size system landscapes. Existing literature provides guidelines on how to conduct performance evaluations correctly. Many tools (e.g. for load generation, monitoring, or statistical analyses) provide basic assets to conduct such evaluations. However, the wide range of knowledge required to conduct performance evaluations and control the available tools restricts the group of users to a small set of performance experts. Additionally, the large effort to set up systems for performance evaluations often limits their application. In this demo paper, we present a framework that encapsulates best practices and allows for separation of concerns regarding the different aspects of a performance evaluation. The Performance Cockpit provides a single point of configuration for performance analysts and orchestrates plug-ins provided by corresponding experts. The resulting flexibility and automation enables new approaches for quality assurance and lowers the hurdles for conducting performance evaluations.

DOI: 10.1145/1958746.1958808

Full text: PDF


FORGE: Friendly Output to Results Generator Engine

Authors:

J. Llodrà (Universitat de les Illes Balears)
C. M. Lladó (Universitat de les Illes Balears)
R. Puigjaner (Universitat de les Illes Balears)
Connie U. Smith (Performance Engineering Services)

Abstract:

Performance model interchange formats provide a mechanism for automatically moving performance models among modelling tools. The Experiment Schema Extension (ExSE) specifies the performance studies to be run on the model and the desired XML performance metrics. Moreover, the Results Schema Extension (Results-SE) specifies how to automatically transform the output metrics into useful results. This paper presents the tool FORGE (Friendly Output to Results Generator Engine), a GUI-based tool that transforms outputs into results and simplifies the specification of common tables for the output files the user provides. A case study demonstrates the use of the tool.

DOI: 10.1145/1958746.1958809

Full text: PDF
