ICSOFT 2021 Abstracts


Area 1 - Foundational and Trigger Technologies

Full Papers
Paper Nr: 15
Title:

A Novel and Dedicated Machine Learning Model for Malware Classification

Authors:

Miles Q. Li, Benjamin C. M. Fung, Philippe Charland and Steven H. H. Ding

Abstract: Malicious executables are composed of functions that can be represented in assembly code. In the assembly code mining literature, many software reverse engineering tools have been created to disassemble executables, search for function clones, and find vulnerabilities, among other tasks. The development of a machine learning-based malware classification model that can simultaneously achieve excellent classification performance and provide insightful interpretation of the classification results remains a hot research topic. In this paper, we propose a novel and dedicated machine learning model for the research problem of malware classification. Our proposed model generates assembly code function clusters based on function representation learning and provides excellent interpretability for the classification results. It does not require a large or balanced training dataset, which matches real-life scenarios. Experiments show that our proposed approach outperforms previous state-of-the-art malware classification models and provides meaningful interpretation of classification results.

Short Papers
Paper Nr: 49
Title:

Multi-document Arabic Text Summarization based on Thematic Annotation

Authors:

Amina Merniz, Anja H. Chaibi and Henda H. Ben Ghézala

Abstract: Text summarization is the task of reducing one or more documents to their key, significant sentences. It has a long history in natural language processing research and keeps improving thanks to the considerable number of methods developed in this area. This paper proposes an approach to Arabic multi-document text summarization. The originality of the approach is that the summary is based on thematic annotation: the input documents are analyzed and segmented by topic using LDA. The segments of each topic are then represented by a separate graph, in order to address the redundancy problem of multi-document summarization. In the last step, the proposed approach applies a modified PageRank algorithm that uses the cosine similarity measure as the weight of the graph edges. Vertices with high scores are considered essential and therefore form the final summary. To evaluate summarization systems, researchers have developed several metrics, divided into three categories: automatic, semi-automatic and manual. This study uses automatic evaluation methods, mainly the ROUGE measures (ROUGE-1, ROUGE-2, ROUGE-L, and ROUGE-SU4).
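Not taken from the paper: a minimal, stdlib-only sketch of the kind of cosine-weighted PageRank over a sentence graph that the abstract describes. The bag-of-words sentence vectors and all names are illustrative assumptions.

```python
import math

def cosine(u, v):
    """Cosine similarity between two term-frequency dicts."""
    num = sum(u[t] * v[t] for t in set(u) & set(v))
    den = (math.sqrt(sum(x * x for x in u.values()))
           * math.sqrt(sum(x * x for x in v.values())))
    return num / den if den else 0.0

def pagerank(sentences, d=0.85, iters=50):
    """Weighted PageRank over a sentence graph; edge weights are cosine similarities."""
    n = len(sentences)
    w = [[cosine(sentences[i], sentences[j]) if i != j else 0.0
          for j in range(n)] for i in range(n)]
    out = [sum(row) for row in w]          # total outgoing weight per vertex
    scores = [1.0 / n] * n
    for _ in range(iters):                 # synchronous score updates
        scores = [(1 - d) / n
                  + d * sum(scores[j] * w[j][i] / out[j]
                            for j in range(n) if out[j])
                  for i in range(n)]
    return scores

# toy example: three "sentences" as bag-of-words vectors
sents = [{"arabic": 1, "summary": 2}, {"summary": 1, "graph": 1}, {"weather": 1}]
ranks = pagerank(sents)
top = max(range(len(sents)), key=ranks.__getitem__)  # best-scoring sentence
```

The isolated third sentence receives only the damping mass, so the two similar sentences dominate the summary, which is the redundancy-aware behavior the abstract relies on.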

Paper Nr: 82
Title:

A Framework for Security Monitoring of Real IoT Testbeds

Authors:

Vinh H. La, Edgardo Montes de Oca, Wissam Mallouli and Ana R. Cavalli

Abstract: The Internet of Things (IoT) has been acknowledged as a transformative technology because of its wide range of applications in various domains, such as connected agriculture, industrial control, smart buildings and home automation. It promises innovative business models and an improved user experience. However, IoT devices are prone to failures and malicious attacks on account of their resource-constrained characteristics. In this paper, we present a framework for security monitoring of IoT systems. It is based on MMT-IoT, a reactive monitoring tool deployed in a running IoT environment to address malicious behaviors, failures and attacks. We also present experiments conducted on two practical IoT-6LoWPAN testbeds. The preliminary results confirm the efficiency of the proposed solution.

Paper Nr: 117
Title:

Towards Efficient Hashing in Ethereum Smart Contracts

Authors:

Emanuel Onica and Cosmin-Ionuţ Schifirneţ

Abstract: Ethereum is a popular public blockchain platform and currently the most significant one featuring smart contract functionality. Smart contracts are small programs executed on the blockchain nodes, which can be used to implement complex transaction logic. Several high-level programming languages are available for writing Ethereum smart contracts, the most used being Solidity. The high-level code is translated into bytecode executed by a dedicated runtime environment, the Ethereum Virtual Machine (EVM). A few operations are, however, externalized as precompiled contracts and run by the native implementation of the Ethereum node. These are typically computationally intensive operations such as cryptographic hash functions. Various smart contract patterns require hash computations. In such contexts, the hash functions currently supported by Ethereum have a direct impact on both the performance and the cost inflicted on blockchain users. In this paper we investigate the available options for hashing in smart contracts, discuss the implications for some patterns, and evaluate possible improvements. In particular, we focus on the recent BLAKE family of cryptographic hash functions, which shows promising performance results but still has limited support in the Ethereum platform.
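As a loose illustration of the trade-off the abstract studies, the sketch below times Python's built-in BLAKE2b against SHA3-256. Two caveats: Ethereum's native KECCAK256 uses a different padding than standardized SHA3-256, and (to our knowledge) the EIP-152 precompile exposes only the BLAKE2b compression function, so this is a rough analogy, not the paper's evaluation.

```python
import hashlib
import timeit

data = b"x" * 1024  # 1 KiB payload, size chosen arbitrarily

def bench(hash_ctor, n=200):
    """Total time for n digests of `data` with the given hashlib constructor."""
    return timeit.timeit(lambda: hash_ctor(data).digest(), number=n)

t_sha3 = bench(hashlib.sha3_256)   # stand-in for KECCAK256 (different padding!)
t_blake = bench(hashlib.blake2b)   # BLAKE2b, 64-byte digest by default

digest_lens = (len(hashlib.sha3_256(data).digest()),
               len(hashlib.blake2b(data).digest()))
```

The relative timings depend on the interpreter build, which is exactly why in-EVM cost (gas) rather than raw native speed is the deciding factor the paper analyzes.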

Paper Nr: 36
Title:

An Enhanced Image Compression Codec using Spline-based Directional Lifting Wavelet Transform and an Improved SPIHT Algorithm

Authors:

Rania Boujelbene and Yousra Ben Jemaa

Abstract: A novel lossy image-compression scheme is proposed in this paper. A two-step structure is embedded in this codec. In the first step, a spline-based directional lifting wavelet transform is used to decorrelate the image data. In the second step, an improved Set Partitioning in Hierarchical Trees (SPIHT) algorithm based on a binary tree is developed to code the wavelet coefficients. The numerical results demonstrate that the proposed approach achieves significant gains in terms of PSNR, BD-PSNR and SSIM for all test images, offering better results than existing codecs.
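For reference (not part of the authors' codec), the PSNR metric the abstract reports gains in can be computed as below; the toy pixel lists are invented for illustration.

```python
import math

def psnr(original, reconstructed, peak=255):
    """Peak signal-to-noise ratio (dB) between two equal-size images
    given as flat lists of pixel intensities."""
    mse = sum((a - b) ** 2 for a, b in zip(original, reconstructed)) / len(original)
    if mse == 0:
        return float("inf")  # identical images
    return 10 * math.log10(peak * peak / mse)

orig = [10, 20, 30, 40]
recon = [10, 21, 29, 40]   # small coding distortion
gain_db = psnr(orig, recon)
```

Higher PSNR means the reconstruction is closer to the original, which is why a codec improvement shows up as a PSNR gain at the same bitrate.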

Paper Nr: 110
Title:

A Model-Driven Engineering: From Relational Database to Document-oriented Database in Big Data Context

Authors:

Fatima Z. Belkadi and Redouane Esbai

Abstract: In today's world, many players in digital technology produce huge amounts of data. Sensors, social networks and e-Commerce all generate information in real time, characterized by Gartner's 5 Vs: Volume, Velocity, Variety, Value, and Veracity. The digital transformation of companies drives an evolution of databases towards Big Data, whose power has become increasingly strategic. Exploiting this data is key to a better understanding and management of the company and its markets, which obviously requires the ability to generate data, store it, give it meaning and then exploit it. At the same time, modernizing the Big Data platform is essential to automate this process. In this paper, we explain how to design and apply transformation rules to move from an SQL relational database to a Big Data solution with NoSQL. For this, we use Model Driven Architecture and transformation languages such as MOF 2.0 QVT (Meta-Object Facility 2.0 Query-View-Transformation) and Acceleo to describe the meta-models for the development of the transformation model. The transformation rules defined in this work can generate, from a class diagram, the JSON files for creating a document-oriented NoSQL database.
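Not the paper's QVT/Acceleo rules: a toy Python illustration of the core rule such migrations typically apply, embedding a 1-N foreign-key relation as a nested array in the target document. Table and field names are invented.

```python
import json

# toy relational data: a one-to-many Customer -> Order relation
customers = [{"id": 1, "name": "Alice"}]
orders = [
    {"id": 10, "customer_id": 1, "total": 25.0},
    {"id": 11, "customer_id": 1, "total": 40.0},
]

def to_documents(customers, orders):
    """Embed each customer's orders as a nested array, the typical rule for
    migrating a 1-N foreign key to a document-oriented store."""
    docs = []
    for c in customers:
        doc = {k: v for k, v in c.items() if k != "id"}
        doc["_id"] = c["id"]  # primary key becomes the document id
        doc["orders"] = [{"id": o["id"], "total": o["total"]}
                         for o in orders if o["customer_id"] == c["id"]]
        docs.append(doc)
    return docs

docs = to_documents(customers, orders)
payload = json.dumps(docs)  # the JSON file a generator would emit
```

The join disappears at migration time: reads that previously required a foreign-key join become a single document fetch, at the cost of denormalization.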

Area 2 - Software Engineering and Systems Development

Full Papers
Paper Nr: 7
Title:

Automatic Feedback Generation for Supporting User Interface Design

Authors:

Jenny Ruiz and Monique Snoeck

Abstract: Although growing interest in User Interfaces (UI) has increased their study, UI design remains a difficult process to learn. Novice UI designers therefore need guidance through the learning process of UI design to obtain better results. Feedback is a key factor in improving knowledge and skill acquisition. However, providing individual feedback is a complex and time-consuming task and requires a fair amount of expertise. This paper describes a solution to this problem: Feedback ENriched user Interface Simulation (FENIkS). FENIkS is a model-driven engineering UI design simulation tool able to automatically provide students with instant feedback about how they apply UI design principles. While designing the UI, the novice designer receives feedback on how design principles are applied through the options he/she selects. Then, when generating a working prototype from the models, feedback explaining the application of the principles is incorporated in the prototype. An experimental evaluation demonstrated that FENIkS improves students' understanding of UI design principles. The perceived usability was also positively evaluated. This paper explains FENIkS' design: the meta-model, and how design options, design principles and types of feedback are used to generate automated feedback on the observation of design principles in both the tool and the generated prototype.

Paper Nr: 13
Title:

Tales from the Code #1: The Effective Impact of Code Refactorings on Software Energy Consumption

Authors:

Zakaria Ournani, Romain Rouvoy, Pierre Rust and Joel Penhoat

Abstract: Software maintenance and evolution encompass a broad set of actions that aim to improve both functional and non-functional concerns of a software system. Among the non-functional concerns, energy consumption is getting more and more traction in industry, whether the software is mobile or deployed in the cloud. In this context, however, the impact of code refactorings on energy consumption remains unclear. In particular, while the state of the art has investigated the impact of some specific code refactorings on dedicated benchmarks, we lack an assessment that those findings apply to more comprehensive and complex software. To address this gap, this paper studies the evolution of the energy consumption of 7 open-source software systems developed for more than 5 years. Then, by focusing on the impact on energy consumption of changes involving code refactorings, we assess the effects induced by such code refactorings in practice. For all the software systems we studied, our empirical results report that the code refactorings we mined do not substantially impact energy consumption. Interestingly, these results highlight that i) structural code refactorings bring energy-preserving changes to the code, and ii) major energy variations seem to be related to functional and computational code evolutions.

Paper Nr: 16
Title:

Power Consumption Estimation in Model Driven Software Development for Embedded Systems

Authors:

Marco Schaarschmidt, Michael Uelschen and Elke Pulvermüller

Abstract: Due to the resource-constrained nature of embedded systems, it is crucial to support the estimation of their power consumption as early in the development process as possible. Non-functional requirements based on power consumption directly impact the software design, e.g., watt-hour thresholds and expected lifetimes based on battery capacities. Even if software affects hardware behavior directly, these types of requirements are often overlooked by software developers because they are commonly associated with the hardware layer. Modern trends in software engineering such as Model-Driven Development (MDD) can be used in embedded software development to evaluate power consumption-based requirements in early design phases. However, power consumption aspects are currently not sufficiently considered in MDD approaches. In this paper, we present a model-driven approach using Unified Modeling Language profile extensions to model hardware components and their power characteristics. Software models are combined with hardware models to achieve a system-wide estimation, including peripheral devices, and to make the power-related impact in early design stages visible. By deriving energy profiles, we provide software developers with valuable feedback, which may be used to identify energy bugs and evaluate power consumption-related requirements. To demonstrate the potential of our approach, we use a sensor node example to evaluate our concept and to identify its energy bugs.

Paper Nr: 24
Title:

A Meta-model for the Guideline Definition Language

Authors:

Reyes Grangel, Cristina Campos Sancho, Begoña Martínez-Salvador and Mar Marcos

Abstract: Computer-Interpretable Guidelines (CIGs) are key to implementing decision support systems that can help clinical practice. To represent these guidelines, numerous modeling languages, such as PROforma, Asbru, or GLIF, have been defined, and they are currently used in different contexts with more or less acceptance. The Guideline Definition Language (GDL) is a rule-based modeling language for CIGs recently proposed by the openEHR Foundation. This language might gain wider acceptance because it is defined as an open standard and, as such, open detailed specifications are provided. In this paper, we focus on the most recent specification of GDL, GDL2. Our objective is to gain insight and knowledge about this language and its specifications for software engineering application purposes, such as Model-Driven solutions. In this context, we present a proposal for a GDL meta-model as an attempt to formalise the GDL2 specification as a UML meta-model. Additionally, in order to validate this proposal, we present a sample GDL model, based on a clinical guideline for the diagnosis of heart failure, developed using the GDL meta-model implemented in Eclipse.

Paper Nr: 26
Title:

A Meta-level Approach for Multilingual Taint Analysis

Authors:

Damian M. Lyons and Dino Becaj

Abstract: It is increasingly common for software developers to leverage the features and ease-of-use of different languages in building software systems. Nonetheless, interaction between different languages has proven to be a source of software engineering concerns. Existing static analysis tools handle the software engineering concerns of monolingual software, but there is little general work for multilingual systems, despite their increasing visibility. While recent work in this area has greatly extended the scope of multilingual static analysis systems, the focus has still been on a primary, host language interacting with subsidiary, guest language functions. In this paper we propose a novel approach that does not privilege any one language and has a modular way to include new languages. We present an approach to multilingual taint analysis (a security-oriented static analysis method) as a 'meta-level' algorithm which includes monolingual static analysis as a special case. A complexity analysis of the taint analysis algorithm is presented, along with a detailed 'deep' multilingual example with Python and C/C++ software. A performance analysis is presented on a collection of 20 public multilingual repositories selected from GitHub. Our results show an average of 76% improved coverage using our algorithm when compared to monolingual taint analysis.
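A toy, language-agnostic worklist propagation (invented names, not the authors' algorithm) can illustrate the meta-level idea: taint follows dataflow edges regardless of which language owns each node, so a flow crossing a Python/C boundary is no different from a monolingual one.

```python
def propagate_taint(flows, sources, sinks):
    """flows: (frm, to) dataflow edges, possibly crossing language boundaries.
    Returns the sinks reachable from a tainted source."""
    tainted = set(sources)
    changed = True
    while changed:                      # fixed-point worklist iteration
        changed = False
        for frm, to in flows:
            if frm in tainted and to not in tainted:
                tainted.add(to)
                changed = True
    return tainted & set(sinks)

# a toy cross-language flow: Python input -> C wrapper -> Python sink
flows = [("py:read_input", "c:parse_buf"), ("c:parse_buf", "py:render")]
hits = propagate_taint(flows, sources={"py:read_input"}, sinks={"py:render"})
```

A monolingual analysis that stops at the `c:` node would miss this flow, which is the coverage gap the abstract's 76% figure quantifies.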

Paper Nr: 48
Title:

Refactoring Monolithic Object-Oriented Source Code to Materialize Microservice-oriented Architecture

Authors:

Pascal Zaragoza, Abdelhak-Djamel Seriai, Abderrahmane Seriai, Hinde-Lilia Bouziane, Anas Shatnawi and Mustapha Derras

Abstract: The emergence of the microservice-oriented architecture (MSA) has led to increased maintainability, better readability, and better scalability. All these advantages make migrating monolithic software towards an MSA an attractive prospect for organizations. The migration process is recognized to be complex, and consequently risky and costly. This process is composed of two phases: (1) the microservice-based architecture recovery phase and (2) the transformation (i.e. materialization) phase. In this paper, we propose a systematic approach to transform an object-oriented monolithic application into an MS-oriented one by applying a set of transformation patterns. To validate our approach, we automated it with our tool MonoToMicro and applied it to a set of monolithic Java applications to migrate them towards microservice-based ones.

Paper Nr: 84
Title:

Towards Automatically Generating a Personalized Code Formatting Mechanism

Authors:

Thomas Karanikiotis, Kyriakos C. Chatzidimitriou and Andreas L. Symeonidis

Abstract: Source code readability and comprehensibility are continuously gaining interest, due to the wide adoption of component-based software development and the (re)use of software residing in code hosting platforms. Consistent code styling and formatting across a project tend to improve readability, while most code formatting approaches rely on a set of rules defined by experts, that aspire to model a commonly accepted formatting. This approach is usually based on the experts’ expertise, is time-consuming and does not take into account the way a team develops software. Thus, it becomes too intrusive and, in many cases, is not adopted. In this work we present an automated mechanism, that, given a set of source code files, can be trained to recognize the formatting style used across a project and identify deviations, in a completely unsupervised manner. At first, source code is transformed into small meaningful pieces, called tokens, which are used to train the models of our mechanism, in order to predict the probability of a token being wrongly positioned. Preliminary evaluation on various axes indicates that our approach can effectively detect formatting deviations from the project’s code styling and provide actionable recommendations to the developer.

Paper Nr: 106
Title:

Neural Networks based Software Development Effort Estimation: A Systematic Mapping Study

Authors:

Fatima E. Boujida, Fatima A. Amazal and Ali Idri

Abstract: Developing an efficient model that accurately predicts the development effort of a software project is an important task in software project management. Artificial neural networks (ANNs) are promising for building predictive models, given their ability to learn from previous data, adapt, and produce more accurate results. In this paper, we conducted a systematic mapping study of papers dealing with the estimation of software development effort based on artificial neural networks. In total, 80 relevant studies published between 1993 and 2020 were identified and classified with respect to five criteria: publication source, research approach, contribution type, techniques used in combination with ANN models, and type of neural network used. The results showed that most ANN-based software development effort estimation (SDEE) studies applied the history-based evaluation (HE) and solution proposal (SP) approaches. Besides, the feedforward neural network was the most frequently used ANN type among SDEE researchers. To improve the performance of ANN models, most papers employed optimization methods such as Genetic Algorithms (GA) and Particle Swarm Optimization (PSO) in combination with ANN models.

Paper Nr: 109
Title:

A Software Framework for Context-aware Secure Intelligent Applications of Distributed Systems

Authors:

Soumoud Fkaier, Mohamed Khalgui and Georg Frey

Abstract: Future distributed reconfigurable systems need to provide smarter services. Therefore, the software used needs to include advanced mechanisms such as context-awareness, artificial intelligence, collaboration between distributed parts of the system, and secure data exchange. Most existing context-aware frameworks are restricted to a subset of these scopes and are generally not suitable for reconfigurable systems. Hence, there is a need for a software engineering solution that reconciles all of these requirements. In this paper, we propose a software framework for developing collaborative, intelligent, and secure applications of distributed systems. This paper extends an existing framework with the mentioned features and shows its new structure and design. A software tool implementing the proposed contributions was developed in the Java programming language. An example of microgrid software applications is used to show the suitability of the contributions.

Short Papers
Paper Nr: 10
Title:

A Model Driven Method to Design Educational Cyber Physical Systems

Authors:

Samia Bachir, Laurent Gallon, Philippe Aniorte and Angel Abenia

Abstract: Instructional design is a major concern in TELE (Technology Enhanced Learning Environments) research, especially since the beginning of the Covid-19 health crisis, during which emergency remote teaching has been widely used. Accordingly, the primary objective in these circumstances is not to re-create a robust educational ecosystem, but rather to provide adapted access to instructional support, learning materials, services and objects. However, designing connectedness in such environments is still required, given the emergence of the IoT (Internet of Things) and CPS (Cyber-Physical Systems) in everyday life and thus in educational environments. In this paper, we propose a model-driven engineering method for the design of Educational Cyber-Physical Systems (ECPS). Our method deals with the separation of concerns by considering a Platform Independent Model (educational aspect) and a Platform Description Model (connected aspect). This practice could then be adopted to design further environments by adapting the required models.

Paper Nr: 21
Title:

Deconstructing yield Operator to Enhance Streams Processing

Authors:

Diogo Poeira and Fernando Miguel Carvalho

Abstract: Customizing stream pipelines with new user-defined operations is a well-known pattern in streams processing. However, programming languages face two challenges when considering stream extensibility: 1) providing a compact and readable way to express new operations, and 2) keeping streams' laziness behavior. There is a consensus around the adoption of the generator operator, i.e. yield, as a means to fulfil both requirements, since most state-of-the-art programming languages provide this feature. Yet, what is the performance overhead of interleaving a yield-based operation in stream processing? In this work we present a benchmark based on realistic use cases of two different web APIs, namely Last.fm and World Weather Online, where custom yield-based operations may degrade stream performance by a factor of two. We also propose a purely functional and minimalistic design, named tinyield, that can be easily adopted in any programming language and provides a concise way of chaining extension operations fluently, with low overhead in the evaluated benchmarks. The tinyield proposal was deployed in three different libraries, namely for Java (jayield), JavaScript (tinyield4ts) and .NET (tinyield4net).
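In Python, for instance, a user-defined lazy stream operation takes exactly the compact yield form the abstract refers to. This is a generic illustration of the pattern, not tinyield's API; the operation name is invented.

```python
def take_while_increasing(source):
    """A custom stream operation expressed with yield: lazily forwards items
    while they are strictly increasing, then stops the stream."""
    prev = None
    for item in source:
        if prev is not None and item <= prev:
            return  # end of the derived stream
        yield item
        prev = item

def numbers():
    """An upstream lazy source, also a generator."""
    for n in [1, 3, 5, 4, 9]:
        yield n

result = list(take_while_increasing(numbers()))
# laziness: pulling a single item consumes only what is needed upstream
first = next(take_while_increasing(iter([2, 7, 1])))
```

The readability comes for free; the cost the paper measures is the extra generator frame interposed on every element that flows through the pipeline.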

Paper Nr: 31
Title:

On Effects of Applying Predictive Caching for State Machines

Authors:

James P. Akyüz and Tolga Ovatman

Abstract: State machines are frequently used in software development, in many different contexts, ranging from modeling control software to distributed applications that operate in cloud environments. We have implemented and experimented with basic execution-path-based predictive caching approaches for state machines, showing that, due to the limited number of paths that can be taken during a state machine run, better pre-fetching can be achieved for state machine caches. We applied our predictive approaches on top of least frequently used (LFU) and least recently used (LRU) replacement on two different state machine instances run with real-world execution traces.
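A minimal sketch (not the authors' implementation) of the idea: an LRU cache augmented with successor-frequency prefetching for state-machine traces. With a cyclic trace and a cache smaller than the state set, plain LRU would miss every access, while prefetching the historically most likely next state turns the later accesses into hits.

```python
from collections import OrderedDict, defaultdict

class PredictiveLRUCache:
    """LRU cache that also prefetches the state most often observed
    to follow the one just accessed."""
    def __init__(self, capacity, loader):
        self.capacity, self.loader = capacity, loader
        self.cache = OrderedDict()                      # insertion order = LRU order
        self.successors = defaultdict(lambda: defaultdict(int))
        self.prev = None

    def _put(self, state):
        if state in self.cache:
            self.cache.move_to_end(state)
            return
        if len(self.cache) >= self.capacity:
            self.cache.popitem(last=False)              # evict least recently used
        self.cache[state] = self.loader(state)

    def access(self, state):
        hit = state in self.cache
        self._put(state)
        if self.prev is not None:                       # learn the transition
            self.successors[self.prev][state] += 1
        if self.successors[state]:                      # prefetch likely successor
            likely = max(self.successors[state], key=self.successors[state].get)
            self._put(likely)
        self.prev = state
        return hit

cache = PredictiveLRUCache(capacity=2, loader=lambda s: f"data({s})")
trace = ["A", "B", "C", "A", "B", "C", "A", "B"]        # cyclic execution path
hits = sum(cache.access(s) for s in trace)
```

On this trace, plain LRU with capacity 2 scores zero hits; the predictive variant hits on every access after the first full cycle.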

Paper Nr: 32
Title:

Component Ensemble-based UML/MARTE Extensions for the Design of Dynamic Cyber-Physical Systems

Authors:

Nissaf Fredj, Yessine H. Kacem, Olfa Kanoun and Mohamed Abid

Abstract: Cyber-Physical Systems (CPS) are open-ended distributed systems that incorporate autonomous components interacting with clear responsibilities to fulfill a defined objective. They are subject to several issues related to the interaction between components and the dynamic behavior of their status, as well as real-time constraints. Their design requires effective modeling formalisms for functional verification and real-time analysis of CPS at early stages of development. UML-based modeling languages, and especially the Modeling and Analysis of Real-Time and Embedded systems (MARTE) profile, fail to capture the dynamics of CPS: they focus on individual components, are unable to provide QoS guarantees for dynamic groups of components, and assume static component architectures, which effectively hinders their direct application in open-ended systems. In this respect, we propose to extend the MARTE profile with the concept of ensemble semantics for the design of dynamic groupings of CPS components. The objective is to save resource consumption, such as battery life, and to allow for scalable QoS guarantees. Moreover, the present approach proposes design patterns for the specification of CPS closed-loop concerns, addressing component composition and adaptation as well as the verification of system constraints.

Paper Nr: 33
Title:

Extending the Fast Healthcare Interoperability Resources (FHIR) with Meta Resources

Authors:

Timoteus Ziminski, Steven Demurjian and Thomas Agresta

Abstract: The Fast Healthcare Interoperability Resources (FHIR) standard from the international Health Level Seven (HL7) organization has been mandated by the United States Office of the National Coordinator to promote the secure exchange of healthcare data for patients through the use of cloud-based APIs. FHIR reformulated the HL7 XML standard by defining 135+ resources that conceptualize the different aspects of healthcare data, such as patients, practitioners, organizations, services, appointments, encounters, diagnostic data, and medications. Developers of healthcare applications select the subset of resources required to solve their problems. However, the standard provides no way to effectively organize a subset of resources into a higher-level construct similar to software design patterns. This paper leverages the design pattern concept to extend the FHIR standard by defining meta resources: conceptual constructs that clearly define the involved resources and their interactions in one unified artifact. To illustrate the concepts of this paper, we use a mobile health application for medication reconciliation that integrates information from multiple electronic health records. We leverage FHIR extension mechanisms such as profiles and Bundle resources to integrate the meta resource into the resource contextualization layer of the FHIR standard.

Paper Nr: 52
Title:

Improving Vulnerability Prediction of JavaScript Functions using Process Metrics

Authors:

Tamás Viszkok, Péter Hegedűs and Rudolf Ferenc

Abstract: Due to the growing number of cyber attacks against computer systems, we need to pay special attention to the security of our software systems. Excluding the human component from this process would be a huge breakthrough in maximizing its effectiveness, and the first step towards this is to automatically recognize the vulnerable parts of our code. Researchers have put a lot of effort into creating machine learning models that can determine whether a given piece of code, or to be more precise, a selected function, contains any vulnerabilities. We aim at improving the existing models, building on previous results in predicting vulnerabilities at the level of functions in JavaScript code using well-known static source code metrics. In this work, we propose to include several so-called process metrics (e.g., code churn, number of developers modifying a file, or the age of the changed source code) in the set of features, and examine how they affect the performance of function-level JavaScript vulnerability prediction models. We can confirm that process metrics significantly improve the prediction power of such models. On average, we observed an 8.4% improvement in terms of F-measure (from 0.764 to 0.848), a 3.5% improvement in terms of precision (from 0.953 to 0.988) and a 6.3% improvement in terms of recall (from 0.697 to 0.760).

Paper Nr: 55
Title:

Reduction-assisted Fault Localization: Don’t Throw Away the By-products!

Authors:

Dániel Vince, Renáta Hodován and Ákos Kiss

Abstract: Spectrum-based fault localization (SBFL) is a popular idea for automated software debugging. SBFL techniques use information about the execution of program elements, recorded on a suite of test cases, and derive statistics from them, which are then used to determine the suspiciousness of program elements, thus guiding the debugging efforts. However, even the best techniques can face problems when the statistics are unbalanced. If only one test case causes a program failure and all other inputs execute correctly, as is typical for fuzz testing, then it may be hard to differentiate between the program elements suspiciousness-wise. In this paper, we propose to utilize test case reduction, a technique to minimize unnecessarily large test cases often generated with fuzzing, to assist SBFL in such scenarios. As the intermediate results, or by-products, of the reduction are additional test cases to the program, we use these by-products when applying SBFL. We have evaluated this idea, and our results show that it can help SBFL precision by up to 49% on a real-world use-case.
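The abstract does not commit to a particular suspiciousness formula; as background, one classic SBFL metric, Tarantula, can be computed from per-test coverage as below. The toy coverage data is invented, and this is an illustration of SBFL in general, not of the authors' reduction-assisted tooling.

```python
def tarantula(coverage, outcomes):
    """Tarantula suspiciousness.
    coverage: program element -> set of test ids that execute it.
    outcomes: test id -> True if the test passed."""
    total_pass = sum(outcomes.values())
    total_fail = len(outcomes) - total_pass
    susp = {}
    for elem, tests in coverage.items():
        p = sum(1 for t in tests if outcomes[t])        # passing tests covering elem
        f = len(tests) - p                              # failing tests covering elem
        fail_ratio = f / total_fail if total_fail else 0.0
        pass_ratio = p / total_pass if total_pass else 0.0
        susp[elem] = fail_ratio / (fail_ratio + pass_ratio) if fail_ratio + pass_ratio else 0.0
    return susp

coverage = {
    "line1": {"t1", "t2", "t3"},   # executed by every test
    "line2": {"t3"},               # executed only by the failing test
}
outcomes = {"t1": True, "t2": True, "t3": False}
scores = tarantula(coverage, outcomes)
```

With a single failing test, many elements tie at intermediate scores; the reduction by-products the paper proposes add more failing and passing runs, sharpening exactly these statistics.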

Paper Nr: 64
Title:

A Holistic Methodology for Model-based Design of Mechatronic Systems in Digitized and Connected System Environments

Authors:

Xiaobo Liu-Henke, Sven Jacobitz, Sören Scherler, Marian Göllner, Or A. Yarom and Jie Zhang

Abstract: This paper presents a holistic methodology for model-based design of mechatronic systems in digitized and connected system environments. On the one hand, this includes system structuring for the controllability of complex system relations. On the other hand, it comprises the extension of the design and safeguarding process by requirement, data and evaluation management as well as the integration of novel technologies (e.g. Driving-Simulator-in-the-Loop (DSiL) simulations) for the execution of closed-loop simulations under realistic and at the same time reproducible operation conditions. Furthermore, a low-cost rapid control prototyping development platform (LoRra) is presented, with which the presented methodology can be applied. The new holistic methodology is verified by case studies.

Paper Nr: 73
Title:

Towards the End of Agile: Owing to Common Misconceptions in the Minds of Agile Creators

Authors:

Necmettin Ozkan and Mehmet Ş. Gök

Abstract: The Agile Software Development movement emerged from practice, just as most of the work in the Agile Software Development world has evolved through practice. Thus, the creators and consultants of the Agile world may evangelize it with commercial concerns, resulting in "selling agility" to organizations as an object in the form of packaged practices (methods/models/frameworks). Owing to the "sold practices" of the market and misleading misconceptions in the minds of Agile creators, there are issues in Agile, such as regarding it as a "holy", all-encompassing product, binary thinking, trade-offs, and determinism, that do not support agility in an absolute sense and even inhibit it, and that ultimately lead to the end of AgileTM. This study discusses seven such prominent misconceptions and makes a prediction about the possible course of AgileTM and the rise of agility.

Paper Nr: 75
Title:

Seamless Integration of Hardware Interfaces in UML-based MDSE Tools

Authors:

Lars Huning, Timo Osterkamp, Marco Schaarschmidt and Elke Pulvermüller

Abstract: Model-Driven Software Engineering (MDSE) promotes the use of models for software development. One application of MDSE is the development of embedded systems, whose size and complexity are growing steadily. Usage of MDSE for embedded systems often consists of creating high-level architectures, e.g., with the Unified Modeling Language (UML), while the actual implementation of the system is done manually. One reason for this is the semantic gap between high-level UML models and the low-level programming associated with microcontrollers, i.e., imperative programming at the register level. This paper proposes an approach for the seamless integration of hardware interfaces, e.g., GPIOs or UARTs, in UML-based MDSE tools. This enables developers to create their application continuously in the MDSE tool, instead of resorting to manual programming outside the environment of the MDSE tool. For this, we present an approach that describes how object-oriented hardware abstraction layers may be seamlessly integrated in MDSE tools. Furthermore, we provide a GUI tool for hardware interfaces that enables the initial configuration of these interfaces. An automatic code generation approach may subsequently be used to generate the initialization code for the hardware interfaces of a microcontroller. We present a use case for our approach in which the software application of an embedded system is ported to several other microcontrollers from different manufacturers.
Download

Paper Nr: 76
Title:

Improved Software Product Reliability Predictions using Machine Learning

Authors:

Sanjay Joshi and Yogesh Badhe

Abstract: Reliability is one of the key attributes of software product quality. Popular software reliability prediction models are targeted at specific phases of the software product development life cycle. After studying these reliability models, the authors concluded that they have limitations in predicting software product reliability. A recent industrial survey performed by the authors identified several factors that practitioners perceive to influence reliability prediction. Subsequently, the authors conducted a set of experiments to identify the factors most influential to reliability. In this paper, the authors present a model definition approach using the most influential parameters, such as review efficiency, skill level of developers/testers, and post-delivery defects.
Download

Paper Nr: 79
Title:

A Formal Approach Combining Event-B and PDDL for Planning Problems

Authors:

Sabrine Ammar and Mohamed Tahar Bhiri

Abstract: In artificial intelligence, the goal of automatic planning is to structure actions in the form of a plan to achieve an expressed goal. The PDDL (Planning Domain Definition Language) was designed to allow the common representation of planning problems during ICAPS (International Conference on Automated Planning and Scheduling) competitions. PDDL has many verification and validation tools allowing the description, resolution and validation of planning problems. However, they can only establish the reliability of PDDL descriptions a posteriori. In this article, we recommend a rigorous approach coupling Event-B and PDDL that favors obtaining PDDL descriptions deemed correct a priori, derived from an ultimate Event-B model. The formal Event-B method allows us to obtain, by successive refinements with mathematical proofs, correct-by-construction formal models of planning problems. A refinement strategy appropriate to planning problems is then proposed. The ultimate Event-B model, correct by construction, is automatically translated into PDDL using our MDE-based Event-B2PDDL tool. The obtained PDDL description is submitted to efficient planners for the generation of correct and efficient plan-solutions.
Download

Paper Nr: 81
Title:

Software Task Importance Prediction based on Project Management Data

Authors:

Themistoklis Diamantopoulos, Christiana Galegalidou and Andreas L. Symeonidis

Abstract: With the help of project management tools and code hosting facilities, software development has been transformed into an easy-to-decentralize business. However, determining the importance of tasks within a software engineering process in order to better prioritize and act on them has always been an interesting challenge. Although several approaches on bug severity/priority prediction exist, the challenge of task importance prediction has not been sufficiently addressed in current research. Most approaches do not consider the meta-data and the temporal characteristics of the data, while they also do not take into account the ordinal characteristics of the importance/severity variable. In this work, we analyze the challenge of task importance prediction and propose a prototype methodology that extracts both textual (titles, descriptions) and meta-data (type, assignee) characteristics from tasks and employs a sliding window technique to model their time frame. After that, we evaluate three different prediction methods, a multi-class classifier, a regression algorithm, and an ordinal classification technique, in order to assess which model is the most effective for encompassing the relative ordering between different importance values. The results of our evaluation are promising, leaving room for future research.
Download
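The ordinal classification idea mentioned in the abstract can be illustrated with a minimal sketch (our own toy illustration, not the paper's code) of the cumulative binary decomposition commonly used for ordinal targets: an importance scale such as low < medium < high is recast as a chain of binary questions, which preserves the relative ordering that a plain multi-class encoding discards.

```python
# Toy sketch of cumulative (ordinal) encoding for an importance scale.
# The scale and labels are illustrative assumptions, not the paper's data.
SCALE = ["low", "medium", "high"]

def ordinal_encode(label):
    """Encode an ordinal label as cumulative binary targets:
    bit k answers 'is importance greater than SCALE[k]?'."""
    i = SCALE.index(label)
    return [1 if i > k else 0 for k in range(len(SCALE) - 1)]

def ordinal_decode(bits):
    """Map cumulative binary predictions back to a label."""
    return SCALE[sum(bits)]

print(ordinal_encode("medium"))   # [1, 0]
print(ordinal_decode([1, 1]))     # high
```

In a full pipeline, one binary classifier would be trained per threshold and their predictions combined through `ordinal_decode`.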

Paper Nr: 89
Title:

Optimizing the Usability of User Interfaces based on Multi-objective Evolutionary Algorithms

Authors:

Marwa Hentati, Abdelwaheb Trabelsi and Adel Mahfoudhi

Abstract: Solving software system problems using optimization algorithms is an intrinsic area of research whose aim is to find an optimal solution according to a set of conflicting objectives. One of the most prominent problems is optimizing software quality, such as the usability of user interfaces, following model-driven engineering (MDE). One of the main challenges of the MDE process is identifying the most usable model according to a set of desired usability aspects. Although models may be equivalent from the functional viewpoint, they may differ from non-functional perspectives and may not fulfil the same usability properties. In this context, we address this issue by combining the power of model engineering and optimization algorithms. In this study, we propose to integrate a multi-objective evolutionary algorithm at the conceptual level of the MDE process. It allows finding an optimal (or near-optimal) model from a large search space according to a set of usability aspects, while taking into account the context of use.
Download

Paper Nr: 90
Title:

Smart Techniques for Flying-probe Testing

Authors:

Andrea Calabrese, Stefano Quer and Giovanni Squillero

Abstract: In the production of printed circuit boards, in-circuit tests verify whether the electric and electronic components of the board have been correctly soldered. When the test is performed using flying probes, several probes are simultaneously moved on the board to reach and touch multiple test points. Taking into consideration the layout of the board, the characteristics of the tester, and several other physical constraints, not all movements of the probes are mutually compatible, nor can they always be performed through simple straight lines. As the cost of the test is mainly related to its length, and patching the path of one probe may create new incompatibilities with the trajectories of the other probes, one should carefully trade off the time required to find the trajectories against the time required by the probes to follow them. In this paper, we model the movements of our flying probes as a multiple and collaborative planning problem. We describe an approach for detecting invalid movements and we design a strategy to correct them with the addition of new intermediate points in the trajectory. We report the entire high-level procedure and we explore the optimizations performed in the more expensive and complex steps. We also present parallel implementations of our algorithms, relying either on multi-core CPU devices or on many-core GPU platforms, when these units may be useful to achieve greater speedups. Experimental results show the effectiveness of the proposed solution in terms of elapsed computation times.
Download

Paper Nr: 98
Title:

Towards a Neural Network based Reliability Prediction Model via Bugs and Changes

Authors:

Camelia Şerban and Andreea Vescan

Abstract: Nowadays, software systems have become larger and more complex than ever. A system failure could threaten the safety of human life. Discovering bugs as soon as possible during software development and investigating the effect of a change in the software system are two main concerns of software developers seeking to increase a system's reliability. Our approach employs a neural network to predict reliability via post-release defects and changes applied during the software development life cycle. The CK metrics are used as predictor variables, whereas the target variable is composed of both bugs and changes with different weights. This paper empirically investigates various prediction models considering different weights for the components of the target variable, using five open source projects. Two major perspectives are explored: cross-project analysis to identify the optimum weight values for bugs and changes, and cross-project analysis to discover the best training project for a selected weight. The results show that for both cross-project experiments, the best accuracy is obtained for the models with the highest weights for bugs (75% bugs and 25% changes) and that the best-fitted project to be used for training is the PDE project.
Download
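The weighted target variable described in the abstract can be sketched in a few lines; the function name, weights, and metric values below are illustrative assumptions for exposition, not the authors' code.

```python
# Sketch: composing a weighted prediction target from post-release bugs
# and applied changes, e.g. 75% bugs / 25% changes per class.
def weighted_target(bugs, changes, w_bugs=0.75, w_changes=0.25):
    """Combine per-class bug and change counts into one target value."""
    assert abs(w_bugs + w_changes - 1.0) < 1e-9  # weights must sum to 1
    return [w_bugs * b + w_changes * c for b, c in zip(bugs, changes)]

bugs    = [3, 0, 1, 7]   # post-release defects per class (toy data)
changes = [4, 2, 0, 8]   # applied changes per class (toy data)
print(weighted_target(bugs, changes))  # [3.25, 0.5, 0.75, 7.25]
```

The resulting vector would then serve as the regression target alongside the CK metrics used as predictors.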

Paper Nr: 102
Title:

Transfer Learning for Just-in-Time Design Smells Prediction using Temporal Convolutional Networks

Authors:

Pasquale Ardimento, Lerina Aversano, Mario L. Bernardi, Marta Cimitile and Martina Iammarino

Abstract: This paper investigates whether the adoption of a transfer learning approach can be effective for just-in-time design smells prediction. The approach uses a variant of Temporal Convolutional Networks to predict design smells, together with carefully selected fine-grained process and product metrics. The validation is performed on a dataset composed of three open-source systems and includes a comparison between transfer and direct learning. The hypothesis we want to verify is that the proposed transfer learning approach makes it feasible to transfer the knowledge gained on mature systems to the system of interest, so as to make reliable predictions even at the beginning of development, when the available historical data is limited. The obtained results show that, when the class imbalance is high, transfer learning provides F1-scores very close to the ones obtained by direct learning.
Download

Paper Nr: 11
Title:

Designing Operational Safety Procedures for UAV According to NATO Architecture Framework

Authors:

Wojciech Stecz and Piotr Kowaleczko

Abstract: The article presents the principles of designing unmanned aerial platforms, which belong to the group of near-real-time systems. The correct and legally compliant design of such systems requires adherence to the principles of designing operational safety procedures for UAVs in accordance with the NATO Architecture Framework (NAF). The NAF-compliant approach presented in the article enables meeting the requirements for the certification of flying systems in accordance with the DO-178 and DO-254 guidelines, which are the basic documents on the basis of which the airworthiness of the system is assessed. The article presents the most important stages of designing unmanned systems that were used in a military project. An example of a system modeling method in UML and its extension SysML is also presented.
Download

Paper Nr: 41
Title:

Detection of Security Vulnerabilities Induced by Integer Errors

Authors:

Salim Y. Kissi, Yassamine Seladji and Rabéa Ameur-Boulifa

Abstract: The computing platforms used to execute software programs, e.g., storage devices, compilers, and operating systems, sometimes make those programs misbehave; this type of issue can be exploited by attackers to access sensitive data and compromise the system. This paper presents an automatable approach for detecting such security vulnerabilities due to an improper execution environment. Specifically, the advocated approach targets the detection of security vulnerabilities in software caused by memory overflows, such as integer overflow. Based on an analysis of the source code and using a knowledge base gathering common execution platform issues and known restrictions, the paper proposes a framework able to infer the required assertions, without manual code annotations or rewriting, for generating logical formulas that can reveal potential code weaknesses.
Download
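To make the integer-error class of vulnerability concrete, here is a minimal sketch (our own illustration, not the paper's framework) of the kind of overflow predicate and assertion such a tool could infer for 32-bit signed addition:

```python
# Hypothetical sketch: deriving an overflow check for 32-bit signed
# addition from platform limits. Names and formula format are our own.
INT32_MIN, INT32_MAX = -2**31, 2**31 - 1

def add_overflows_int32(a: int, b: int) -> bool:
    """True if a + b would overflow a 32-bit signed integer."""
    return not (INT32_MIN <= a + b <= INT32_MAX)

def overflow_assertion(var_a: str, var_b: str) -> str:
    """Emit the logical formula a verifier could check before 'a + b'."""
    return f"assert {INT32_MIN} <= {var_a} + {var_b} <= {INT32_MAX}"

print(add_overflows_int32(INT32_MAX, 1))   # True: would wrap on a 32-bit int
print(overflow_assertion("x", "y"))
```

A real framework would generate such formulas per operation and platform, driven by its knowledge base of type widths and restrictions.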

Paper Nr: 56
Title:

Requirement Engineering in Startups

Authors:

Shatadru Shikta, Sowvik K. Das, Somania N. Mahal, H. M. Shahriyar, Kazi B. Al Jannat and Mahady Hasan

Abstract: Startup companies are usually negligent when it comes to formal requirement engineering or the proper collection of requirements for their projects, which often leads to their early demise before gaining enough traction in the market. This paper explores the reasons why startups fail with respect to requirement engineering, gathers experiences from the industry, and proposes a framework that can help startups adopt a feasible and cost-effective method for implementing formal requirement engineering processes, to ensure a long and successful tenure in the software industry.
Download

Paper Nr: 77
Title:

Modern Code Reviews: Preliminary Results of an Analysis of the State of the Art with Respect to the Role Played by Human Factors

Authors:

Aygul Malikova and Giancarlo Succi

Abstract: Modern code review has been shown to be an effective mechanism to identify bugs in code; however, given its intrinsic subjectivity, it can be significantly affected by human factors such as interpersonal relationships. This paper focuses on exploring such issues, with specific attention to social interactions and personal factors. Future work includes experimental evaluations to verify the research hypotheses related to improving the quality of the process under study.
Download

Paper Nr: 95
Title:

An Analysis of Gamification Elements for a Solving Proposal of Software Process Improvement Problems

Authors:

Elziane M. Soares and Sandro B. Oliveira

Abstract: As seen in the specialized literature, many cases of failure occur during the implementation of a Software Process Improvement (SPI) program, recurrently caused by problems and difficulties in SPI. In view of this, the need to adopt strategies and approaches to support the implementation of such initiatives is noticeable. Thus, the use of gamification in this context can allow us to define mechanisms that drive people's motivation and commitment to the development of tasks, in order to stimulate and accelerate the acceptance of an SPI initiative. In this context, this work aims to present strategies for using the gamification elements present in the Octalysis Framework to treat the problems and difficulties evidenced. The strategies developed should be seen as possible solutions that organizations can use to assist them when they encounter situations in which SPI problems occur.
Download

Paper Nr: 104
Title:

Meeting Digital Preservation Requirements for Software through an Emulation Strategy

Authors:

Christophe Ponsard

Abstract: Digital preservation aims at ensuring that digital artefacts remain accessible and usable. This includes preserving application software for the resulting experience, but also the software required to provide access to valuable digital artefacts. This paper surveys different strategies for preserving such software, with a focus on the use of emulation, which is gaining momentum over the more traditional migration approach. We highlight some requirements to consider when selecting emulators. We illustrate the process on the preservation of software for micro-computers of the 1980s. We also discuss how to design software architectures for the long-term preservation of the emulators themselves.
Download

Paper Nr: 107
Title:

IoT Fuzzing using AGAPIA and the River Framework

Authors:

Eduard Stăniloiu, Rares Cristea and Bogdan Ghimis

Abstract: As the number of Internet of Things (IoT) systems continues to grow, so does the security risk imposed by interconnecting heterogeneous devices from different vendors. Testing and validating the security of IoT systems is difficult, especially due to the fact that most of the software is proprietary (closed-source) and the system’s embedded nature makes it hard to collect data, such as memory corruptions. This paper proposes to extend the novel AGAPIA language to enable IoT developers to write safer programs that can be tested and validated with state-of-the-art fuzzers, such as RiverIoT. We present how simple additions can enable AGAPIA modules to be integrated with the RiverIoT architecture, thus facilitating better device testing. The proposed approach also enables users, not just developers, to perform system-wide, black-box testing, increasing the reliability of the system. We show how the abstractions provided by the AGAPIA language enable the fast development of an Air Quality Monitoring application and how small additions to existing programming languages can improve the testing and validation of IoT systems.
Download

Paper Nr: 112
Title:

A Pragmatic View on Resolving Conflicts in Goal-oriented Requirements Engineering for Socio-technical Systems

Authors:

Ishaya Gambo and Kuldar Taveter

Abstract: Requirements engineering is of critical importance to the success of the many software development projects that involve multiple stakeholders to deliver high-quality software-intensive systems. In goal-oriented requirements engineering (GORE), the stakeholders' statements concerning the desired system are expressed as goals to be achieved by the system. In socio-technical systems (STS), the goals are achieved by man-made agents within the software-to-be cooperating with human agents. However, as stakeholders often subjectively chase mismatching goals, identifying and resolving conflicts in requirements becomes an inevitable part of GORE. This paper outlines the urgent need for, and the processes required to, investigate conflicts in the agile agent-oriented modeling (AAOM) methodology for engineering STS. We present a pragmatic view of our proposed strategy in a framework from a deductive and qualitative research perspective. The proposed strategy can attach stakeholders' corresponding roles to the goals of the hierarchical goal model, which naturally brings out the stakeholders' needs and intentions. Additionally, it can relate the goal models to the most popular artifacts of agile software engineering. Thus, our pragmatic view builds upon well-established STS research, especially in utilizing the AAOM methodology.
Download

Area 3 - Software Systems and Applications

Full Papers
Paper Nr: 14
Title:

Linked Data as Stigmergic Medium for Decentralized Coordination

Authors:

Torsten Spieldenner and Melvin Chelli

Abstract: Algorithms inspired by nature have gained focus in research as a solution to classic coordination and optimization problems. A certain type of these algorithms employs principles of stigmergy: in stigmergic systems, coordination arises from agents leaving traces of their actions in the environment, or medium, that they work on. Other agents instinctively adapt their behavior based on the traces, by which, in the end, the fulfillment of a higher goal emerges from elementary actions of many, rather than thorough planning of complex actions of a few. Despite the perceivable uptake of stigmergic algorithms for coordination in various domains, a common clear understanding of a suitable digital stigmergic medium is lacking. It should however be assumed that a well-defined, properly modelled, and technically sound digital medium provides a crucial basis for correct, efficient and transferable stigmergic algorithms. In this paper, we motivate read-write Linked Data as generic medium for decentralized stigmergic coordination algorithms. We show how Linked Data fulfills a set of core requirements that we derived for stigmergic media from relevant literature, provide an application example from the domain of digital manufacturing, and finally provide a working example algorithm for stigmergic decentralized coordination.
Download

Paper Nr: 42
Title:

Context-aware and Ontology-based Recommender System for e-Tourism

Authors:

Gustavo Castellanos, Yudith Cardinale and Philippe Roose

Abstract: Frequently, travelers try to collect information when planning a trip or when at the destination. Usually, tourists depend on places’ reviews to make a choice, but this implies prior knowledge of the touristic places and an explicit search for suggestions through interaction with applications (i.e., the PULL paradigm). In contrast, a PUSH approach, in which the application proactively triggers a recommendation process according to users’ preferences and when necessary, seems to be a more reasonable solution. Recommender systems have become appropriate applications to help tourists in their trip planning. However, they still have limitations, such as poor consideration of users’ profiles and their contexts, predictable suggestions, and the lack of a standard representation of the knowledge managed. We propose a user-centric recommender system architecture, supporting both PULL and PUSH approaches, assisted by an ontology-based spreading activation algorithm for context-aware recommendations, with a focus on decreasing predictable outputs and increasing serendipity, based on an aging-like approach. To demonstrate its suitability and performance, we develop a first prototype of the architecture and simulate different scenarios, varying users’ profiles, preferences, and context parameters. Results show that the ontology-based spreading activation and the proposed aging system provide relevant and varied recommendations according to users’ preferences, while considering their context and improving the serendipity of the system when compared with a state-of-the-art work.
Download
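The spreading activation idea at the heart of such recommenders can be sketched in a few lines (our own toy illustration, not the paper's algorithm): activation starts at concepts the user likes and decays as it spreads along ontology relations, surfacing related but non-obvious items.

```python
# Toy spreading activation over a concept graph. The graph, decay factor,
# and concept names are illustrative assumptions.
def spread(graph, seeds, decay=0.5, rounds=2):
    """Propagate activation from seed concepts along graph edges,
    attenuating by `decay` at each hop."""
    activation = dict(seeds)                 # concept -> activation level
    for _ in range(rounds):
        nxt = dict(activation)
        for node, value in activation.items():
            for neigh in graph.get(node, []):
                nxt[neigh] = nxt.get(neigh, 0.0) + value * decay
        activation = nxt
    return activation

graph = {"beach": ["surfing", "seafood"], "seafood": ["restaurant"]}
scores = spread(graph, {"beach": 1.0})
print(sorted(scores, key=scores.get, reverse=True))
```

An aging mechanism like the one the paper proposes could then periodically dampen the scores of repeatedly recommended concepts to improve serendipity.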

Paper Nr: 51
Title:

Object Parsing Grammars with Composition

Authors:

Stefan Sobernig

Abstract: An Object Parsing-Expression Grammar (OPEG) is an extension of parsing expression grammars (PEG) including generator expressions to directly produce object graphs from parsed text. This avoids typical abstraction mismatches of intermediate parse representations (e.g., decomposition mismatches). To develop language families via extension, unification, and extension compositions, OPEGs can be composed—without preplanning and with unmodified reuse. Composability is established by supporting both forming basic grammar unions and performing grammar transformations between two or more OPEGs (e.g., rule extraction, symbol rewriting). These transformation operators assist developers in mitigating the consequences of the non-disjointness under composition of parsing expressions (e.g., language hiding). An implementation of OPEGs is available as part of the multi-DSL development system DjDSL.
Download

Paper Nr: 78
Title:

AH-CID: A Tool to Automatically Detect Human-Centric Issues in App Reviews

Authors:

Collins Mathews, Kenny Ye, Jake Grozdanovski, Marcus Marinelli, Kai Zhong, Hourieh Khalajzadeh, Humphrey Obie and John Grundy

Abstract: In modern software development, there is a growing emphasis on creating and designing around the end-user. This has sparked the widespread adoption of human-centred design and agile development. These concepts intersect during the user feedback stage in agile development, where user requirements are re-evaluated and utilised towards the next iteration of development. An issue arises when the amount of user feedback far exceeds the team’s capacity to extract meaningful data. As a result, many critical concerns and issues may fall through the cracks and remain unnoticed, or the team must spend a great deal of time in analysing the data that can be better spent elsewhere. In this paper, a tool is presented that analyses a large number of user reviews from 24 mobile apps. These are used to train a machine learning (ML) model to automatically generate the probability of the existence of human-centric issues, to automate and streamline the user feedback review analysis process. Evaluation shows an improved ability to find human-centric issues of the users.
Download

Paper Nr: 86
Title:

Robust and Hybrid Crypto-watermarking Approach for 3D Multiresolution Meshes Security

Authors:

Ikbel Sayahi and Chokri Ben Amar

Abstract: Since the release of the first 3D watermarking algorithm, several approaches have grown up, with a diversity of techniques used to embed information into meshes. The main objective is always to secure data shared by remote users. The originality of the present work comes from combining encryption and a hybrid watermarking algorithm to secure 3D multiresolution meshes. The new crypto-watermarking system is composed of three parts. The first part, called watermark preparation, aims to prepare the data to be inserted: the logo (which carries the copyright information) is encrypted using the RSA (Rivest, Shamir, Adleman) algorithm, and a convolutional encoder is then applied to the encrypted logo, already transformed into a binary sequence. The second part, called mesh preparation, consists of decomposing the 3D multiresolution mesh by applying a wavelet transform to generate the wavelet coefficient vector. Finally, the third part of our algorithm, called hybrid watermarking, inserts the encrypted logo and the RSA keys into both the multiresolution and spatial representations of the mesh. The encrypted logo is inserted into the resulting wavelet coefficients after transformation to a spherical coordinate system, modulation, and demodulation. The RSA key is inserted into the mesh resulting from the first watermarking round by modifying the geometric information of vertices. The results prove that we are able to insert a high amount of data without affecting the mesh quality. Applying the most popular attacks does not prevent a correct extraction of the inserted data, which is explained by the use of RSA to encode the watermark and of the convolutional error-correcting code to retrieve corrupted information. Our algorithm is therefore robust against these attacks.
Download
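The watermark-preparation step, encrypting a binary logo with RSA before embedding, can be illustrated with a textbook-RSA toy (our own sketch, not the authors' implementation; real systems use large keys and padding):

```python
# Toy illustration: encrypt a binary "logo" with textbook RSA before
# embedding it as a watermark. Tiny primes, demo only -- insecure.
def rsa_keys():
    p, q, e = 61, 53, 17                # classic textbook parameters
    n, phi = p * q, (p - 1) * (q - 1)
    d = pow(e, -1, phi)                 # modular inverse (Python 3.8+)
    return (e, n), (d, n)

def crypt(values, key):
    """Apply RSA modular exponentiation to each value."""
    k, n = key
    return [pow(m, k, n) for m in values]

pub, priv = rsa_keys()
logo_bits = [1, 0, 1, 1, 0, 0, 1]       # copyright logo as a bit string
# offset bits by 2 so 0 and 1 (fixed points of m^e) encrypt distinctly
cipher = crypt([b + 2 for b in logo_bits], pub)
recovered = [c - 2 for c in crypt(cipher, priv)]
print(recovered == logo_bits)           # True
```

In the paper's pipeline, the ciphertext bits would additionally pass through a convolutional encoder before insertion into the wavelet coefficients.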

Paper Nr: 114
Title:

WLNI-LPA: Detecting Overlapping Communities in Attributed Networks based on Label Propagation Process

Authors:

Imen B. El Kouni, Wafa Karoui and Lotfi Ben Romdhane

Abstract: Several networks are enriched by two types of information: the network topology and the attribute information about each node. Such graphs are typically called attributed networks, where the attributes are as important as the topological structure. In these attributed networks, community detection is a critical task that aims to discover groups of similar users. However, the majority of existing community detection methods for attributed networks were created to identify separate, non-overlapping groups. Detecting overlapping communities using a combination of node attributes and topological structure therefore remains challenging. In this paper, we propose an algorithm, called WLNI-LPA, based on label propagation for detecting an efficient community structure in attributed networks. WLNI-LPA is an extension of LPA that combines node importance, attribute information, and topological structure to improve the quality of the graph partition. In the experiments, we validate the performance of our method on synthetic weighted networks. A part of the experiments also focuses on the impact of detecting significantly overlapping communities in a recommender system to improve the quality of recommendations.
Download
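The basic mechanism that WLNI-LPA extends, label propagation with attribute-aware edge weights, can be sketched as follows (a simplified illustration of the general LPA family, not the WLNI-LPA implementation; the graph, attributes, and weighting are our assumptions):

```python
# Simplified attribute-weighted label propagation: each node adopts the
# label with the largest accumulated neighbor weight, where the weight
# mixes topology (edge presence) and attribute similarity.
from collections import defaultdict

def similarity(a, b):
    """Jaccard similarity between two attribute sets."""
    return len(a & b) / len(a | b) if a | b else 0.0

def propagate(adj, attrs, iters=10):
    labels = {v: v for v in adj}     # each node starts with its own label
    for _ in range(iters):
        changed = False
        for v in adj:
            score = defaultdict(float)
            for u in adj[v]:
                score[labels[u]] += 1.0 + similarity(attrs[v], attrs[u])
            best = max(score, key=score.get)
            if labels[v] != best:
                labels[v], changed = best, True
        if not changed:              # stop once labels are stable
            break
    return labels

# two attribute-homogeneous groups bridged by the edge 2-3
adj = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2, 4, 5], 4: [3, 5], 5: [3, 4]}
attrs = {0: {"a"}, 1: {"a"}, 2: {"a"}, 3: {"b"}, 4: {"b"}, 5: {"b"}}
labels = propagate(adj, attrs)
print(labels)
```

WLNI-LPA additionally weights nodes by importance and supports overlapping memberships, which this single-label sketch does not.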

Paper Nr: 115
Title:

Mixed Software/Hardware based Neural Network Learning Acceleration

Authors:

Abdelkader Ghis, Kamel Smiri and Abderezzak Jemai

Abstract: Neural network learning is a persistent optimization problem due to its complexity and scale, which increase considerably with system size. Different optimization approaches have been introduced to overcome the memory occupation and time consumption of neural network learning. However, with the development of modern systems that are increasingly scalable and complex in nature, the existing solutions in the literature need to be updated to respond to recent changes in modern systems. For this reason, we propose a mixed software/hardware optimization approach for neural network learning acceleration. The proposed approach combines software improvement and hardware distribution, where data are partitioned in a way that avoids the problem of local convergence. The weights are updated in a manner that overcomes the latency problem, and the learning process is distributed over multiple processing units to minimize time consumption. The proposed approach is tested and validated by exhaustive simulations.
Download

Paper Nr: 118
Title:

A Methodology for Integrated Process and Data Mining and Analysis towards Evidence-based Process Improvement

Authors:

Andrea Delgado, Daniel Calegari, Adriana Marotta, Laura González and Libertad Tansini

Abstract: The socio-technical system supporting an organization’s daily operations is becoming more complex, with distributed infrastructures integrating heterogeneous technologies enacting business processes and connecting devices, people, and data. This situation produces large amounts of data in heterogeneous sources, both from the organization's business processes and its organizational data. Obtaining valuable information and knowledge from this data to make evidence-based improvements is a challenge. Process mining and data mining techniques are very well known and have been widely used for decades. However, although there are a few methodologies to guide mining efforts, there are still elements that have to be defined and carried out project by project, without much guidance. In previous works, we have presented the PRICED framework, which defines a general strategy supporting mining efforts to provide organizations with evidence-based business intelligence. In this paper, we refine such ideas by presenting a concrete methodology. It defines the phases, disciplines, activities, roles, and artifacts needed to provide guidance and support to navigate from getting the execution data, through its integration and quality assessment, to mining and analyzing it to find improvement opportunities.
Download

Short Papers
Paper Nr: 8
Title:

Do the Scaled Agile Practices from S@S Help with Quality Requirements Challenges and If So, How Do They Do It?

Authors:

Wasim Alsaqaf, Maya Daneva and Roel Wieringa

Abstract: Quality Requirements (QRs) pose challenges in many agile large-scale distributed enterprise systems. Often, enterprises counter such challenges by borrowing some heavyweight practices, e.g., adding more documentation. At the same time, agile methodologists have proposed several scaled agile frameworks to specifically serve agile enterprises working on large and distributed systems. Little is known about the extent to which the proposed scaled frameworks address QRs and the specific ways in which this happens. Moreover, do these frameworks approach QRs challenges in ways consistent with the Agile Manifesto? This paper treats these questions by analyzing one well-documented scaled framework, namely Scrum@Scale. We evaluated the alignment of Scrum@Scale with the Agile Manifesto by means of the 4-Dimensional Analytical Tool proposed by other researchers. We then analyzed the practices of Scrum@Scale from the perspective of practitioners responsible for the QRs in a project, in order to understand how the Scrum@Scale practices mitigate the QRs challenges reported in previous work. Our analysis indicated that Scrum@Scale supports the agile values defined by the Agile Manifesto. In addition, we identified 12 Scrum@Scale practices that could (partially) mitigate one or more of the reported QRs challenges. Four of the reported QRs challenges have no remedy offered by Scrum@Scale.
Download

Paper Nr: 9
Title:

A Multi-modal Visual Emotion Recognition Method to Instantiate an Ontology

Authors:

Juan A. Heredia, Yudith Cardinale, Irvin Dongo and Jose Díaz-Amado

Abstract: Human emotion recognition from visual expressions is an important research area in computer vision and machine learning owing to its significant scientific and commercial potential. Since visual expressions can be captured from different modalities (e.g., face expressions, body posture, hands pose), multi-modal methods are becoming popular for analyzing human reactions. In contexts in which human emotion detection is performed to associate emotions to certain events or objects to support decision making or for further analysis, it is useful to keep this information in semantic repositories, which offers a wide range of possibilities for implementing smart applications. We propose a multi-modal method for human emotion recognition and an ontology-based approach to store the classification results in EMONTO, an extensible ontology to model emotions. The multi-modal method analyzes facial expressions, body gestures, and features from the body and the environment to determine an emotional state; it processes each modality with a specialized deep learning model and applies a fusion method. Our fusion method, called EmbraceNet+, consists of a branched architecture that integrates the EmbraceNet fusion method with other ones. We experimentally evaluate our multi-modal method on an adaptation of the EMOTIC dataset. Results show that our method outperforms the single-modal methods.
Download

Paper Nr: 28
Title:

A Machine Learning Approach for NDVI Forecasting based on Sentinel-2 Data

Authors:

Stefano Cavalli, Gabriele Penzotti, Michele Amoretti and Stefano Caselli

Abstract: The Normalized Difference Vegetation Index (NDVI) is a well-known indicator of the greenness of the biomes. NDVI data are typically derived from satellites (such as Landsat, Sentinel-2, SPOT, Pléiades) that provide images in visible red and near-infrared bands. However, there are two main complications in satellite image acquisition: 1) orbits take several days to be completed, which implies that NDVI data are not updated daily; 2) the usability of satellite images to compute the NDVI value of a given area depends on the local meteorological conditions during satellite transit. Indeed, the discontinuous availability of up-to-date NDVI data is detrimental to the usability of NDVI as an indicator supporting agricultural decisions, e.g., whether to irrigate crops or not, as well as for alerting purposes. In this work, we propose a multivariate multi-step NDVI forecasting method based on Long Short-Term Memory (LSTM) networks. By careful selection of publicly available but relevant input data, the proposed method is able to predict NDVI values with high accuracy for the next 1, 2 and 3 days, considering regional data of interest.
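The multivariate multi-step framing the abstract describes can be illustrated by the sliding-window preparation an LSTM would consume. This is a hedged sketch, not the authors' code: the feature layout, window lengths, and toy values are ours.

```python
# Illustrative only: turn a multivariate daily series into supervised (X, y)
# pairs for multi-step-ahead forecasting of one target feature (here, NDVI).
def make_windows(series, n_in, n_out, target_idx=0):
    """series: list of observations, each a list of features
    (e.g., [NDVI, temperature, rainfall]). X holds n_in past observations;
    y holds the next n_out values of the target feature."""
    X, y = [], []
    for t in range(len(series) - n_in - n_out + 1):
        X.append(series[t:t + n_in])
        y.append([row[target_idx] for row in series[t + n_in:t + n_in + n_out]])
    return X, y

# Toy 3-feature daily series: [NDVI, temperature, rainfall]
data = [[0.30, 18, 0], [0.32, 19, 2], [0.35, 20, 0],
        [0.36, 21, 1], [0.38, 22, 0], [0.40, 23, 0]]
X, y = make_windows(data, n_in=3, n_out=2)
print(len(X), y[0])  # 2 samples; first target: next-2-day NDVI [0.36, 0.38]
```

Each X[i] would feed the LSTM as a (time steps x features) input, with y[i] as the 2-day-ahead NDVI target.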
Download

Paper Nr: 47
Title:

A Resizable C++ Container using Virtual Memory

Authors:

Blaž Rojc and Matjaž Depolli

Abstract: Thread safety is required for shared data structures in shared-memory parallel approaches, but cannot be provided efficiently for standard C++ containers with contiguous memory storage, such as std::vector. Dynamically resizing such a container may cause pointer and reference invalidation and therefore cannot be done in parallel environments without exclusive access protection to the container. We present a thread-safe no-copy resizable C++ container class that can be used to store shared data among threads of a program on a shared-memory system. The container relies on the virtual memory controller to handle allocation as needed during execution. A block of memory of almost arbitrary size can be allocated, which is only mapped to physical memory during the first access, providing hardware-level thread blocking. All synchronization costs are already included in the operating system memory management, so using the container in a parallel environment incurs no additional costs. As a bonus, references and pointers to elements of the container work as expected even after the container is resized. The implementation is not general, however, as it relies on the specifics of the operating system and computer architecture. Memory overhead can be high, as the allocations are bound to the granularity of the virtual memory system.
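The core mechanism, reserving a large range of address space whose pages the OS commits lazily on first touch, can be demonstrated outside C++ too. A minimal Python sketch (not the paper's container; sizes and element layout are our assumptions):

```python
# Sketch of the idea: reserve a large anonymous mapping up front; pages are
# mapped to physical memory only on first access, so "growing" never moves
# existing elements and references to them stay valid.
import mmap
import struct

RESERVE = 1 << 28          # reserve 256 MiB of address space up front
ITEM = struct.Struct("q")  # one 8-byte signed integer per element

buf = mmap.mmap(-1, RESERVE)   # anonymous mapping; committed on demand

def store(i, value):
    ITEM.pack_into(buf, i * ITEM.size, value)

def load(i):
    return ITEM.unpack_from(buf, i * ITEM.size)[0]

store(0, 42)
store(1_000_000, 7)   # touches a far-away page; no copy or realloc occurs
print(load(0), load(1_000_000))
```

In the C++ container the same reservation would be done with OS calls such as mmap/VirtualAlloc, which is why the paper notes the implementation depends on the operating system.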
Download

Paper Nr: 59
Title:

A Specific Language for Developing Business Process by Refinement based on BPMN 2.0

Authors:

Salma Ayari, Yousra B. Hlaoui and Leila Ben Ayed

Abstract: This paper deals with the development by refinement of BPMN (Business Process Model and Notation) models. Indeed, Business Process (BP) development based on step-wise refinement (i) facilitates the understanding of complex BPs and (ii) specifies all the BP semantics that the BPMN model has to describe by focusing on the smallest details of the BP, since details are added gradually to the model under development. Hence, we propose an approach assisting a business process developer to gradually build his/her BPMN model, ensuring automatic syntax and semantic property checking. The approach allows, on the one hand, a BPMN syntax-driven refinement based on a BPMN context-free grammar and, on the other hand, a formal verification of semantic properties. To be validated, the proposed approach is illustrated through the development of an online flight booking BP.
Download

Paper Nr: 67
Title:

GUIMETRICS: An Extensible Cloud-based Application for Automatic Computation of GUI Visual Design Measures

Authors:

Nicolas Burny and Jean Vanderdonckt

Abstract: The visual quality of graphical user interfaces can be estimated by software measurement, which consists of computing visual design measures on a dataset of interfaces and interpreting them to improve their overall quality. When performed manually, this process becomes very tedious and error-prone, especially for large datasets. When performed with existing software, this process is accelerated, but tied to a particular set of measures with their own interpretation, making it inflexible. To overcome these shortcomings, GUIMETRICS improves this process by automatically collecting screenshots in various platform configurations and resolutions and automatically computing and interpreting measures on-demand. The cloud-based architecture of GUIMETRICS can be extended with external modules for computing any visual measure, even in different programming languages, thus making it more flexible.
Download

Paper Nr: 69
Title:

Decentralized Application for Rating Internet Resources

Authors:

Andreea Buţerchi and Andrei Arusoaie

Abstract: A lot of existing web applications use systems for rating internet resources (e.g., YouTube uses like/dislike-based rating, IMDb uses star-based rating). Given the received ratings, the most popular internet resources can generate large amounts of money from advertising. One issue arising from this is that the existing rating systems are entirely controlled by a single entity (i.e., the owner of the web application). In this paper, we present a blockchain-based decentralized application for rating internet resources. The proposed solution provides a transparent rating mechanism, given that no central authority is involved and the rating operations are handled by a specialized smart contract. We present an implementation of our solution, where we combine authentication methods with blockchain-specific features, so that the users’ anonymity is preserved. We show that our approach is better than the existing rating systems in terms of transparency and reliability.
Download

Paper Nr: 87
Title:

Towards an Approach for Translation Validation of Thread-level Parallelizing Transformations using Colored Petri Nets

Authors:

Rakshit Mittal, Rochishnu Banerjee, Dominique Blouin and Soumyadip Bandyopadhyay

Abstract: Software applications often require the transformation of an input source program into a translated one for optimization. In this process, preserving the semantics across the transformation, known as equivalence checking, is essential. In this paper, we present ongoing work on a novel translation validation technique for handling loop transformations such as loop swapping and distribution, which cannot be handled by state-of-the-art equivalence checkers. The method makes use of a reduced-size Petri net model integrating SMT solvers for validating arithmetic transformations. The approach is illustrated with two simple programs and further validated with a benchmark of programs.
Download

Paper Nr: 97
Title:

A Novel Recommender System based on Two-level Friendship Ties within Social Learning

Authors:

Sonia Souabi, Asmaâ Retbi, Mohammed K. Idrissi and Samir Bennani

Abstract: Nowadays, social networks are starting to emerge as a huge part of e-learning. Indeed, learners are more attracted to social learning environments that foster collaboration and interaction among learners. To enable learners to handle their time and energy more effectively, recommendation systems tend to address these issues and provide learners with a set of recommendations appropriate to their needs and requirements. To this end, we propose a recommendation system based, on the one hand, on the correlation and co-occurrence between the activities performed by the learners and, on the other hand, on community detection using two-level friendship ties. The idea is to detect communities based on friends and friends of friends, and then generate recommendations for each community detected. We test our approach on a database of 3000 interactions, and it turns out that the two-level recommendation system based on friendships reaches high accuracy and outperforms the recommendation system based on one-level friendship ties in terms of precision as well as accuracy. Expanding the detected communities to generate new communities thus leads to more relevant and reliable results.
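The two-level idea can be sketched concretely: a learner's community is their friends plus friends of friends, and recommendations are the activities most frequent within that community. This is our own hedged illustration; the graph, the activity log, and the co-occurrence scoring in the paper are richer than this.

```python
# Invented toy data: friendship graph and per-learner activity log.
from collections import Counter

friends = {"ana": {"bob", "cem"}, "bob": {"ana", "dia"},
           "cem": {"ana"}, "dia": {"bob", "eli"}, "eli": {"dia"}}
activity = {"ana": ["quiz"], "bob": ["forum", "quiz"],
            "cem": ["wiki"], "dia": ["forum"], "eli": ["video"]}

def community(user, hops=2):
    """Friends (hop 1) and friends-of-friends (hop 2) of `user`."""
    seen, frontier = {user}, {user}
    for _ in range(hops):
        frontier = {f for u in frontier for f in friends.get(u, ())} - seen
        seen |= frontier
    return seen - {user}

def recommend(user, k=2):
    """Most frequent activities within the user's two-level community."""
    counts = Counter(a for m in community(user) for a in activity[m])
    return [item for item, _ in counts.most_common(k)]

print(sorted(community("ana")))   # ['bob', 'cem', 'dia']
print(recommend("ana")[0])        # 'forum' (most frequent in the community)
```

Restricting `hops=1` reproduces the one-level baseline the paper compares against.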
Download

Paper Nr: 108
Title:

Extending DEAP with Active Sampling for Evolutionary Supervised Learning

Authors:

Sana Ben Hamida and Ghita Benjelloun

Abstract: Complexity, variety and large sizes of databases make knowledge extraction a difficult task for supervised machine learning techniques. It is important to provide these techniques with additional tools to improve their efficiency when dealing with such data. A promising strategy is to reduce the size of the training sample seen by the learner and to change it regularly along the learning process. Such a strategy, known as active learning, is suitable for iterative learning algorithms such as Evolutionary Algorithms. This paper presents some sampling techniques for active learning and how they can be applied in a hierarchical way. Then, it details how these techniques could be implemented in DEAP, a Python framework for Evolutionary Algorithms. A comparative study demonstrates how active learning improves evolutionary learning on two databases, for detecting pulsars and occupancy in buildings.
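The core loop, training on a small subset that is refreshed periodically, can be sketched without DEAP's actual API (which we do not reproduce here); the function and parameter names below are ours.

```python
# Minimal sketch of dynamic active sampling for an iterative learner:
# the fitness of each generation is evaluated only on `sample`, which is
# redrawn from the full training set every `period` generations.
import random

def evolve(train_set, generations=9, sample_size=3, period=3, seed=1):
    rng = random.Random(seed)
    history = []
    sample = None
    for gen in range(generations):
        if gen % period == 0:            # refresh the training subset
            sample = rng.sample(train_set, sample_size)
        history.append(list(sample))     # here: evaluate population on sample
        # ... selection / crossover / mutation would go here ...
    return history

hist = evolve(list(range(20)))
print(len(hist), hist[0] == hist[1])
```

In DEAP this refresh would typically be hooked into the generational loop before fitness evaluation; the hierarchical variants the paper describes would change how `sample` is drawn, not where.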
Download

Paper Nr: 12
Title:

Guided Bee Colony Algorithm Applied to the Daily Car Pooling Problem

Authors:

Mouna Bouzid, Ines Alaya and Moncef Tagina

Abstract: The foraging behavior of bees has been adapted in the Bee Colony Optimization (BCO) algorithm. This approach is a simple and efficient metaheuristic that has been successfully used to solve many complex optimization problems in different domains, mostly in the transportation, location and scheduling fields. In this study, we develop two algorithms for the Daily Car Pooling Problem based on the BCO approach. The developed algorithms are experimentally tested on benchmark instances of different sizes. The computational results show that the proposed approaches can produce good solutions when compared with an exact method.
Download

Paper Nr: 66
Title:

Designing and Implementing Software Systems using User-defined Design Patterns

Authors:

Mert Ozkaya and Mehmet A. Kose

Abstract: Software design patterns are the design-level solutions for the commonly occurring problems in software development. Design patterns are applied in many industries where problems repeat with slight changes, and applying the same proven, high-quality solution reduces development time and maximises software re-use. DesPat is a modeling toolset that offers a modeling notation set based on UML’s class diagram for the users to design their software systems using 6 well-known design patterns proposed by Gamma et al. (abstract factory, singleton, composite, observer, visitor, and facade). DesPat also supports the combinations of different pattern models for any software system, analysis of the pattern-centric models, and their automated generation into Java skeleton code. In this paper, we extend DesPat with a new toolset that enables users to define their own patterns. A pattern is defined with the types of components, component interfaces, and relationships (i.e., generalisation, dependency, realisation, and composition). Any pattern definitions can then be imported into the DesPat modeling toolset, through which one may specify software design models in accordance with the pattern definitions, check the models against the pattern rules, and transform their models into Java. We illustrate our extension with the gas station case study.
Download

Paper Nr: 83
Title:

Microservices Adaptation using Machine Learning: A Systematic Mapping Study

Authors:

Anouar Hilali, Hatim Hafiddi and Zineb El Akkaoui

Abstract: The Microservice architecture is increasingly becoming the preferred architecture of modern applications. The logically distinct components that make up microservices make continuous delivery easier compared to monolithic architectures. This feature, however, makes it difficult for engineers to control the underlying services and properly adapt them at run-time. Designing our microservices as self-adaptive systems helps us tackle this issue. Each microservice can then dynamically monitor and adapt its behavior to change certain aspects of itself to achieve self-adaptive goals. The use of statistical and Machine Learning (ML) techniques helps in this area in many ways (e.g., predicting resource usage, anomaly detection, etc.). This paper aims to provide a state of the art of the use of ML in microservice adaptation; the main goal is to provide an overview of the field and identify the most frequent adaptation goals and the types of adaptation techniques used. In order to carry out a comprehensive analysis, a well-defined systematic mapping method is performed to categorize, according to a detailed scheme, every paper relevant to this topic. The results can potentially shed light on areas where further investigation might be warranted.
Download

Paper Nr: 92
Title:

Identification of Critical Links within Complex Road Networks using Centrality Principles on Weighted Graphs

Authors:

Nirupam Bidikar, Yunpeng Zhang and Fengxiang Qiao

Abstract: Building resilient infrastructure has become a necessity in modern times. If a system can efficiently deal with failures, it is considered resilient. Roadways are some of the most vital infrastructures in the world. Their collapse due to unprecedented calamities would disrupt the normal functioning of society and cause significant financial loss. To minimize traffic jams and keep traffic flowing during such times, it is essential to identify important roads within a network and plan alternate routes to divert traffic. This study aims to find critical links in a road network and study their relationships with important nodes in the same network. It also highlights some traditional approaches and applies graph-theory concepts to measure node and edge importance within a network. An approach based on variable centrality is proposed. We have implemented our proposed system and evaluated its performance on multiple networks, including a large-scale statewide road network in Texas. Our preliminary experiments show promising results.
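One standard graph-theoretic way to score link importance on a weighted road graph is an edge-betweenness-style count: links that lie on many shortest paths are critical. The sketch below is our own illustration (not the paper's variable-centrality method) and credits, for simplicity, only one shortest path per node pair.

```python
# Illustrative only: rank links of a small invented weighted road graph by
# how many pairwise shortest paths traverse them.
import heapq
from collections import defaultdict

graph = {  # node -> {neighbor: travel cost}; undirected
    "A": {"B": 1, "C": 4},
    "B": {"A": 1, "C": 1, "D": 5},
    "C": {"A": 4, "B": 1, "D": 1},
    "D": {"B": 5, "C": 1},
}

def shortest_path(src, dst):
    """One cheapest path from src to dst via Dijkstra."""
    dist, prev, pq = {src: 0}, {}, [(0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist.get(u, float("inf")):
            continue
        for v, w in graph[u].items():
            if d + w < dist.get(v, float("inf")):
                dist[v], prev[v] = d + w, u
                heapq.heappush(pq, (d + w, v))
    path, u = [dst], dst
    while u != src:
        u = prev[u]
        path.append(u)
    return path[::-1]

counts = defaultdict(int)
nodes = list(graph)
for i, s in enumerate(nodes):
    for t in nodes[i + 1:]:
        p = shortest_path(s, t)
        for a, b in zip(p, p[1:]):       # credit every link on the path
            counts[frozenset((a, b))] += 1

ranked = sorted(counts.items(), key=lambda kv: -kv[1])
print(sorted(ranked[0][0]))  # -> ['B', 'C']: the most critical link
```

Removing the top-ranked link and re-running would show how much detour cost it was absorbing, which is the motivation for planning alternate routes around such links.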
Download

Paper Nr: 93
Title:

Edge Detail Analysis of Wear Particles

Authors:

Mohammad S. Laghari, Ahmed Hassan and Mubashir Noman

Abstract: Tribology is the study of wear particles that are generated in all machines with interacting mechanical parts. Particles are separated from the surfaces due to friction and relative motion. These microscopic particles vary in certain characteristics of size, quantity, composition, and morphology. Wear particles or wear debris are categorized by six morphological attributes: shape, edge details, texture, color, size, and thickness ratio. Particles can be identified with the help of some or all of these attributes; however, only edge detail analysis is considered in this paper. The objective is to classify these particles in a coherent way based on these attributes and, by using the acquired knowledge, to predict wear failure modes in machinery. There are two procedures described in this work; one is the angle calculation between equidistant points on the particle boundary, and the other is the computation of the centroid's distance from the boundary points. These procedures classify particle edges as smooth, rough, straight, or spherical (curved).
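The two measurements can be sketched on a synthetic boundary. This is our own minimal formulation of the idea, not the authors' implementation: a near-constant centroid distance and uniform angles at equidistant boundary points indicate a spherical (curved) edge, while high variation would suggest roughness.

```python
# Toy edge-detail measurements on a sampled particle boundary.
import math

def centroid(pts):
    n = len(pts)
    return (sum(x for x, _ in pts) / n, sum(y for _, y in pts) / n)

def centroid_distances(pts):
    cx, cy = centroid(pts)
    return [math.hypot(x - cx, y - cy) for x, y in pts]

def boundary_angles(pts):
    """Angle at each boundary point between its two neighbours (degrees)."""
    out = []
    n = len(pts)
    for i in range(n):
        (x0, y0), (x1, y1), (x2, y2) = pts[i - 1], pts[i], pts[(i + 1) % n]
        d = math.degrees(math.atan2(y0 - y1, x0 - x1)
                         - math.atan2(y2 - y1, x2 - x1))
        a = abs(d)
        out.append(min(a, 360 - a))   # geometric angle in [0, 180]
    return out

# 16 equidistant points on a circle: constant distances, uniform angles,
# so both measures classify the edge as smooth/spherical.
circle = [(math.cos(2 * math.pi * k / 16), math.sin(2 * math.pi * k / 16))
          for k in range(16)]
d = centroid_distances(circle)
print(max(d) - min(d) < 1e-9)  # True
```

A jagged boundary would instead show large spread in both the distances and the angles, pushing the classification toward "rough".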
Download

Paper Nr: 94
Title:

Safety-based Platoon Driving Simulation with Variable Environmental Conditions

Authors:

Youngjae Kim, Nazakat Ali and Jang-Eui Hong

Abstract: In platoon driving, a group of autonomous vehicles drives by forming one platoon to achieve advantages such as fuel efficiency and traffic congestion reduction. Ensuring the safety of such a platooning system is very challenging due to unexpected driving conditions, e.g., adverse weather and obstacles on the road. Therefore, the safety of a platooning system should be guaranteed even in variable weather conditions. In this paper, we investigate the platooning system's unexpected behavior due to adverse weather conditions and provide safety guards to avoid potential hazards. Simulation techniques are essential to confirm that the designed safety guards work correctly, because testing such systems in a real situation can be highly expensive. Therefore, we extended VENTOS, an open-source platoon driving simulator, to verify the provided safety guards, which can prevent risks under diverse weather scenarios, e.g., fog, rain, and snow. Our simulation results show that the proposed safety guards for adverse weather conditions can enhance the safety of the platooning systems.
Download

Paper Nr: 111
Title:

Genetic Programming based Constructive Algorithm with Penalty Function for Hardware/Software Cosynthesis of Embedded Systems

Authors:

Adam Górski and Maciej Ogorzałek

Abstract: In this work, we present a constructive genetic programming method with a penalty function for hw/sw cosynthesis of embedded systems. The genotype is a tree whose nodes contain system construction options. Unlike existing solutions, in this approach individuals that violate time constraints are kept and investigated during the evolution process. Therefore, the algorithm is better able to escape local minima of the optimized parameters.
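The penalty idea can be made concrete: instead of discarding a solution that misses the timing constraint, its cost is inflated in proportion to the violation, so it remains in the population and can still guide the search out of local minima. A minimal sketch under our own assumed formulation (the paper's actual penalty terms and weights may differ):

```python
# Penalty-based fitness for cosynthesis: feasible solutions keep their cost;
# constraint violators stay comparable but pay for the violation.
def fitness(cost, exec_time, deadline, penalty_weight=10.0):
    violation = max(0.0, exec_time - deadline)
    return cost + penalty_weight * violation

print(fitness(100.0, 8.0, 10.0))   # feasible: 100.0
print(fitness(80.0, 12.0, 10.0))   # infeasible but kept: 80 + 10*2 = 100.0
```

With this scoring, a cheap-but-slightly-late design can temporarily outrank an expensive feasible one, which is exactly what lets the evolution cross infeasible regions of the search space.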
Download

Paper Nr: 113
Title:

Hierarchical Clustering Driven Test Case Selection in Digital Circuits

Authors:

Conor Ryan, Meghana Kshirsagar, Krishn K. Gupt, Lukas Rosenbauer and Joseph P. Sullivan

Abstract: The quality assurance of circuits is of major importance, as the complexity of circuits is rising with their capabilities. Thus, a high degree of testing is required to guarantee proper operation. If, on the other hand, too much time is spent on testing, then development time is prolonged. The work presented in this paper proposes a methodology to select a minimal set of test cases for validating digital circuits with respect to their functional specification. We do this by employing hierarchical clustering algorithms to group test cases using a Hamming distance similarity measure. The test cases are selected from the clusters by our proposed approach of distance-based selection. Results are tested on two circuits, viz. a multiplier and a Galois Field multiplier, that exhibit similar behaviour but differ in the number of test cases and their implementation. It is shown that for small selection fractions, distance-based selection can outperform traditional random-based selection by preserving diversity among the chosen test cases.
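The pipeline, cluster test vectors by Hamming distance, then pick representatives, can be sketched in a few lines. This is a hedged illustration with single-linkage merging and a naive first-member pick; the paper's clustering variant and its distance-based selection rule are more elaborate.

```python
# Toy test-case reduction: agglomerative single-linkage clustering of test
# vectors under Hamming distance, then one representative per cluster.
def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

def cluster(tests, k):
    clusters = [[t] for t in tests]
    while len(clusters) > k:
        # merge the two clusters with the smallest single-linkage distance
        i, j = min(((i, j) for i in range(len(clusters))
                    for j in range(i + 1, len(clusters))),
                   key=lambda ij: min(hamming(a, b)
                                      for a in clusters[ij[0]]
                                      for b in clusters[ij[1]]))
        clusters[i] += clusters.pop(j)
    return clusters

tests = ["0000", "0001", "1110", "1111", "0011"]
groups = cluster(tests, k=2)
selected = [g[0] for g in groups]   # one representative per cluster
print(selected)  # -> ['0000', '1110']: diverse vectors, far apart in Hamming distance
```

Because the representatives come from distant clusters, the reduced set keeps the diversity that random selection tends to lose at small fractions.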
Download

Paper Nr: 116
Title:

Inclusion of User Behavior and Social Context Information in ML-based QoE Prediction

Authors:

Fatima Laiche, Asma Ben Letaifa and Taoufik Aguili

Abstract: The widespread use of online video content in every area of the connected world increases the interest in Quality of Experience (QoE). QoE plays a crucial role in the success of video streaming services. However, QoE prediction is challenging, as many compelling factors (i.e., human and context factors) impact QoE, and QoE management solutions often neglect the impact of social context and user behavior factors on the end-user’s QoE. To address these challenges, we have developed a web application to conduct a subjective study and collect data at the application, user, and service levels. The collected data is then used as a training set for machine learning models, including decision tree, K-nearest neighbor, and support vector machine models, for the purpose of QoE prediction.
Download

Paper Nr: 120
Title:

IoT, Risk and Resilience based Framework for Quality Control: Application for Production in Plastic Machining

Authors:

Khaled Bahloul and Nejib Moalla

Abstract: The definition of defect prediction models in manufacturing emerges as an attractive alternative supported by Industry 4.0 concepts and solutions. We propose in this paper an IoT-based approach for a global quality control mechanism in industry. In this work, we cover in-process quality control inspection, as well as the monitoring of production machines and the production environment. Our framework addresses data analytics algorithms using monitoring data, risk assessment models, resilience parameters, and acceptance criteria for prediction models. The proposed concepts are implemented to control the manufacturing processes of a plastic product, where the distinction between irregularity and nonconformity needs to be supported by a smart decision system.
Download