
Keynote Lectures

Introducing AI/ML Components into Software Systems
Dominik Slezak, University of Warsaw, Poland

Process Mining as the Superglue between Data and Process Management
Wil van der Aalst, RWTH Aachen University, Germany

Software Architects are Dead! Long Live Software Architects!
Frank Buschmann, Siemens AG, Germany

 

Introducing AI/ML Components into Software Systems

Dominik Slezak
University of Warsaw
Poland
 

Brief Bio
Dominik received his M.Sc. degree in Mathematics (1996) and his Ph.D. degree in Computer Science (2002) from the University of Warsaw. In his early academic career, he worked as a Teaching Assistant at the Polish-Japanese Institute of Information Technology. In 2003-2006, he worked as an Assistant Professor at the University of Regina in Canada, and he also cooperated with McMaster University and York University as an Adjunct Professor. In 2008, he moved back to Poland and re-joined the University of Warsaw, where he currently holds a professorship at the Institute of Informatics. In 2011, he received his D.Sc. degree from the Institute of Computer Science of the Polish Academy of Sciences. Dominik is continually engaged in both academic and commercial R&D initiatives in the fields of Artificial Intelligence, Machine Learning, and Big Data. In 1999, he co-founded QED Software with the aim of developing new, easy-to-interpret Data Exploration methods. In 2005, he co-founded Infobright, a database software company whose Granular Analytics technology was acquired by Security On-Demand in 2017. Currently, apart from his work for QED Software and the University of Warsaw, he serves as Chief Scientist at Security On-Demand and is an expert in several EU-funded projects in areas including Recommendation and Advisory Systems. Dominik has co-authored over 200 scientific articles and is a co-inventor on six US patents. He has organized over 20 conferences in Europe, Asia, and the Americas, and has delivered plenary talks at over 20 international congresses, including a keynote at the IEEE/WIC/ACM Conference on Web Intelligence in 2015. He is an Associate Editor of several scientific journals and was one of the Founding Editors of Springer's CCIS series. In 2012-2014, he served as President of the International Rough Set Society. Currently, he serves as Vice-President of the Polish Artificial Intelligence Society and Vice-President of the IEEE CS Technical Committee on Intelligent Informatics.


Abstract
Adoption of Artificial Intelligence (AI) and Machine Learning (ML) solutions is on the bucket list of many companies that want to harness the power of their data and thereby increase their competitive advantage. When it comes to actually implementing these solutions in a company's day-to-day operations, a major revelation usually occurs: there is a world of difference between launching a lab-based data science project aimed at building AI/ML models over historical data and making those models "alive", i.e., working in a production environment, adapting to changes, and truly supporting the users.

In this talk, we discuss some examples of modern AI/ML-related solutions from the perspective of their functionality and the ways of integrating them into larger systems. As a case study, we consider Label In The Loop (LITL), a new software architecture designed by QED Software (www.qed.pl) for continuous, human-expert-driven improvement of data quality and AI/ML performance. We also refer to several other solutions aimed at improving the interpretability and scalability of AI/ML methods.
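The abstract does not spell out LITL's internals, so the following is only a minimal sketch of the generic label-in-the-loop pattern that such an architecture builds on: an uncertainty-sampling active-learning cycle in which the model flags the examples it is least sure about and a human expert labels them. All names here (label_in_the_loop, expert_label, the batch sizes) are hypothetical illustrations, not QED Software's API.

# Minimal sketch of a generic label-in-the-loop cycle (hypothetical names,
# not QED Software's actual LITL implementation). The model is retrained
# after each batch of expert-provided labels.
import numpy as np
from sklearn.linear_model import LogisticRegression

def label_in_the_loop(model, X_labeled, y_labeled, X_pool, expert_label,
                      rounds=10, batch_size=20):
    """Route the most uncertain pool items to a human expert, then retrain."""
    for _ in range(rounds):
        model.fit(X_labeled, y_labeled)
        # Uncertainty of a binary classifier: probability closest to 0.5.
        proba = model.predict_proba(X_pool)[:, 1]
        uncertain = np.argsort(np.abs(proba - 0.5))[:batch_size]
        # The expert supplies ground-truth labels for the flagged items.
        new_y = np.array([expert_label(x) for x in X_pool[uncertain]])
        X_labeled = np.vstack([X_labeled, X_pool[uncertain]])
        y_labeled = np.concatenate([y_labeled, new_y])
        X_pool = np.delete(X_pool, uncertain, axis=0)
    return model

# Example call: label_in_the_loop(LogisticRegression(), X0, y0, pool, ask_expert)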

As a parallel thread, we discuss the role of online data mining contests in the AI/ML adoption cycle. We refer to the contest held by QED Software and SOD (www.securityondemand.com) on the KnowledgePit platform (www.knowledgepit.ml), where the task was to learn a scoring model assisting security operators in making decisions about patterns of suspicious behavior in network traffic. We show a path for the contest-winning models, submitted as scripts executed on anonymized data, to become a well-functioning part of SOD's infrastructure. We also discuss the case of SOD from the perspective of the aforementioned LITL architecture, whereby the security operators play the role of experts who label new network events and cooperate with AI/ML to optimize the overall cybersecurity analytics process.
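To give a flavor of the contest task, below is a hedged sketch of learning such a scoring model. The file name events.csv, the is_suspicious column, and the choice of gradient boosting are illustrative assumptions, not the actual SOD data schema or the winning solution.

# Hedged sketch: score network-behavior patterns by suspiciousness.
# Column names and the input file are placeholders, not the contest data.
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

events = pd.read_csv("events.csv")            # anonymized historical events
X = events.drop(columns=["is_suspicious"])    # behavioral features
y = events["is_suspicious"]                   # operator-confirmed labels

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
model = GradientBoostingClassifier().fit(X_tr, y_tr)

# The score in [0, 1] lets operators triage alerts by suspiciousness.
scores = model.predict_proba(X_te)[:, 1]
print("AUC:", roc_auc_score(y_te, scores))

In a LITL-style deployment, the labels that operators assign to newly scored events would flow back into the training set, closing the loop sketched earlier.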




Process Mining as the Superglue between Data and Process Management

Wil van der Aalst
RWTH Aachen University
Germany
 

Brief Bio
Prof.dr.ir. Wil van der Aalst is a full professor at RWTH Aachen University, leading the Process and Data Science (PADS) group. He is also part-time affiliated with the Fraunhofer-Institut für Angewandte Informationstechnik (FIT), where he leads FIT's Process Mining group, and with the Technische Universiteit Eindhoven (TU/e). Until December 2017, he was the scientific director of the Data Science Center Eindhoven (DSC/e) and led the Architecture of Information Systems group at TU/e. Since 2003, he has held a part-time position at Queensland University of Technology (QUT). Currently, he is also a distinguished fellow of Fondazione Bruno Kessler (FBK) in Trento and a member of the Board of Governors of Tilburg University. His research interests include process mining, Petri nets, business process management, workflow management, process modeling, and process analysis. Wil van der Aalst has published more than 230 journal papers, 22 books (as author or editor), 530 refereed conference/workshop publications, and 80 book chapters. Many of his papers are highly cited (he is one of the most cited computer scientists in the world, with an H-index of 148 and over 100,000 citations according to Google Scholar), and his ideas have influenced researchers, software developers, and standardization committees working on process support. He has been a co-chair of many conferences, including the Business Process Management conference, the International Conference on Cooperative Information Systems, the International Conference on the Application and Theory of Petri Nets, and the IEEE International Conference on Services Computing. He is also an editor or editorial-board member of several journals, including Business & Information Systems Engineering, Computing, Distributed and Parallel Databases, Software and Systems Modeling, Computer Supported Cooperative Work, the International Journal of Business Process Integration and Management, the International Journal on Enterprise Modelling and Information Systems Architectures, Computers in Industry, IEEE Transactions on Services Computing, Lecture Notes in Business Information Processing, and Transactions on Petri Nets and Other Models of Concurrency. He is also a member of the Council for Physics and Technical Sciences of the Royal Netherlands Academy of Arts and Sciences and serves on the advisory boards of several organizations, including Fluxicon, Celonis, Processgold, and Bright Cape. In 2012, he received the degree of doctor honoris causa from Hasselt University in Belgium. He also served as scientific director of the International Laboratory of Process-Aware Information Systems of the National Research University Higher School of Economics in Moscow. In 2013, he was appointed Distinguished University Professor at TU/e and was awarded an honorary guest professorship at Tsinghua University. In 2015, he was appointed honorary professor at the National Research University Higher School of Economics in Moscow. He is an IFIP Fellow and an elected member of the Royal Netherlands Academy of Arts and Sciences (Koninklijke Nederlandse Akademie van Wetenschappen), the Royal Holland Society of Sciences and Humanities (Koninklijke Hollandsche Maatschappij der Wetenschappen), and the Academy of Europe (Academia Europaea). In 2018, he was awarded an Alexander von Humboldt Professorship, Germany's most valuable research award (five million euros).


Abstract
Process mining is able to reveal how people and organizations really function. Often, reality is very different and less structured than expected. Process discovery exposes the variability of real-life processes. Conformance checking is able to pinpoint and diagnose compliance problems. Task mining exploits user-interaction data to enrich traditional event data. All these different forms of process mining can and should support Robotic Process Automation (RPA) initiatives. Process mining can be used to decide what to automate and to monitor the cooperation between software robots, people, and traditional information systems. In the process of deciding what to automate, the Pareto principle plays an important role: often 80% of the behavior in the event data is described by 20% of the trace variants or activities. An organization can use such insights to "pick its automation battles", e.g., analyzing the economic and practical feasibility of RPA opportunities before implementation. This talk discusses how to leverage the Pareto principle in RPA and other process automation initiatives.
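To make the 80/20 observation concrete, here is a small Python sketch that measures variant coverage in an event log; the file name and the case_id/activity/timestamp columns are assumed placeholders for the usual event-log format, not a dataset from the talk.

# Sketch: test the Pareto principle on an event log. A trace variant is the
# ordered sequence of activities of one case; we count how many variants are
# needed to cover 80% of all cases. Column names are illustrative.
import pandas as pd

log = pd.read_csv("event_log.csv")  # columns: case_id, activity, timestamp
log = log.sort_values(["case_id", "timestamp"])

# One variant string per case, e.g. "register->check->approve".
variants = (log.groupby("case_id")["activity"]
               .apply(lambda acts: "->".join(acts)))

counts = variants.value_counts()            # cases per variant, descending
coverage = counts.cumsum() / counts.sum()   # cumulative share of cases
n_for_80 = (coverage < 0.80).sum() + 1      # variants needed to reach 80%

print(f"{n_for_80} of {len(counts)} variants "
      f"({n_for_80 / len(counts):.0%}) cover 80% of the cases")

If the printed share is at or below 20%, the log follows the 80/20 pattern described above, and the most frequent variants are natural first candidates for automation.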




Software Architects are Dead! Long Live Software Architects!

Frank Buschmann
Siemens AG
Germany
 

Brief Bio
Frank Buschmann is a Senior Principal Engineer at Siemens Corporate Technology in Munich, where he leads research on modern software architecture and development approaches for industrial digitization. The current focus of his activities is on architectures for Cyber-Physical (Production) Systems, the Internet of Things and Intelligent Systems, and industrial-grade DevOps. Frank also advises Siemens management and product development organizations on the efficient application of these technologies to develop innovative products. Frank has more than 40 years of professional experience in software engineering, regularly speaks at renowned conferences, and is a co-author of four volumes of 'Pattern-Oriented Software Architecture', published by John Wiley & Sons.


Abstract
In an ideal world, agile teams do all design work collectively, and microservices allow a system's architecture to emerge. The agile crowd is king. There is no need for software architects. They are dead, really dead! But many systems consist of hundreds of microservices. They must handle millions of events fast. End to end. And they must be reliable, safe, and secure. Users expect Martini availability of the system: any time, any place, anywhere! Emergent architectures can rarely handle such complexity without becoming a big ball of mud. The agile crowd is puzzled, really puzzled. Long live software architects! This talk outlines the new role and responsibilities of architecture and architects in the face of IoT, the cloud, and the ongoing digitalization of business.


