T01 - Information Systems Architecture: Stakeholders, Viewpoints and Perspectives
Eoin Woods and Nick Rozanski
Zuhlke Engineering Limited, email@example.com or firstname.lastname@example.org
personal web sites: www.eoinwoods.info and www.nick.rozanski.com.
Software architecture is made challenging by the number of stakeholders the architect must consider and by the inherently multi-dimensional nature of a system. This situation has led to the definition of the “viewpoint and view” based approach to architectural design.
A number of sets of viewpoints have been developed that guide architects through the process of designing and describing the different structures of their systems (such as functional structure and deployment environment), each viewpoint guiding the creation of a particular type of view. More recently, the presenters have introduced a complementary concept, the “perspective”, sets of which are used to guide an architect through the process of designing systems with particular quality properties. This tutorial will present a set of viewpoints and perspectives for information systems architecture and will explain how they can help an architect to design a system to meet the needs of its stakeholders.
The intended audience for this tutorial is practitioners and researchers interested in how to approach the problem of creating an architectural design for complex information systems. Some prior appreciation or experience of dealing with large-scale system design or conflicting stakeholder requirements would be useful, but is not required.
· To provide an overview of the viewpoint-oriented approach to software architecture, so that attendees can apply it after the conference.
· To introduce and explain a particular set of architectural viewpoints, defined by the presenters, which are suitable for guiding the architect of a large-scale information system.
· To introduce a new concept (the “perspective”) that can guide the practitioner’s consideration of the quality properties of their system.
· To introduce and explain a particular set of perspectives that can be used with the viewpoints presented earlier, to guide the consideration of the quality properties of a large information system.
Eóin Woods has worked in the enterprise IT field for about 15 years, for companies including Ford, Groupe Bull, Sybase, InterTrust and currently Zuhlke Engineering, where he is a principal consultant based in their London office. Eóin works primarily as a consultant IT systems architect and has led applied research, product development and large-scale information systems implementation work. He particularly specialises in the financial markets domain, working with a range of investment and trading organisations.
Eóin’s main professional interests are software architecture, distributed systems, IT security and data management. He has published a number of technical papers in these areas and regularly presents sessions at industry conferences. He is co-author of the book “Software Systems Architecture: Working With Stakeholders Using Viewpoints and Perspectives”, published by Addison Wesley (2005).
Nick Rozanski is an Enterprise Technical Architect for Marks and Spencer, the United Kingdom’s largest clothing and food retailer. His current architectural portfolio is focused on integration and workflow, but he also has a strong interest in information architecture and in the role and practice of the software architect.
Nick has worked in IT since 1980 for several large and small systems integrators, including Logica, Capgemini, and Sybase. He has taken senior roles on a wide range of programs for clients in finance, retail, manufacturing, and government. His technology background includes enterprise application integration, package implementation, relational database, data replication, and object-oriented software development. He is also an experienced technical instructor and certified internal project auditor. Nick’s personal web site is http://www.nick.rozanski.com.
T02 - Using Dependency Models to Manage Software Architecture
Neeraj Sangal
Lattix, Inc., email@example.com
This tutorial will present a practical technique for managing the architecture of large software systems using Dependency Models. We will demonstrate that the matrix representation used by these models provides a unique view of the architecture and is highly scalable compared to the directed graph approaches that are common today. We will also show a variety of matrix algorithms that can be applied to analyze and organize the system into a form that reflects key architectural patterns and highlights problematic dependencies.
During the tutorial we will illustrate our approach by applying it to real applications, each consisting of hundreds or thousands of subsystems (files). The dependency model will be formally specified using design rules, which enable architectural violations to be identified automatically. Finally, actual dependency models will be created for multiple generations of a software application to highlight how architecture evolves and how it often begins to degrade.
This tutorial will be of interest to researchers, architects and senior developers involved in the design of large software systems. Participants should have significant software development experience. However, no specific knowledge of an application domain or a design methodology is required.
The primary aim of this tutorial is to teach attendees how dependency models can be used to visualize, communicate and manage the development-centric view of software architecture. They will learn about the science of Dependency Structure Matrices, how it has evolved significantly over the last decade, and how it has recently been extended to software systems.
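To make the core idea concrete, the following is a minimal sketch, not the Lattix tool's actual API, of a Dependency Structure Matrix with a simple design-rule check. The module names and the layering rule are invented for illustration:

```python
# Illustrative Dependency Structure Matrix (DSM) sketch. A cell
# dsm[i][j] == 1 means modules[i] depends on modules[j]. Module
# names and design rules here are hypothetical examples.

modules = ["ui", "app", "db", "util"]
index = {m: i for i, m in enumerate(modules)}

dsm = [
    [0, 1, 1, 1],  # ui  -> app, db, util  (ui -> db breaks layering)
    [0, 0, 1, 1],  # app -> db, util
    [0, 0, 0, 1],  # db  -> util
    [0, 0, 0, 0],  # util depends on nothing
]

# Design rules as (source, forbidden target) pairs, e.g. the UI
# layer may only reach the database through the application layer.
rules = [("ui", "db")]

def violations(dsm, rules):
    """Return the design-rule violations present in the matrix."""
    found = []
    for src, dst in rules:
        if dsm[index[src]][index[dst]]:
            found.append((src, dst))
    return found

print(violations(dsm, rules))  # [('ui', 'db')]
```

A real DSM tool additionally partitions the matrix (e.g. to expose layering and cyclic clusters), but even this toy version shows how a rule set turns the matrix into an automatically checkable architecture specification.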
Neeraj Sangal is President of Lattix, a company specializing in Software Architecture Management solutions and services. He has analyzed many large proprietary and open source systems. Prior to Lattix, Neeraj was President of Tendril Software which pioneered model driven EJB development and synchronized UML models for Java. Tendril was acquired by BEA/WebGain. Prior to Tendril, Neeraj managed a distributed development organization at HP. Neeraj has published and presented papers; his most recent work on architecture was a joint effort with MIT and will be presented at OOPSLA 2005.
T03 - Value Modeling and Architecture Strategy
Charlie Alfred
Technical Director, Foliage Software Systems, Inc., firstname.lastname@example.org
Complexity in software-intensive systems is growing at an accelerating rate. As the price-performance of hardware improves, expectations expand. As off-the-shelf and open source components become more plentiful and the Internet provides an easy way to procure them, expectations multiply. And as solutions are deployed for the relatively easy problems, the more difficult ones move to the head of the list.
Value Models are a technique that goes beyond business requirements to model the sources of value in the various deployment environments for a product or system. Architecture Strategy focuses on the core principles of organization, operation, variability, and evolution that enable a system to satisfy its value models to the best degree possible.
This half-day tutorial covers the core Value Models concepts, a process for creating them, techniques for discovering architecture challenges, and an approach for using these challenges to formulate effective architecture strategies.
The target audience includes software architecture practitioners, researchers, and students, as well as product marketing managers and software program managers. A basic understanding of the concepts of software engineering and market segmentation is a prerequisite.
1. Improve the effectiveness of communication between the business and engineering sides of an organization.
2. Describe field-tested techniques for creating value models for a system’s deployment contexts, and analyzing the variations between them.
3. Describe approaches for identifying and prioritizing architecture challenges arising from a context’s value model, and analyzing these challenges across contexts.
4. Understand how to use architecture challenges to inform the process of making the critical set of early architecture decisions for a system.
5. Observe how architecture challenges are a mirror image of approaches used to assess software architectures, such as ATAM™, and how they provide traceability back to these approaches.
6. Understand how and why a foundation set of core architecture principles is a powerful device to establish and maintain focus on a system’s purpose and the plan to achieve it.
7. Learn how systems dynamics can be used to understand the value contexts of surrounding systems (operations, organizations, competitive markets, etc.), and assess how change is likely to affect these systems. In turn, this leads to a better understanding of the risks of change faced by the system being developed.
Charlie Alfred has 25 years of experience as a software engineer and architect, in a wide range of application areas including medical devices, semiconductor control systems, electronic payment systems, real-time options trading, and workflow management. His main interests are in the development of software architecture formulation and evaluation methods, and he has published several articles and white papers in these areas. In addition, he has extensive experience in distributed architectures, relational and object database management systems, object-oriented design and programming, design patterns, real-time control systems and multi-threaded techniques. Charlie has led architecture consulting engagements in the following areas:
· Semiconductor control system framework architecture definition
· Hemodialysis machine architecture assessment
· Radiology workflow architecture definition
· Pharmaceutical clinical trial data integration and project management
· Endoscopy data management and hospital integration
· Electronic funds transfer switch architecture assessment
In addition, he has been the technical project leader for several systems:
· Endoscopy data management and integration platform
· Electronic collections (payments) engine for the U.S. Treasury
· Technical planning workflow for EBeam lithography mask manufacturing
· Realtime quote distribution and options trading system
· Control system for an electron beam lithography tool
· Airline inconvenienced passenger rebooking system
· Realtime import of securities data feeds into a spreadsheet application
· Routing, scheduling, and dispatching for local-area trucking operations
Charlie has also been involved in the area of training and education, as the Education Manager for Object Design, Inc., the market leader in object database management systems. In this role, he taught the Programming with ObjectStore course and was the author and principal instructor for the Designing Object Databases course. In his current role, Charlie has developed and taught several internal training programs at Foliage Software Systems. The most recent is an intensive two-week course designed to develop architecture, value modeling, systems thinking, facilitation, and communication skills in prospective consultants.
T04 - Quality Evaluation by QADA
Eila Niemelä and Mari Matinlassi
VTT Technical Research Centre of Finland, Embedded Software
Software quality is one of the major issues with software intensive systems. Quality is especially important in software product families exploiting a single set of components and common architecture in a set of products. Therefore, software quality should be evaluated as early as possible, for example, using the descriptions of software architecture.
Quality properties, such as maintainability and extensibility, can be evaluated in the development phase. These properties are called evolution qualities. Others, like reliability and performance, are intertwined with the functionality of a system and thus observable only at run-time. In order to be able to evaluate quality at the architectural level, quality properties have to be defined and represented in architectural models, as derived from the requirements specifications of a product (family).
Current modelling approaches do not support representation of variability and quality requirements, or traceability of quality from requirements to designs and code. Our contribution is the QADA® (Quality-driven Architecture Design and quality Analysis) methodology that provides a set of methods and techniques for developing high-quality software architectures for single systems and system families. The methodology is initially targeted at the development of service architectures applied in distributed networked systems.
The tutorial is intended for research and industrial professionals interested in architecture-centric software development and software quality assessment. Thus, the tutorial is especially intended for software architects, product family architects and other software engineering professionals.
The main objective of the tutorial is to present a systematic way to transform quality requirements to software architecture, and furthermore, to analyze how quality requirements are met in architecture design. QADA promotes iterative and incremental architecture-centric software development, where a particular view of design decisions is presented as a model or a set of models. Modeling assists in sharing visions, knowledge and common understanding about the properties of the product under development and of the means of realizing these properties.
Eila Niemelä obtained the MSc degree in 1995 and the PhD degree in 2000 in information processing science from the University of Oulu, Finland. Before graduation she worked fifteen years as a software engineer of embedded systems, and from 1995 to 1998 as a senior research scientist in the Embedded Software research area at VTT Technical Research Centre of Finland. In 1998-99 she was a visiting researcher at Napier University, Edinburgh, UK. After that she led the Software Architectures Group at VTT until September 2002. Since 2001 she has been working as a research professor at VTT, and since 2002 also as a docent of software architectures and components at the University of Oulu. Her main research interests include software architecture design, quality analysis on the architecture level and service architectures of pervasive computing environments.
Mari Matinlassi received her MScTech degree in Information Engineering from the University of Oulu in 2002. Currently she is working as a research scientist at VTT Technical Research Centre of Finland where her areas of research interest include architectural design methods, modeling and documenting software product-line architectures. She is also a Ph.D. student in the Department of Information Processing Science at the University of Oulu, Finland.
T05 - Large-Scale Software Architecture: A Practical Guide Using UML
Jeff Garland
CrystalClear Software, Inc, email@example.com
Richard Anthony
General Dynamics Decision Systems, Richard.Anthony@gdds.com
Dealing with the complexity of large-scale systems is a challenge for even the most experienced software designers. Large software systems can contain millions of elements, which interact to achieve the system functionality. This tutorial presents an overview of software architecture terminology and approaches, focusing on a set of UML viewpoints that represent the important aspects of a large-scale software architecture. These viewpoints include context, component, subsystem, process and deployment. The viewpoints leverage the recent IEEE 1471 standard for software architecture representations, providing a description of the purpose, stakeholders, and techniques associated with each viewpoint. In addition, the tutorial describes other practical techniques essential to developing effective software architectures, including:
· Techniques for handling large complex / enterprise systems
· Modeling software subsystem dependencies and interfaces
· Modeling components, component interactions, and component integration
· Modeling process communication and software/hardware deployment
· Fitting architecture development into development processes
Target audience includes architects, project managers, and software developers. Attendees should have a basic understanding of UML and an interest in software architecture. Experience on a large-scale software system is beneficial but not required.
· Understanding of several useful software architecture views using UML
· An understanding of how and when to apply these views
· How the views aid in understanding software architecture attributes such as maintainability, performance, and scalability
Jeff Garland has worked on many large-scale software projects over the past 20 years, in many different domains, including telephone switching, industrial process control, satellite ground control, and financial systems. He has served as both the lead architect and a member of the architecture team on several of these projects. Mr. Garland holds a Master's degree in Computer Science from Arizona State University and a Bachelor of Science in Systems Engineering from the University of Arizona. He is currently President and Chief Technology Officer of CrystalClear Software. CrystalClear Software is a consulting firm that specializes in the development of software architectures for large-scale systems.
Richard Anthony has 20 years' experience working on large-scale software development efforts, in application areas such as satellite and network operations systems, telephony base station control, manufacturing, and simulation. He has served in the role of chief software architect, design engineering technical lead, software design lead, software system engineer, and developer on projects in these application areas. Mr. Anthony holds Master's degrees in Computer Science and Mathematics, as well as a Bachelor's degree in Mathematics Education, all from the University of Wyoming. He is currently a Senior Software Architect at General Dynamics Decision Systems.
T06 - Variability Management in Software Product-Lines
Klaus Pohl, Andreas Metzger
Software Systems Engineering, University of Duisburg-Essen, Germany
One challenge during software design is to map varying requirements to a flexible architecture and to efficiently manage this mapping throughout the entire software life cycle. Software product-line engineering, with its techniques for explicitly managing variability, provides a systematic approach to tackle this challenge. This tutorial provides a comprehensive introduction to variability management in software product-lines. First, product-line engineering is motivated as a development approach that leads to lower costs, shorter time-to-market, and higher product quality. Then, the domain engineering and application engineering processes are introduced and their activities are explained following a refined software product-line engineering framework. In the main part of the tutorial, we go into detail on the concept of variability and discuss the need for the explicit documentation and management of variability. We offer solutions for the documentation and the management of variability across all product-line artifacts (including requirements and design artifacts) by introducing our orthogonal variability modeling technique (OVM-T). Special attention is given to the kind of variability that becomes relevant when the product-line architecture is considered, and therefore has to be managed in addition to the variability in requirements.
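The flavour of an orthogonal variability model can be sketched in a few lines of code. The following is a loose illustration of the concepts described above, not the OVM-T notation itself; the variation points, variants, and the "requires" constraint are all invented examples:

```python
# Toy variability model: variation points with their allowed variants,
# plus cross-variation-point constraints. All names are hypothetical.

variation_points = {
    "payment": {"credit_card", "invoice"},
    "language": {"english", "german", "french"},
}

# A "requires" dependency between variants of different variation
# points, e.g. invoicing is only offered with German localisation.
requires = [("invoice", "german")]

def check_product(selection):
    """Validate one product's variant selection against the model."""
    errors = []
    for vp, allowed in variation_points.items():
        if not selection.get(vp, set()) & allowed:
            errors.append(f"no variant bound for variation point '{vp}'")
    chosen = set().union(*selection.values())
    for a, b in requires:
        if a in chosen and b not in chosen:
            errors.append(f"variant '{a}' requires variant '{b}'")
    return errors

product = {"payment": {"invoice"}, "language": {"english"}}
print(check_product(product))  # reports the missing 'german' variant
```

In a full product-line approach such constraints would also be traced to the requirements and design artifacts that realise each variant; the point here is simply that an explicit variability model makes a product configuration mechanically checkable.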
This tutorial is directed towards software professionals, researchers, and students who are interested in software product-line engineering and in how to apply its principles to the development of software systems. In particular, this includes interest in notations for modeling variability and in techniques for managing variability.
The tutorial assumes a basic knowledge of the development process for single systems (including requirements engineering and architectural design) as well as familiarity with state-of-the-art modeling languages (especially the UML).
By the end of the tutorial, the attendees will be familiar with the concepts of software product-line engineering. They will be able to differentiate between the two processes of domain engineering and application engineering, and will have an understanding of the differences between single-system design and the design activities in product-line engineering. In particular, the participants will have learned about the concept of variability and will be able to model variability in requirements and architectural design artifacts by using our orthogonal variability modeling technique (OVM-T).
In addition to the usual tutorial handouts, the new textbook on software product-lines will be available fresh from the press:
Pohl, K.; Böckle, G.; van der Linden, F.: Software Product-line Engineering – Foundations, Principles, and Techniques. Berlin, Heidelberg, New York: Springer. August, 2005. http://www.software-productline.com/
Presenters’ Biographies (brief)
After years of industrial and academic experience, Prof. Dr. Klaus Pohl joined the University of Duisburg-Essen, where he holds a full professorship for software systems engineering. He received his PhD and his habilitation in computer science from RWTH Aachen, Germany. He is involved in various technology transfer projects as well as major research projects focusing on different aspects of software product-line engineering.
Klaus Pohl is (co-)author of over 90 refereed publications. He served as program chair for several international and national conferences, such as the IEEE Intl. Requirements Engineering Conference (RE ‘02), the Experience Track of the 27th International Conference on Software Engineering (ICSE 2005), the German Software Engineering Conference (SE 2005), the 9th Intl. Software Product Line Conference (SPLC Europe 2005) and the 18th Intl. Conference on Advanced Information Systems Engineering (CAiSE 2006).
Dr. Andreas Metzger is a senior research assistant at the “Software Systems Engineering” group at the University of Duisburg-Essen. He received his PhD in 2004 from the University of Kaiserslautern on the topic of model-based quality assurance. His current research interests include model-driven software development, object-oriented modeling and meta-modeling, feature-interactions in product-lines, and variability modeling and management.