Lifecycle Architecture & Integration Track

 
Mik Kersten
Track Chair
 
Over the past decade fragmentation has grown in the ALM ecosystem, with best-of-breed tools emerging to support the distinct needs of developers, business analysts, DevOps practitioners and testers. Large-scale Agile transformations and ALM modernizations now involve multiple tools as well as in-house solutions and customizations. Just as architecture became a key discipline in application development once the breadth of systems and frameworks required it, the heterogeneity of ALM tools and processes is necessitating a new role, the Lifecycle Architect, and a new practice, Software Lifecycle Integration (SLI). It is time to recognize the diversity of tools and practitioners in the software development and deployment lifecycle and to improve the connection and collaboration among them. The objective of this track is to examine this rapidly emerging organizational need and address:
  • Changes to the modern software development lifecycle that have led to a disconnected software delivery process
  • How Agile adoption and modern deployment practices are increasing the need for integration and automation to reduce cycle time across the lifecycle
  • Processes, tools and methodologies for unifying the end-to-end flow of information across the lifecycle
  • Open standard and open source based approaches to integration
  • Business drivers for integration including reporting, visibility, traceability and collaboration
  • Connecting Lean Startup and Lean delivery methodologies with Agile to pave the way for Lean ALM
 


Breakout Sessions

 
Ravit Danino
Director, Applications Product Management
HP Software & Solutions
 
 

ALM for the Internet of Things

By 2020 the number of connected devices will hit 24B; the number of apps that they will run and the amount of data they will consume and generate will multiply by 5; and $210B will be spent on cloud services.

Sound scary? These data points will change our lives. At any given time, any person and any device, including cars, coffee makers and refrigerators, will be connected. Business applications will span across devices, across processes and across each one of us.

In this reality, the performance and the functionality of applications will become even more business critical than they are today. In order to effectively test and deploy those applications, organizations will be required to leverage the cloud.

So if this is the new reality, we need to get ready. How will you ensure value is delivered in this reality? How will you ensure quality, agility, velocity and scalability for these next generation applications? What are the best processes to use in this reality to ensure that the ROI is achieved?

Join us for this session to understand this new reality of applications and how embracing the modern approach to Application Lifecycle Management will enable you and your organization to handle these challenges and win.
 
 
Steve Speicher
Senior Technical Staff
IBM Rational
 
 

Better Integration through Open Interfaces

Like children growing up and not wanting to share their toys, ALM tools have often kept their artifacts to themselves. This has limited choice and closed off the possibility of giving the tools' users a seamless experience when navigating and working with disparate resource types. We'll explore patterns in how ALM tools have evolved, both from the perspective of developers and what they are looking to achieve, and from that of the producers of these tools. We'll then show how practices such as RESTful web APIs provide the flexibility and loose coupling needed for resilient integrations, and walk through real-world scenarios that highlight the importance of a consistent way to expose both these APIs and the data models they work with. Finally, we'll explore standardization efforts such as Open Services for Lifecycle Collaboration (OSLC) and Linked Data that fill this need, as well as how open source projects such as Eclipse Lyo and OSLC4Net help enable this.
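
To make the linked-data style of integration concrete, here is a minimal Python sketch of a client consuming an OSLC-style RESTful interface; the service URL is a made-up placeholder, and the requests and rdflib libraries stand in for whatever HTTP and RDF stack a real integration would use:

    # Fetch an ALM artifact as RDF over a RESTful, OSLC-style interface and
    # enumerate its links, rather than copying its fields into another tool.
    import requests               # HTTP client
    from rdflib import Graph      # RDF parsing

    resource_url = "https://alm.example.com/oslc/changerequests/1234"  # hypothetical endpoint
    response = requests.get(
        resource_url,
        headers={"Accept": "application/rdf+xml"},  # content negotiation for RDF
        timeout=30,
    )
    response.raise_for_status()

    graph = Graph()
    graph.parse(data=response.text, format="xml")   # parse the RDF/XML payload
    for subject, predicate, obj in graph:
        print(predicate, "->", obj)                 # the links carried by the artifact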
 
 
Jon Harding
SVP, DevOps Engineering Lead
Bank of America
 
 

DevOps Enabled Delivery as an Imperative for Successful Enterprise Agile Adoption

This talk will give an overview of DevOps practices that add efficiency, quality, velocity, and transparency to Agile software shops. Jon will dive into an analysis of specific practices that align with the Agile Manifesto and the financial and human capital investment necessary to successfully roll out an enterprise Agile DevOps practice. Delivering software in a large, highly distributed and regulated enterprise is difficult for many reasons, but DevOps can help establish a software assembly line while still allowing for an integrated development process that bakes quality into the product.

The program not only embraced Agile practices but also redesigned the organizational structure. It took a cultural transformation to bring true collaboration between technology and the business and to add business value. The program has been practicing Agile for almost two years. You will learn the key elements that made it successful, as well as the areas of focus for the next level of maturity as part of continuous improvement.
  • How we scaled with a global team in a Waterfall world: organizational design, culture transformation, communication patterns and overcoming enterprise process constraints
  • How we have matured in our tool usage to further enhance our effectiveness: topics include the value of interactions over documentation, and a showcase of our defect unification, traceability, transparency and release management that ensure business value is delivered when needed.
 
 
Arthur Ryman
Distinguished Engineer
IBM Rational
 
 

Link, not Synch!

Tool integration has often been accomplished by synchronizing data between pairs of tools. This approach is problematic for several reasons. OSLC provides an alternative approach based on linking data. OSLC delegated user interface services eliminate the need to synch data in many cases. The recent OSLC Tracked Resource Set specification, combined with an RDF/SPARQL reporting architecture and OpenSocial, makes the need to synch a thing of the past.
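
As a rough illustration of the reporting side of this approach, the Python sketch below loads resources gathered from a Tracked Resource Set into a single RDF graph and answers a cross-tool question with SPARQL instead of synchronizing fields between repositories; the file names and the ex: vocabulary are illustrative placeholders, not part of any product or of the OSLC specifications:

    # Query linked lifecycle data instead of synchronizing it between tools.
    from rdflib import Graph

    warehouse = Graph()
    warehouse.parse("requirements.ttl", format="turtle")   # gathered from tool A
    warehouse.parse("defects.ttl", format="turtle")        # gathered from tool B

    # Report: which defects are linked to which requirements, across tools.
    query = """
    PREFIX ex: <http://example.com/alm#>
    SELECT ?defect ?requirement
    WHERE { ?defect ex:relatedRequirement ?requirement . }
    """
    for row in warehouse.query(query):
        print(row.defect, "is linked to", row.requirement)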
 
 
Jon Leslie
Senior Production Expert
Hansoft
 
 

Mixed Methods in a Large Scale Agile Environment

In large-scale agile environments with teams of teams working together, some structure for organizing the work is typically needed. For example, the Scaled Agile Framework (“SAFe”) provides a framework for working efficiently.

In practice, however, this is quite hard, since teams' needs tend to differ. For example, companies selling hardware products with embedded software have product programs in which hardware developers, who often prefer traditional Gantt scheduling, need to collaborate with software teams that might prefer Scrum.

This presentation is based on real-world examples in which companies have used mixed methods in a large-scale agile development environment to develop better products in a more collaborative way and thereby stay ahead of the competition.

Topics covered will be:
  • How to work with mixed methods such as Scrum, Kanban and Gantt in a single development program
  • Program-level collaboration: multiple teams working in a single program product backlog, allowing a single product release train and avoiding siloed work
  • Finally, what are relevant actionable metrics to measure to track progress and make better business decisions?

 
 
Jens Donig
Senior Consultant
HOOD
 
Martin Kuenzle
Program Manager, ALM
evosoft
 
 

Workflows à la carte – a model-based approach to the configuration of ALM systems

Wouldn’t it be great if you could adapt the appearance and behavior of your digital working environment to your current tasks with a few clicks? Customized, of course, to your personal needs and preferences, possibly including recommendations and suitable advice. For elaborate software development environments in particular, this remains a dream of the future. Integrated ALM platforms nevertheless open up new perspectives for customization. We present how to raise workflow automation from the coding level to the modeling level.

In our talk we illustrate the design of a graphical DSL for Microsoft TFS process templates. We explain the methodical approach, the selection of the modeling tool, and the design decisions that took the project to success. Complemented by code and document generators, model-based ALM takes the hassle out of design and maintenance of individual customer solutions. We demonstrate the new degrees of freedom by means of a realistic and an experimental scenario.
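
As a simplified illustration of the generator step, the Python sketch below turns a small workflow model (states and transitions) into a process-template-like XML fragment; the model, element names and attributes are deliberately simplified stand-ins rather than the actual TFS process template schema:

    # Generate a schematic work item workflow definition from a simple model.
    import xml.etree.ElementTree as ET

    model = {
        "work_item": "Defect",
        "states": ["New", "Active", "Resolved", "Closed"],
        "transitions": [("New", "Active"), ("Active", "Resolved"), ("Resolved", "Closed")],
    }

    root = ET.Element("WORKITEMTYPE", name=model["work_item"])
    workflow = ET.SubElement(root, "WORKFLOW")

    states = ET.SubElement(workflow, "STATES")
    for state in model["states"]:
        ET.SubElement(states, "STATE", value=state)

    transitions = ET.SubElement(workflow, "TRANSITIONS")
    for source, target in model["transitions"]:
        ET.SubElement(transitions, "TRANSITION", attrib={"from": source, "to": target})

    print(ET.tostring(root, encoding="unicode"))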
 
 
Mikio Aoyama
Professor
Nanzan University
 
 

PROMCODE: An Open Platform for Large-Scale Contracted Software Delivery in Software Supply Chains

Software delivery by contract between an acquirer and a supplier is commonplace. For large projects, it is not uncommon to employ a chain of multiple suppliers. In fact, Japanese companies have been using large-scale networks of suppliers for large system integration projects for many years. Some large-scale projects may include over 100 suppliers, with supply chains extending to suppliers in India, China and other countries. A major challenge in software delivery by a large number of suppliers is the project management overhead: currently, such management uses data unique to each project and/or supplier, and relies on time-consuming and error-prone manual operations.

To overcome this problem, six leading system integrators in Japan, namely IBM, Fujitsu, NEC, NTT DATA, Hitachi and Nomura Research Institute, formed a consortium called PROMCODE (PROject Management of COntracted DElivery for software supply chains). Its goal is to develop and prove an open interface specification for exchanging data with different schemas across organizational boundaries. The open interface specification is based on the OSLC framework and defines essential information as linked data. Each member company of the PROMCODE consortium ran pilot projects that applied the PROMCODE interface specification to project management data from real customer projects and validated its value. We have also enhanced the existing Eclipse Lyo OSLC adapter for spreadsheets to support the PROMCODE interface specification.

This session will introduce the background, the PROMCODE interface specification, the pilot projects by member companies, and the OSLC adapter enhancement. It will conclude with our experience and the future roadmap for extending the impact of PROMCODE in this area.
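
For readers unfamiliar with the linked-data style of exchange the consortium builds on, the Python sketch below shows how project management data from one supplier could be expressed as RDF resources that other organizations link to rather than copy; the promcode: namespace URI, property names and URLs are illustrative placeholders, not the published PROMCODE specification:

    # Express contracted-delivery management data as linked data.
    from rdflib import Graph, Literal, Namespace, URIRef

    PROMCODE = Namespace("http://example.com/promcode#")   # placeholder vocabulary

    g = Graph()
    g.bind("promcode", PROMCODE)

    work_item = URIRef("https://supplier-a.example.com/workitems/42")  # hypothetical
    g.add((work_item, PROMCODE.title, Literal("Implement payment module")))
    g.add((work_item, PROMCODE.plannedEndDate, Literal("2015-03-31")))
    # Link across the organizational boundary to the acquirer's deliverable,
    # instead of copying its data into the supplier's tool.
    g.add((work_item, PROMCODE.partOf,
           URIRef("https://acquirer.example.com/deliverables/7")))

    print(g.serialize(format="turtle"))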
 
 
Nicole Bryan
VP, Product Management
Tasktop
 
 

SAFe is Only as Strong as Your Integration Strategy

“Our modern world runs on software. In order to keep pace, we practitioners must build increasingly complex and sophisticated software systems. Doing so requires larger teams and continuously rethinking the methods and practices – part art, science, engineering, mathematics, social science – that we use to organize and manage these important activities.” (From SAFe site – Dean Leffingwell and the Scaled Agile Framework Contributors)

There is no question that Agile methodologies are no longer the exclusive domain of start-ups and small, co-located teams. Many of the largest software development organizations in the world are adopting Agile and Lean methods at enterprise scale. Some have turned to The Scaled Agile Framework (SAFe) to help them understand the best practices necessary to accomplish that.

Success with SAFe is dependent on seamless flows of information at different levels and through different parts of the organization in order to achieve agility at scale. Yet the tool infrastructure in place at large (and small) organizations tends to impede this flow of information. After all, different disciplines within the organization use different tools to manage their activities and the development artifacts they create. And rarely are these tools integrated.

We’ve discovered that SAFe is only as strong as the weakest link in your integration strategy; you need to have a solid integration strategy in order to accomplish the requisite seamless flow of information and collaboration between the practitioners and managers on the team.

The goals of SAFe are sound and attainable – if you develop a comprehensive ALM integration strategy in conjunction with your SAFe strategy.

In this session, Nicole will show you how to design and implement Software Lifecycle Integration (SLI) patterns, which form the fundamental basis for an enterprise-wide, scalable integration strategy and a necessary underpinning for SAFe.
 
 
Jeff Haynie
Co-Founder & CEO
Appcelerator
 
 

Stop Debating, Start Measuring: How User Analytics Change Lifecycle Speed and Output

Agile taught us that short, frequent release cycles are better than long, artifact-heavy ones. With the rise of mobile, a new lesson has been added to the game — namely, that measuring user sentiment is critical to keeping cycles short, teams productive, and users engaged. Strangely, many companies are content simply to wait for user feedback. The problems with this approach — it skews to polarities, capturing the sentiments only of the most thrilled or disenchanted; it’s a lagging indicator, leaving companies little time to correct problems before users move on — seem evident. But what is the alternative? In fact, there's a new breed of user analytics, one that harvests leading indicators of user experience to drive business investment and backlog prioritization. In this session, Appcelerator CEO and co-founder Jeff Haynie will investigate how traditional PC and web application analytics are evolving to meet the demands of mobile users, as well as identify the five user-based analytics no organization should be without.
 
 
Mik Kersten, Chair
CEO
Tasktop
 
 

Towards a Lean Software Lifecycle - Industry Panel

Agile development is not enough. We need to evolve beyond the efficiencies gained by small sets of developers, to a view that software development and delivery should be regarded as a first-class business process. The end-to-end software lifecycle encompasses enterprise processes spanning from business strategy to IT operations. This panel of industry experts will share how leading organizations are adopting new lean ALM practices and tools to enable a build-measure-learn loop that optimizes this business process at scales ranging from the "lean startup" to the Fortune 100.

Ken Schwaber, Scrum.org
Ken Schwaber co-developed the Scrum process with Jeff Sutherland in the early 1990s to help organizations struggling with complex development projects.
Mark Wanish, Program Director, Bank of America
Mark Wanish is a Bank of America technology executive responsible for the account opening process in the online/mobile channel.
Sam Guckenheimer, Product Owner, Microsoft Visual Studio, Microsoft
Sam Guckenheimer is the Product Owner for Microsoft Visual Studio and the author of Software Engineering with Microsoft Visual Studio Team System.
Ravit Danino, Director, Applications Product Management, HP
Ravit Danino has more than 7 years’ experience as a product management, strategy and business development professional in the enterprise software, systems and networking area.
 
 
Carson Holmes
EVP of Service Delivery
Software Development Experts
 
 

Using Delivery Intelligence from the ALM Portfolio to Enable Strategic Change

Investment in ALM technology holds the promise of improving efficiency and effectiveness across the entire IT value stream. Because integration is a key determinant of better outcomes, integration at the ALM infrastructure level has received quite a bit of attention in recent years. However, investment in ALM infrastructure, and the process enactment inherent in it, is a form of strategic change, and it is no different from other strategic initiatives when it comes to the historical success rates of large-scale capability improvement initiatives.

Motivational theory tells us that, thanks to the social dimension of the software delivery ecosystem, driving successful transformation in this space requires embracing the diversity of the modern enterprise. But this diversity goes beyond that of toolset vendor offerings; integration requires embracing the highly variable ways of working that are manifested in ALM technology. This session will explore the missing link to empowering enterprises to pursue a strategy of hybrid ALM infrastructure and hybrid methodology. The speaker will discuss integrating delivery intelligence from a broad spectrum of ALM offerings, bringing together key performance indicator data and correlating it with the way teams actually perform their work, to uncover areas for infrastructure and process improvement.
 


Lightning Sessions

 
Michael Azoff
Principal Analyst
Ovum
 
 

Challenges and opportunities in ALM-PLM integration

The world of product lifecycle management (PLM) is now aware of, and acting on, the need to integrate with ALM solutions, driven by the massive growth of embedded software in engineered products. Moreover, embedded software engineers have yet to realize the full benefits of ALM, as maturity in these industries generally lags behind that of enterprise IT, where ALM-PLM integration represents the highest maturity level. In this context, the drivers for ALM adoption are examined and the opportunities for engineering industries are highlighted. Software today is the value-add in manufactured products, and we are on the brink of an Internet of Things explosion as products and services become connected.
 
 
Ludmila Ohlsson
Strategic Product Manager
Ericsson
R&D and Test
 
 

Integration Principles and Reality

Ericsson has a heterogeneous tool landscape, and to achieve an effective flow when using the tools, integration makes sense. But how should integrations be done? The presentation introduces a few principles as part of a strategy for working with integrations and shares experiences from a number of key integration cases where the principles meet reality. From that meeting of principles and reality, Ericsson has chosen OSLC as its primary integration technology, so the presentation will also explore the reasons for, and consequences of, this choice.
 
 
Sarah Goff-Dupont
Bamboo Product Manager
Atlassian
 
 

Unleashing Agile with Git Branches

Moving to Git opens up a whole new level of agility for software teams. Freed from the clunky code freezes and monolithic mega-merges that plague centralized version control, developers can exploit the full power of branch-and-merge workflows to deliver working code faster and lay the procedural foundations for continuous delivery. For stakeholders like product owners, release engineers and business analysts, adopting the branch-and-merge model means better visibility into what work is in progress, what's ready to ship, and what's already been delivered. Attendees will learn:
  • How Git facilitates key agile concepts like building in narrow vertical slices and making releases "non-events"
  • What a branch-and-merge workflow looks like for individual developers, and for the extended product team
  • How to optimize the basic workflow for SaaS and installed applications
  • How the branch-and-merge model integrates with your existing continuous integration and code review practices
  • Trade-offs to consider when evaluating this model
This session assumes a basic understanding of version control systems and agile development processes, and is appropriate for both coders and non-coders.
 
 

Matthew McMullin
CTO
LanDesk
 
 

Fundamentals of Lean Software Delivery (From Wednesday BOF)

What is the “Heart of Lean”? The North Star Discussion. And more...
 
 
 
 