
Institute of Information and Communication Technology (IICT)
Bangladesh University of Engineering and Technology (BUET)
Course: ICT-6010, Title: Software Engineering and Application Development
Class handouts, Lecture-2


Software engineering process

The software process is becoming a big issue for companies that produce software. As a consequence, the software process is becoming more and more important for permanent employees, long-term practitioners and short-term consultants in the software industry. A process may be defined as a method of doing or producing something. Extending this to the specific case of software, we can say that a software process is a method of developing or producing software.

In the past, processes for the development of software have been highly dependent on the individual developer. This can lead to three key problems:

(i) Such software is very difficult to maintain. Imagine that the software developer has fallen under a bus, and somebody else takes over the partially completed work. Maybe there is a plan, with individual tasks mapped out and those that have been completed neatly marked, or maybe the plan only exists in the developer's head. In any case, a replacement employee ends up starting from scratch, because however good the person's work, the replacement has no clue where to start. The process may be superb, but it is an ad-hoc process, not a defined process.

(ii) It is very difficult to gauge the quality of the finished product accurately according to any independent assessment. If we have two developers, each working according to their own process and defining their own tests along the way, we have no objective method for comparing their work either with each other or, more importantly, with a customer's quality criteria.

(iii) There is a huge overhead involved as each individual works out their own way of doing things in isolation. To avoid this we must find some way of learning from the experience of others who have already trodden the same road.

So, it is important for each organization to define the process for a project. There are four fundamental process activities which are common to all software processes:

(a) Software specification: the functionality of the software and constraints on its operation must be defined.
(b) Software development: the software to meet the specification must be produced.
(c) Software validation: the software must be validated to ensure that it does what the customer wants.
(d) Software evolution: the software must evolve to meet changing customer needs.

There is no such thing as a wrong or a right software process. Different software processes decompose these activities in different ways. The timing of the activities varies, as do the results of each activity. Different organizations use different processes to produce the same type of product, and different types of product may be produced by one organization using different processes. However, some processes are more suitable than others for a given type of application. If the wrong process is used, this will probably reduce the quality or the usefulness of the software product to be developed.

The purpose of the process

What do we want our process to achieve? We can identify the following key goals in this respect:

Effectiveness: An effective process must help us to produce the right product. It does not matter how elegant and well written the software is, nor how quickly we have produced it; if it is not what the customer wanted or required, it is no good. The process should therefore help us to determine what the customer needs, produce what the customer needs, and, crucially, verify that what we have produced is what the customer needs.

Maintainability: However good the programmers, things will still go wrong with the software. Requirements often change between versions. In any case, we may want to reuse elements of the software in other products. None of this is made any easier if, when a problem is discovered, everybody stands around scratching their heads saying "Oh dear, the person that wrote this left the company last week" or, worse, "Does anybody know who wrote this code?" One of the goals of a process is to expose the designers' and programmers' thought processes in such a way that their intention is clear. Then we can quickly and easily find the remedy for faults or work out where to make changes.

Predictability: New product development needs to be planned, and those plans are used as the basis for allocating resources: both time and people. It is important to predict accurately how long it will take to develop the product. A good process will help us to do this, because it lays out the steps of development. Furthermore, consistency of process allows us to learn from the designs of other projects.

Quality: Quality in this case may be defined as the product's fitness for its purpose. One goal of a defined process is to enable software engineers to ensure a high quality product. The process should provide a clear link between a customer's desires and a developer's product.

Improvement: No one would expect their process to reach perfection and need no further improvement. Even if we were as good as we could be now, both development environments and requested products are changing so quickly that our processes will always be running to catch up. A goal of our defined process must therefore be to identify and prototype possibilities for improvement in the process itself.

Tracking: A defined process should allow the management, developers and customer to follow the status of the project. Tracking is the evaluation of predictability: it keeps track of how good our predictions are and hence how to improve them.

Relationship among software principles, techniques, methodologies and tools

Software principles deal with both the process and the final product. The right process will help to produce the right product, but the desired product will also affect the choice of which process to use. A traditional problem in software engineering has been the emphasis on either the process or the product to the exclusion of the other; both are important. The principles are general enough to be applicable throughout the process of software construction and management. Principles, however, are not sufficient to drive software development. In fact, they are general and abstract statements describing desirable properties of the software process and products. To apply principles, the software engineer should be equipped with appropriate methods and specific techniques that help incorporate the desired properties into the processes and products.

Software engineering methods provide the technical how-tos for building software. Methods encompass a broad array of tasks that include requirements analysis, design, program construction, testing, and support. Software techniques are more technical and mechanical than methods; often, they also have more restrictive applicability. In general, the difference between a method and a technique is not sharp. Sometimes, methods and techniques are packaged together to form a methodology. The purpose of a methodology is to promote a certain approach to solving a problem by preselecting the methods and techniques to be used.

Software engineering tools provide automated or semi-automated support for the application of techniques, methods and methodologies. When tools are integrated so that information created by one tool can be used by another, the result is a system for the support of software development called Computer Aided Software Engineering (CASE). CASE combines software, hardware and a software engineering database (a repository containing important information about analysis, design, program construction and testing) to create a software engineering environment.

Item          | Example
Principles    | Modularity, abstraction, anticipation of change, separation of concerns, generality, incrementality, etc.
Methodologies | Waterfall model, evolutionary model, transformation model, spiral model, etc.
Methods       | Different approaches for software design, software specification, verification, etc.
Techniques    | Modularity techniques: top-down, bottom-up, stepwise refinement, etc. Operational specification: data flow diagram, finite state machine, Petri net, etc.
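To make the techniques row above more concrete, the following sketch (a hypothetical Python example, not part of the original handout) illustrates top-down, stepwise refinement: the top-level function states the task in three coarse steps, and each step is then refined into a small, self-contained function.

# Hypothetical illustration of top-down stepwise refinement:
# the top-level function states WHAT is done; each helper refines HOW.

def generate_report(raw_records):
    """Top level: the problem stated as three coarse steps."""
    cleaned = clean_records(raw_records)
    totals = summarize(cleaned)
    return format_report(totals)

def clean_records(records):
    """Refinement 1: drop incomplete records and normalize names."""
    return [
        {"name": r["name"].strip().title(), "amount": float(r["amount"])}
        for r in records
        if r.get("name") and r.get("amount") is not None
    ]

def summarize(records):
    """Refinement 2: accumulate totals per name."""
    totals = {}
    for r in records:
        totals[r["name"]] = totals.get(r["name"], 0.0) + r["amount"]
    return totals

def format_report(totals):
    """Refinement 3: produce a printable report, one line per name."""
    return "\n".join(f"{name}: {amount:.2f}" for name, amount in sorted(totals.items()))

if __name__ == "__main__":
    sample = [{"name": " alice ", "amount": "10.5"}, {"name": "bob", "amount": "4"}]
    print(generate_report(sample))

Each level of the refinement can be reviewed and tested on its own, which is exactly the benefit the modularity principle promises.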

As technology evolves, software engineering tools will evolve. Methods and techniques will evolve too, although less rapidly than tools, as our knowledge of software increases. Principles will remain more stable; they constitute the foundation upon which all the rest may be built.

The software production process model

Production and manufacturing processes are studied extensively in any discipline whose goal is to produce products. The goal of a production process is to make production reliable, predictable and efficient. A well defined production process, as used, for example, in automobile production, has many benefits, including supporting automation and the use of standard components and processes. To solve actual problems in an industry setting, a software engineer or a team of engineers must incorporate a development strategy that encompasses the process, methods and tools. This strategy is often referred to as a process model or a software engineering paradigm. By defining a model of the software production process, we can reap some of the benefits of a standard process. But we must also keep two distinguishing characteristics of software in mind. First, software production is largely an intellectual activity, not easily amenable to automation. Second, software is characterized by high instability: requirements change constantly and, as a consequence, the product must be evolvable.

Boehm [1988] states that the goals of structured process models are to determine the order of the stages involved in software development and evolution, and to establish the transition criteria for progressing from one stage to the next. Thus a process model addresses the following software project questions: (i) What should we do next? (ii) How long shall we continue to do it? According to this viewpoint, process models have a two-fold effect: on the one hand, they provide guidance to the software engineer; on the other hand, they support program development and maintenance, in that they enable us to estimate and define intermediate milestones.

A software engineering process is chosen based on the nature of the project and application, the methods and tools to be used, and the controls and deliverables that are required. A variety of different process models for software engineering have evolved. Each represents an attempt to bring order to an inherently chaotic activity. It is important to remember that each of the models has been characterized in a way that ideally assists in the control and coordination of a real software project.

The software production process models are also known as software life cycle models. The term software life cycle is used to describe the period of time that starts with the software system being conceptualized and ends with the software system being discarded after usage. The following are the widely used life-cycle models:

1. Waterfall model
2. Prototyping model
3. Incremental model
4. Rapid application development (RAD) model
5. Transformation model
6. Spiral model

The waterfall model

The waterfall model was derived from engineering models to put some order into the development of large software products. It consists of different stages which are processed in a linear fashion. The model is shown in the figure below:

A brief description of the phases in the waterfall model is as follows:

Requirements is the phase where the "what" of the software system is defined. It involves extensive user participation and ends with an approved set of requirements documented in a software requirements specification (SRS) document. The SRS forms the basis of all further work in the project.

The "how" of the system is defined in the design phase. Here, the problem definition specified in the SRS is translated into a design which will solve the problem. The design phase is the bridge between what the user wants and the code that will be created to satisfy the requirements.

The code is written during the construction phase. During this stage, the software design is realized as a set of programs or program units.

The code that is developed is tested during the testing phase. This involves unit testing for the lowest level components, integration testing for groups of components and testing of the system as a whole. The last activity in testing is usually acceptance testing. Acceptance testing validates that the system fulfils the users' needs and requires considerable user involvement.

The system is finally made operational in the deployment phase. The main activities in this phase include training of the users, installation of the system, and switchover from the existing (manual, semi-automated or automated) system to the new system.

The system thereafter rolls into the operation and maintenance phase. During this phase leftover defects are corrected and, if required, the system is made more efficient, new requirements are added, and existing functionality and features are modified to meet the ever changing business needs.

Evolution of the waterfall model

In the early days of computing, software development was mainly a single-person task. The problem to be solved, very often of a mathematical nature, was well understood, and there was no distinction between the programmer and the end user of the application. The end user, very often a scientist or an engineer, developed the application as a support to his or her activity. The application, by today's standards, was rather simple. The following figure depicts the evolution of software models.

The model used in these early days may be called the code-and-fix model. Basically, software development was seen essentially as a programming activity followed by an implementation activity, with other tasks seen as extensions of programming and implementation. This led to a considerable amount of bug fixing during implementation, resulting in delays and cost overruns. To overcome these problems, developers started performing testing as an activity between programming and implementation. Over time, requirements, design and development were identified as important activities and converted into phases.

The waterfall model has the following advantages:
(i) The model has well defined phases with well defined inputs.
(ii) It recognizes the sequence of software engineering activities that shall result in a software product.

The main disadvantages of this model are:
(i) Real projects rarely follow the sequential flow that the model proposes.
(ii) The model assumes that requirements are clearly specified at the beginning of the project. The model has no mechanism to handle changes to the requirements that are identified because of design and construction activities or user feedback.
(iii) The model reduces the users' involvement between the design and testing phases of the project. This creates a gap and reduces the users' feeling of ownership, as the model provides no forum for active participation from the users during the intermediate phases of the life cycle.
(iv) For large projects, the users have to wait a long time for the delivery of the system. A working version of the program(s) will not be available until late in the project time span. Requirements may change, and the system may even become redundant by the time it is delivered.
(v) The assumption that all requirements have to be known at the beginning in sufficient detail leads to premature decisions, because it is difficult to estimate resources accurately when only limited information is available. It is often difficult for the user/customer to state all the requirements explicitly, yet this model requires it, and it is difficult to accommodate the natural uncertainty that exists at the beginning of any project.

Waterfall model with iteration

This model recognizes the fact that the development team may have to go back and perform some of the activities of a previous phase as more clarity comes in a later phase.

Despite its well known disadvantages, the waterfall model remains popular because it recognizes the inherent sequence in software engineering activities. Other models have been designed to remove the disadvantages of this model.

The V-model

The waterfall model tends to view testing as a single phase in its life cycle. The V-model attempts to give increased importance to testing related activities by dividing the life cycle into development and testing phases. The model relates each development phase to its associated testing phase, and work on a testing phase is carried out in parallel with the corresponding development phase. For example, acceptance test planning activities of the acceptance testing phase may be carried out along with the requirements phase. Similarly, system test planning activities of the system testing phase need not wait for unit testing to be completed.
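As a rough illustration of planning test activities alongside the corresponding development phase, the sketch below (a hypothetical Python example; the requirement, the function withdraw() and its behaviour are invented for illustration) expresses one requirement from an imaginary SRS as an automated acceptance check that can be drafted while the requirement is being specified, before any code exists.

import unittest

# Hypothetical requirement R-3.1 from an imaginary SRS:
# "A withdrawal larger than the current balance must be rejected."
# The test is drafted during the requirements phase; the function
# withdraw() is implemented later, during construction.

def withdraw(balance, amount):
    """Minimal implementation added later so the early test can run."""
    if amount > balance:
        raise ValueError("insufficient funds")
    return balance - amount

class TestRequirementR31(unittest.TestCase):
    def test_withdrawal_exceeding_balance_is_rejected(self):
        with self.assertRaises(ValueError):
            withdraw(balance=100, amount=150)

    def test_valid_withdrawal_reduces_balance(self):
        self.assertEqual(withdraw(balance=100, amount=40), 60)

if __name__ == "__main__":
    unittest.main()

Drafting such checks early is exactly the kind of test-plan preparation the V-model encourages during the specification phases.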

The V-model has the advantage that testing activities are explicitly emphasized. The V-model also links the testing activities with the corresponding specification activities and encourages the preparation of test plans early in the life cycle.

Prototype model

It is recognized that defining requirements is, in many situations, particularly tricky. In such situations, detection of missing or incorrect requirements at a later stage is extremely expensive in terms of delays in delivery. It is also recognized that software projects often take a long time and that requirements change over time. In such situations the prototype model is more suitable.

The prototyping paradigm begins with requirements gathering. Developers and the customer meet and define the overall objectives for the software, identify whatever requirements are known, and outline areas where further definition is mandatory. A quick design then occurs. The quick design focuses on a representation of those aspects of the software that will be visible to the customer/user (i.e., input approach and output format). The quick design leads to the construction of a prototype. The prototype is evaluated by the customer/user and used to refine the requirements for the software to be developed. Iteration occurs as the prototype is tuned to satisfy the needs of the customer, while at the same time enabling the developer to better understand what needs to be done.
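The "quick design" idea can be pictured with a small sketch like the one below (hypothetical Python, assuming an imaginary order-entry system under discussion). It mocks up only the user-visible input prompts and output format so the customer can react to them; validation and persistence are deliberately left out, as they would be in a throwaway prototype.

# Hypothetical throwaway prototype: shows only the user-visible
# input/output of an imagined order-entry screen. No real checks,
# no database - just enough for the customer to react to the format.

def prototype_order_entry():
    print("=== New Order (PROTOTYPE) ===")
    customer = input("Customer name : ")
    item = input("Item code     : ")
    qty = input("Quantity      : ")
    # Canned price instead of a price lookup - this is only a mock-up.
    price = 9.99
    total = float(qty) * price
    print("\n--- Order summary (proposed output format) ---")
    print(f"Customer : {customer}")
    print(f"Item     : {item}  x {qty} @ {price:.2f}")
    print(f"Total    : {total:.2f}")

if __name__ == "__main__":
    prototype_order_entry()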

There are two approaches to prototyping: the evolutionary approach and the throwaway approach. In the evolutionary approach the prototype is built in such a way that it can be used later in the construction phase. The main benefit of the evolutionary approach is that a considerable part of the system is developed during the requirements and design stages. However, the evolutionary approach slows down the requirements definition process and also diverts the attention of the developers to internal details such as coding standards. In the throwaway approach to prototyping, the use of the prototype ends once all the requirements are clearly documented; the prototype is not used for the actual construction of the system. A typical characteristic of such an approach is that the prototype is built using an environment and tools different from those that will be used to build the system.

The prototype model has certain advantages and is particularly useful where:
- Users are unable to specify their requirements, or have no previous knowledge or experience of computers and therefore are unable to envisage the features they need.
- The proposed system has a considerable user interface component.
- The development environment supports the quick creation of prototypes.
- The proposed system has complex algorithms or outputs that need to be refined.
- Improved communication between the developers and the end users is needed.

The prototype model, however, has some disadvantages:
- Users, on seeing a working prototype, often start expecting the actual system to be ready very soon after.
- Where the prototype is not representative enough, users can get disappointed with it and lose interest in the system being developed.
- Prototypes are made in a hurry, often without evaluating all options or understanding the full implications of the technical choices made.

Incremental model

In very large systems, users typically do not want to wait for years before they see the whole system. The incremental model phases out deliveries by increments, after an initial information strategy planning. Here, the first increment is the core product, i.e. the most important functionality of the system. That is, the basic requirements are addressed, but many supplementary features (some known, others unknown) remain undelivered. The core product is used by the customer (or undergoes detailed review). As a result of use and/or evaluation, a plan is developed for the next increment. The plan addresses the modification of the core product to better meet the needs of the customer and the delivery of additional features and functionality. The process is repeated following the delivery of each increment, until the complete product is produced. The incremental model is an evolutionary approach and is iterative in nature; but unlike prototyping, it focuses on the delivery of an operational product with each increment.
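To make the notion of increments concrete, the following sketch (a hypothetical Python example, not from the handout) shows a core product delivered first and two later increments that each add functionality on top of it without discarding what was already delivered.

# Hypothetical illustration of incremental delivery:
# increment 1 is the usable core; later increments extend it.

class LibraryCatalogue:
    """Increment 1 (core product): record books and look them up."""
    def __init__(self):
        self.books = {}          # isbn -> title

    def add_book(self, isbn, title):
        self.books[isbn] = title

    def find(self, isbn):
        return self.books.get(isbn)

class CatalogueWithLoans(LibraryCatalogue):
    """Increment 2: lending, added after the core is already in use."""
    def __init__(self):
        super().__init__()
        self.on_loan = set()

    def lend(self, isbn):
        if isbn in self.books and isbn not in self.on_loan:
            self.on_loan.add(isbn)
            return True
        return False

class CatalogueWithReservations(CatalogueWithLoans):
    """Increment 3: reservations for books currently on loan."""
    def __init__(self):
        super().__init__()
        self.reservations = {}   # isbn -> list of member names

    def reserve(self, isbn, member):
        self.reservations.setdefault(isbn, []).append(member)

if __name__ == "__main__":
    catalogue = CatalogueWithReservations()
    catalogue.add_book("978-0", "Software Engineering")
    catalogue.lend("978-0")
    catalogue.reserve("978-0", "Alice")
    print(catalogue.find("978-0"), catalogue.reservations)

Each class boundary corresponds to a delivery: the core catalogue is operational on its own, and later increments only extend it.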

Advantages:
- Early delivery of parts of the functionality for use.
- Feedback from the live usage of the delivered increments. This feedback can be used to prevent similar issues in later deliveries.
- High priority features are incorporated in early deliveries.
- Re-prioritization is possible in the course of the project.

Disadvantages:
- Total development costs are higher.
- The total time period for delivery of the entire functionality is higher.
- The planning of the delivery increments is critical to success; wrong planning can result in disaster.

Rapid Application Development (RAD) model

RAD is an incremental software development process model that encompasses an extremely short development cycle. The RAD model is a high speed adaptation of the linear sequential model in which rapid development is achieved by using component based construction. If requirements are well understood and project scope is constrained, the RAD process enables a development team to create a fully functional system within a very short period (e.g., 60 to 90 days). Executing a project using the RAD approach requires the creation of RAD teams to handle each identified application, and is possible only for projects where such modularization is possible and meaningful. The technique is relevant for environments that use fast development tools, such as 4GL environments, and where reuse of software components is possible. RAD is suitable for information system applications. RAD does not work well where the modules have heavy interaction or performance tuning is critical.

The typical activities of RAD are divided into four phases:

Requirements planning phase: End users and developers form joint application development teams and participate in workshops where they review the RAD methodology and prepare for the next phase. End users and developers come to understand the boundaries of the application and the problems they are trying to solve.

User design phase: This consists of gathering high level and detailed requirements, followed by information system modeling and customer review. Development and prototyping tools are identified and assembled; a prototype is developed, evaluated and modified until everyone agrees that the prototype fulfills the user needs.

Construction phase: The developers refine the prototype to create the production version of the system. This includes the addition of missing validation, calculation, processing and interfaces. The user manual and the documentation required to maintain the system are also developed in this phase.

Cutover phase: This involves the installation of the new system after testing the system with real data.

Advantages of the RAD model are:
(i) It reduces the amount of time taken to develop database applications that have extensive user interfaces.
(ii) Users participate in testing, and development becomes an iterative process of refining successive versions of the application.
(iii) RAD facilitates communication, so developers can build better applications in the future.

Disadvantages:
(i) It is difficult to achieve consistency within and between applications developed by different teams.
(ii) Design and coding standards are difficult to implement and often ignored.
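The component-based construction that RAD relies on can be pictured roughly as follows (a hypothetical Python sketch; the "components" here are ordinary functions standing in for prebuilt, reusable parts such as form generators and report writers). The application itself is little more than glue that wires existing pieces together.

# Hypothetical sketch of component-based assembly in the RAD spirit:
# prebuilt, reusable "components" are combined into a small application.

def input_form(fields):
    """Reusable component: collect values for the named fields."""
    return {f: input(f"{f}: ") for f in fields}

def validate_non_empty(record):
    """Reusable component: reject records with blank fields."""
    return all(value.strip() for value in record.values())

def tabular_report(records, columns):
    """Reusable component: render records as a plain-text table."""
    header = " | ".join(columns)
    rows = [" | ".join(r[c] for c in columns) for r in records]
    return "\n".join([header] + rows)

def customer_contact_app():
    """The 'new' application: little more than glue between components."""
    record = input_form(["name", "phone", "city"])
    if not validate_non_empty(record):
        print("All fields are required.")
        return
    print(tabular_report([record], ["name", "phone", "city"]))

if __name__ == "__main__":
    customer_contact_app()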

Transformation model

In this model, informal requirements are analyzed and then formalized using formal methods; this may take several steps. Once the requirements are entirely formalized, they are translated into a program. An ideal transformation-based process model is shown in the figure. The process consists of two main stages: requirements analysis and optimization. Requirements analysis provides formal requirements, which are fed into the optimization process that does the performance tuning, until we reach a satisfactory, optimized result. The transformation process is controlled by the software engineer and may take advantage of the availability of reusable components. Reusable components may take the form of modules to be included in the application. Before being transformed, the specification is verified against the users' expectations to check whether it captures the real requirements. The transformation-based model is supported by a suitable computer aided software engineering environment. The environment provides tools for verifying requirements, handling reusable components, performing optimization, and storing the history of the development of the software.
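A toy illustration of working from a formal specification is given below (a hypothetical Python sketch, not from the handout): the requirement "the output is a sorted permutation of the input" is written as an executable predicate, and an implementation is checked against that specification rather than against informal prose.

# Hypothetical illustration: a formal (executable) specification of
# sorting, and an implementation checked against the specification.

from collections import Counter

def satisfies_sort_spec(inp, out):
    """Formal spec: out is ordered AND is a permutation of inp."""
    ordered = all(out[i] <= out[i + 1] for i in range(len(out) - 1))
    permutation = Counter(out) == Counter(inp)
    return ordered and permutation

def insertion_sort(xs):
    """One implementation derived from the specification."""
    result = []
    for x in xs:
        i = 0
        while i < len(result) and result[i] <= x:
            i += 1
        result.insert(i, x)
    return result

if __name__ == "__main__":
    data = [5, 1, 4, 1, 3]
    output = insertion_sort(data)
    print(output, satisfies_sort_spec(data, output))   # [1, 1, 3, 4, 5] True

In a genuine transformation environment the implementation would be derived from the specification by recorded transformation steps; here the predicate only illustrates what "formal requirements" means in practice.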

In the waterfall model, experience has shown that, since changes are not anticipated, they are often treated as emergency repairs: they are performed under strong pressure from the customers and management and within strict time constraints. As a consequence, programmers tend to make changes only by modifying the code, without propagating the effects of those changes back to the specifications. Thus specification and implementation gradually diverge, making future changes to the application more difficult to perform. Updating the affected requirements and design specifications is also difficult, because these are usually documented textually, and such changes are difficult to make and trace back.

The situation is quite different in the transformation-based approach. Since the history of the development of the software, along with the rationale of every step, is recorded by the support environment, the programmer may be forced to forgo changing the code directly, and instead start re-transforming from the appropriate intermediate step of the history.

Unfortunately, at present, the transformation approach is not a practical paradigm for the software production process. It is still a research approach and only experimental environments are available to support it. The transformation approach has been studied for small programs as a dual method to proving program correctness. Program correctness proofs represent an analytic, mathematically based approach: they provide a formal framework for analyzing program correctness after the program is developed.

The spiral model

The spiral model was developed by Barry Boehm. This model emphasizes continuous reassessment of plans and risks and combines the iterative and sequential approaches. The goal of this model is to provide a framework for designing such processes, guided by the risk levels in the project at hand. As opposed to the previously presented models, the spiral model may be viewed as a meta-model, because it can accommodate any process development model. By using it as a reference, one may choose the most appropriate development model (e.g., prototyping versus waterfall). The guiding principle behind such a choice is the level of risk; accordingly, the spiral model provides a view of the production process that supports risk management. Risks are potentially adverse circumstances that may impair the development process and the quality of the product. Boehm [1989] defines risk management as a discipline whose objective is to identify, address and eliminate software risk items before they become either threats to successful operation or a major source of expensive software rework. The spiral model focuses on identifying and eliminating high-risk problems by careful process design, rather than treating both trivial and severe problems uniformly.
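Boehm's risk management is commonly operationalized by ranking risk items by risk exposure, i.e. the probability of an unsatisfactory outcome multiplied by the loss if it occurs. The sketch below (hypothetical Python, with invented figures) ranks a few typical risk items this way so that each cycle of the spiral can concentrate on the highest-exposure items first.

# Hypothetical risk ranking in the spiral-model spirit:
# exposure = probability of the problem occurring x loss if it occurs.

risks = [
    # (risk item, probability, estimated loss in person-days)
    ("Key requirement misunderstood", 0.4, 60),
    ("Interface to legacy system unstable", 0.2, 90),
    ("New team member unfamiliar with tools", 0.7, 10),
]

def exposure(probability, loss):
    """Risk exposure: expected loss from this risk item."""
    return probability * loss

# Rank items so the next spiral cycle addresses the riskiest first.
ranked = sorted(risks, key=lambda r: exposure(r[1], r[2]), reverse=True)

for item, p, loss in ranked:
    print(f"{item:40s} exposure = {exposure(p, loss):5.1f} person-days")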


The main characteristic of the spiral model is that it is cyclic, not linear like the waterfall model. Each cycle of the spiral consists of four stages, and each stage is represented by one quadrant of the Cartesian diagram. The radius of the spiral represents the cost accumulated so far in the process; the angular dimension represents the progress in the process.

Stage 1 identifies the objectives of the portion of the product under consideration, in terms of the qualities to achieve. Furthermore, it identifies the alternatives, such as whether to buy, design or reuse any of the software, and the constraints on the application of the alternatives.

Stage 2 evaluates the alternatives; potential risk areas are identified and dealt with. Risk assessment may require different kinds of activities to be planned, such as prototyping or simulation.

Stage 3 consists of developing and verifying the next level of the product; again, the strategy followed by the process is indicated by the risk analysis.

Finally, stage 4 consists of reviewing the results of the stages traversed so far and planning for the next iteration of the spiral, if any.

The advantages of the spiral model are:
- Risk analysis is an integral part of this model and is performed in every spiral, leading to increased confidence in the project.
- The model is flexible and can be tailored for a variety of situations, such as reuse, component based development and prototyping.
- It combines the best features of the waterfall and prototyping models.

Disadvantages:
- It is complicated, and unsuitable for small projects where risks are modest.
- It requires considerable risk assessment expertise.
- If a major risk is not discovered, problems will undoubtedly occur.

An assessment of process models

Our description of software process models has followed the actual historical evolution of these models: from the unstructured code-and-fix model to the waterfall model, to evolutionary models, and finally the spiral model. The driving force behind this evolution was the recognition of the weaknesses in the existing models and the desire to devise the most effective process to achieve the qualities required for the application at hand.

The code-and-fix model can actually be considered no model at all. It consists of following the inspiration and the need of each particular moment, without carefully thinking about and planning the entire process beforehand. Compared with the code-and-fix model, the waterfall model falls into the other extreme. It is rigid, prespecified, nonadaptive, and monolithic. In its practical application it is documentation driven: the documentation produced at the end of each phase measures the progress of the process. Generally, this documentation is voluminous but totally passive, making changes difficult to apply as the application enters the maintenance stage and causing the documentation to diverge from the implementation.

If the waterfall life cycle model is documentation driven, we may characterize the evolutionary approach as increment driven: progress through the evolutionary process is marked by the development and possible delivery of increments. The transformational model can be called specification driven, as the development process occurs through the iterative refinement of formal specifications. Finally, as we saw, the spiral model is a meta-model that may be called risk driven.

So far, there has been little detailed comparison of the various models. Some initial experiments led by Boehm explored the productivity of a waterfall-based life cycle compared with that of an evolutionary life cycle based on prototyping and the use of a 4GL, in the area of interactive end-user applications. The results showed that the waterfall approach addressed product and process control risks better, whereas the prototyping approach addressed user interface risks better. The prototyping approach also avoided the risk of spending too much time on not-so-important aspects and helped to concentrate attention on the relevant issues and risks. In addition, both projects had roughly equivalent productivity in terms of their rates of delivered source instructions. They also had comparable performance, but the evolutionary approach took about 40% less development time and resulted in a product with roughly 40% fewer source instructions. The waterfall-based process had fewer problems in debugging and integration, due to its more thought-out design.

The involvement of the end user in the software development process has become a major factor characterizing the evolution of software engineering in recent years. The availability of high level languages and tools has encouraged a more exploratory style of work than that permitted by the waterfall life cycle. Today, the end user may become one of the partners in the process and even a principal driving force behind it.

Another culture that has emerged more recently emphasizes the production of higher quality products that satisfy their expected users, through the adoption of evolutionary approaches. In this culture, product quality comes first and process quality is in some sense ancillary to it. This culture also emphasizes the production of tools and libraries, through which specific applications can be constructed quickly, reliably and economically. The cost effectiveness of this approach is measured on a medium to long term time scale; it is based on investments that progressively enrich the set of available tools and components. As a new application is required, the library is first searched to determine whether something is available for reuse. If the search is successful, the application is constructed by assembling existing components. It may happen that these components only allow the development of a prototype, which is refined later to produce the final product. If so, new components are evolved from the existing ones, and these new components may in turn become part of the library.

