
1. INTRODUCTION

1.1 Introduction of Project
Competition in today's age has a new face: agility. Companies need to act and react faster to a rapidly changing business environment, and smart information management is the key to a company's growth. Purchasing is a strategic function in any company, and a purchase management system is software that automates the entire purchasing cycle for goods and services, resulting in greater efficiency and savings of time and money. The Online Purchase Management System is an enterprise-wide single application consisting of solutions for all functional areas of a purchasing enterprise, and it can help companies meet their financial and business objectives. Online PMS contains detailed reports that keep managers informed throughout the supply chain management process, and it manages the purchasing process in a purely paperless mode.

1.2 Title of the Project


Online Purchase Management System

1.3 Purchase Management System under ERP


Enterprise Resource Planning software, or ERP, doesn't live up to its acronym. Forget about planning - it doesn't do that - and forget about resource, a throwaway term. But remember the enterprise part: that is ERP's true ambition. It attempts to integrate all departments and functions across a company onto a single computer system that can serve each department's particular needs. That is a tall order: building a single software program that serves the needs of people in finance as well as it does the people in human resources and in the warehouse. Each of those departments typically has its own computer system, each optimized for the particular ways that the department does its work, but ERP combines them all into a single, integrated software program that runs off a single database, so that the various departments can more easily share information and communicate with each other.

That integrated approach can have a tremendous payback if companies install the software correctly. The project Online Purchase Management System works on a local area network to ease the purchase department's task of maintaining the purchase details of the various departments in any organization. It includes the following functions:
- Vendor Management
- Item Details
- Requisition Entry and Approval
- Enquiry Generation
- Quotation Entry and Approval
- Purchase Order Generation
- Purchase Order Close
- Report Generation

Scope of the Project
- Entry of requisitions by different departments.
- Approval of requisitions by the administrator, which are then sent as enquiries to different vendors.
- Registration of vendors for different items.
- Quotation generation based on the received enquiry.
- Approval of quotations.
- Generation of purchase orders.
- Dispatch of purchase orders.
- Closing of purchase orders.
This software can run either on a standalone machine or on a network, so a number of users can access the data simultaneously. Hard copies of various reports can be generated.

Security features are implemented: only the administrator can view and send the enquiries, quotations, purchase orders, etc.

1.4 Project Objectives


The main objective of developing the project is to make the purchase management system simple and easy, and to increase the productivity of managers in taking decisions, because all the information is available in an organized form. The software provides a user-friendly interface and reduces data redundancy. Centralized information is available and can be accessed by a number of users. Another objective of software project planning is to provide a framework that enables the manager to make reasonable estimates of resources and schedule. These estimates are made within a limited time at the beginning of a software project and should be updated regularly as the project progresses. The most important objectives of the system are the following:

1) Capability
Business activities are influenced by an organization's ability to process information quickly and efficiently. The Purchase Management System adds capability in the following ways:
- Improved processing speed: the inherent speed with which computers process information is one reason why organizations seek the development of system projects.
- Faster retrieval of information: locating and retrieving information from storage, and the ability to conduct complex searches.

2) Control
- Greater accuracy and consistency: carrying out computing steps, including arithmetic, correctly and consistently.
- Better security: safeguarding sensitive and important information in a form that is accessible only to authorized persons.

3) Communication
- Enhanced communication: speeding the flow of information and messages between remote locations as well as within offices. This includes the transmission of documents within offices.
- Integration of business areas: coordinating business activities taking place in separate areas of an organization through the capture and distribution of information.

4) Cost
- Monitor costs: tracking the purchase process and overheads is essential to determine whether a firm is performing in line with expectations and within budget.
- Reduce costs: using computing capability to process at a lower cost than is possible with other methods, while maintaining accuracy and performance levels.

1.5 Proposed System


The proposed system is an information system designed to replace the existing manual information system. It has the following key features:
- Reduced paperwork and increased automation
- Very fast processing
- Efficient management of information
- Improved security
- Data safety through redundancy
- Quick response to ad hoc queries
- Integrity of the data is maintained
- Transparency in the system

1.6 Introduction of Organisation

e.Soft Technologies Limited is a software development and business process consulting company providing business process re-engineering consultancy and services, enterprise solutions, ERP, engineering services, e-business intelligence, data warehousing, e-commerce solutions and CAD solutions. e.Soft was incorporated with the prime objective of providing on-site and off-site professional services specializing in system integration, application development, CAD and web services.

2. SYSTEM ANALYSIS
2.1 Identification of Need
If the system to be developed is complex in nature, the goals of the entire system cannot be easily comprehended. Hence the need for a more rigorous system analysis phase arises.

[Figure 1: The system analysis process - users, developers and managers generate the request; the problem statement and user interviews, combined with domain experience, are used to build the object model and the functional model.]

2.1.1 Problem Analysis
The basic aim of problem analysis is to obtain a clear understanding of the needs of the clients and the users, what exactly is desired from the software, and what the constraints on the solution are. Analysis leads to the actual specification.

2.1.2 Problem Analysis Approaches
There are three basic approaches to problem analysis:
- Informal approach
- Conceptual modeling-based approach
- Prototyping approach

In this project we use the structured analysis technique to understand the exact requirements of the organization. In this technique we have divided the main problem into two sub-problems and solved them separately: one is Master Creation (the master database module) and the other is Transactions.

2.2 Preliminary Investigation


The preliminary investigation starts as soon as someone, either a user or a member of a particular department, recognizes a problem or initiates a request to modify the current computerized system or to computerize the current manual system. An important outcome of the preliminary investigation is determining whether the system is feasible or not. The manual work being done in each department of the organization needs to be computerized for quick and better results with minimum delay. The software is developed in such a way as to ensure good time utilization, smooth functioning and a cohesive relationship between the various activities. The design aims to reduce the effort involved in processing the activities of various people, thus diminishing the inconvenience to users and increasing system efficiency and reliability. The goal is to provide a user-friendly automated system which offers efficiency, security of data and error-free outputs. The system should be capable of managing all order forms, the details of the departments' requirements, and the details of the vendors who supply items.

The aim is to perform all transactions involving people and the organization using the Internet, with other channels of transactions taking a supportive role. The system should allow users to negotiate the price, quantity, mode of payment, mode of dispatch, etc.; receive quotations for items and services; and provide various reports. The purchase module is responsible for maintaining records for vendors and items, entering requisitions, and generating purchase orders. It provides the vendor-item relationship after the enquiry of items and quotation approval, and it also maintains pending requisitions and pending purchase orders. The main objective of developing the project is to make the Purchase Management System simple and easy and to increase the productivity of managers in taking decisions, because all the information is available in an organized form. The software provides a user-friendly interface and reduces data redundancy, and centralized information is available which can be accessed by a number of users.

2.2.1 Drawbacks in the current system
There is no existing computerized system; all the work is performed manually, and the manual system is not secure enough. The current system has a lot of problems, which are as follows:
- It is difficult to locate particular information regarding items, vendors, requisitions, enquiries, quotations, purchase orders, etc.
- Comparison and approval of quotations is difficult.
- All the departments work in isolation. Each one maintains its own database for information related to its purchases and there is no common database; hence there is duplication of data.
- A lot of work has to be done manually.
- No security of data.
- Maintenance of large numbers of records is a hectic job.
- Inefficiency in responding to management queries.
- Time-consuming, very slow processing.
- Loss of integrity of data.
- Inability to recover from data damage.
- The system is not transparent.

To cope with the above problems, this software has been designed and developed to computerize the working of the purchase department and to make the system simple.

2.3 Feasibility Study


Feasibility is the determination of whether or not a project is worth doing. The process followed in making this determination is called a feasibility study. This type of study determines if a project can and should be undertaken. Once it has been determined that a project is feasible, the analyst can go ahead and prepare a project specification, which finalizes the project requirements. All projects are feasible given unlimited resources and infinite time! Unfortunately, the development of a computer-based system is more likely to be plagued by a scarcity of resources and difficult delivery dates. It is both necessary and prudent to evaluate the feasibility of the project at the earliest possible time; months or years of effort, loss of money and untold professional embarrassment can be averted if we understand the project better at study time. The feasibility of a project is analyzed within some framework: if the project is found feasible and desirable, it is included in the management's schedule so that approval can be obtained. In the conduct of the feasibility study, the analyst considers seven distinct but interrelated types of feasibility:
- Technical feasibility
- Economic feasibility
- Operational feasibility
- Social feasibility
- Management feasibility
- Legal feasibility
- Time feasibility

The assessment of the Purchase Management System has the following facts:

Technical Feasibility
Technical analysis begins with an assessment of the technical viability of the proposed system. We have to establish what technologies are required to accomplish the system's function and performance, and study how these technology issues affect cost. The existing technology seems sufficient to run the new system. The data-holding facility also seems sufficient: we are using the MS SQL RDBMS, which can handle large volumes of data, so if the workload increases in the near future it can be handled easily.

Operational Feasibility
The management of the purchase department has shown great interest in the new system. The management and the users are normally the same members, so there is no problem of conflict between the management and the users.


2.3.1 Financial and Economic Feasibility

Among the most important information contained in a feasibility study is the cost-benefit analysis: an assessment of the economic justification for a computer-based system project. Cost-benefit analysis calculates approximate costs for project development and weighs them against tangible and intangible benefits.
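To make the idea concrete, here is a minimal cost-benefit sketch in C#. The figures are invented for illustration only; they are not project estimates and do not appear anywhere in this study.

```csharp
// Hypothetical cost-benefit sketch: weigh a one-time development cost
// against estimated yearly tangible benefits to get a simple payback period.
// All figures below are illustrative assumptions, not project data.
using System;

public static class CostBenefit
{
    public static void Main()
    {
        double developmentCost = 200000;   // one-time cost (assumed)
        double yearlyRunningCost = 20000;  // maintenance, hosting, etc. (assumed)
        double yearlyBenefit = 90000;      // time savings, fewer errors, etc. (assumed)

        double netYearlyBenefit = yearlyBenefit - yearlyRunningCost;
        double paybackYears = developmentCost / netYearlyBenefit;

        Console.WriteLine("Payback period: {0:F1} years", paybackYears);
    }
}
```

Intangible benefits (better security, transparency, management visibility) do not fit such an arithmetic sketch and are weighed qualitatively.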

2.4 Project Planning


The project life cycle has three stages:

1. Project Initiation - The development team prepares the project plans and finalizes the outcome of each phase. In this stage the team also prepares a comprehensive list of the tasks involved in each phase, and the project assigns responsibilities to the team members depending on their skills.

2. Project Execution - In this stage the team develops the product. In the case of the Purchase Management System, the development team will develop the online purchase order management application. This stage consists of the following phases:
1. Requirement Analysis
2. High Level Design
3. Low Level Design
4. Construction
5. Testing
6. Acceptance

3. Project Completion - In this stage the team has to update the site regularly. Each new enquiry has to be added by the purchase manager according to needs and demands. This stage is very important for the freshness of the site.


When any updating or upgrading of the website is required, the developers or the maintenance team bring the website up to date. There are many such requirements after the completion of the project: since this is a dynamic website, many changes are required, such as updating enquiries, quotations or the selected vendors list, and there is always a way to do this.


2.5 Project Scheduling


2.5.1 PERT Chart
A Program Evaluation and Review Technique (PERT) chart is mainly used for high-risk projects with various estimation parameters. For each module in a project, the duration is estimated as follows:
1. Time taken to complete the project or module under normal conditions, t_normal.
2. Time taken to complete the project or module in the minimum time, t_min.
3. Time taken to complete the project or module in the maximum time, t_max.
4. Time taken to complete the project or module according to previous related history, t_history.
An average of t_normal, t_min, t_max and t_history is taken, depending upon the project.

[Figure 2: PERT chart for the Online Purchase Management System, showing activities such as user requirement and analysis, SRS and design, programming, alpha testing, beta testing, buying hardware, installation, writing manuals, user training and user testing, with estimated durations between the numbered milestones.]

2.5.2 Gantt Chart
A Gantt chart is also known as a time line chart. A Gantt chart can be developed for the entire project, or a separate chart can be developed for each function. A tabular form is maintained where rows indicate the tasks with milestones and columns indicate duration (weeks/months). The horizontal bars that span the columns indicate the duration of each task.

[Figure 3: Gantt chart showing requirement gathering, design, test cases, coding, testing and build tasks spanning June through September.]
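The estimation rule described above (the simple four-value average, as this section defines it) translates directly into code; the following is a small sketch, with activity names and durations that are illustrative rather than the project's actual figures:

```csharp
// Sketch of the duration estimate used in section 2.5.1: the expected
// duration of a module is the average of the normal, minimum, maximum and
// historical times. Activity names and day counts are illustrative only.
using System;

public static class PertEstimate
{
    public static double Expected(double tNormal, double tMin,
                                  double tMax, double tHistory)
    {
        return (tNormal + tMin + tMax + tHistory) / 4.0;
    }

    public static void Main()
    {
        Console.WriteLine("SRS and design: {0} days", Expected(20, 15, 35, 20));
        Console.WriteLine("Programming:    {0} days", Expected(20, 15, 30, 25));
    }
}
```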


2.6 Software Requirement Specifications (SRS)


An SRS is a document that completely describes what the proposed software should do, without describing how the software will do it. The basic limitation of this is that user needs keep changing as the environment in which the system is to function changes with time, which leads to requests for requirement changes even after the requirement phase is over and the SRS is produced. The origin of most software systems is the need of a client, who either wants to automate an existing manual system or desires a new software system. The developer creates the software system and the end users use the completed system. There are thus three major parties interested in a new system: the client, the users and the developer. Somehow the requirements for the system that will satisfy the needs of the client and the concerns of the users have to be communicated to the developer. The problem is that the client usually does not understand software or the software development process, and the developer often does not understand the client's problem and application area. The basic purpose of the SRS is to bridge this communication gap. The SRS is the medium through which the client's and the users' needs are accurately specified; indeed, the SRS forms the basis of the software development. A good SRS should satisfy all the parties, something very hard to achieve that involves trade-offs and persuasion. One important purpose of developing an SRS is helping clients understand their own needs.

Advantages of an SRS
- An SRS establishes the basis for agreement between the client and the supplier on what the software product will do.
- An SRS provides a reference for validation of the final product.
- A high-quality SRS is a prerequisite for high-quality software.
- A high-quality SRS reduces the development cost.

Characteristics of an SRS
A good SRS is:
1. Correct
2. Complete
3. Unambiguous
4. Verifiable
5. Consistent
6. Ranked for importance and/or stability
7. Modifiable
8. Traceable

Requirement specification document
The requirement for the proposed system is that it should be less vulnerable to the mistakes made due to entry at two or three levels and to calculation errors. Moreover, the control of the administrator is centralized. This will support the purchase department's working process.

(1) Introduction
(a) Purpose of the software
The purpose of the proposed system is to provide an efficient information system for the management, the departments and the vendors. The main objective is to make the information part simple and to provide user-friendly access to all staff members of the organization, so that they can locate and reply to the enquiries that concern them.
(b) Scope
The software is prepared for e.Soft Technologies Ltd., but it can be implemented in any organization with a few minor changes; it finds good scope in any organization having a purchase department. Talking to the administrator and the employees who were dealing with the purchase department, we came to know that the manual system was not up to the mark, due to the cumbersome data entry and the ample calculations on whose basis reports are generated.

(2) General description
(a) Product function and overview

Data Entry Section
- User section: this section is developed using ASP.NET with C# as front-end and MS SQL Server as back-end. Only a valid user enters this section, by providing a login name and password to the system.
- Administrator section: this section can be accessed by providing the administrator password. In this section the administrator can authorize persons for data entry, and can add or edit the master table information and transaction information.

Data Updation Section
- User section: this section is developed using ASP.NET with C# as front-end and MS SQL Server as back-end. Only a valid department user and a valid vendor can update daily transaction information.
- Administrator section: this section can be accessed by providing the administrator password. In this section the administrator can authorize persons for data updation, and can edit the master table information and transaction information.

Data Deletion Section
- Administrator section: this section can be accessed by providing the administrator password. In this section the administrator can authorize persons for data deletion, and can delete the master table information and transaction information.
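The login-based section access described above can be illustrated with a short sketch. This is not the project's actual code: the connection string is a placeholder, and the idea that the Rid column distinguishes administrators from ordinary users is our assumption; only the Login table's username and pwd columns come from the database design in section 3.3.

```csharp
// Illustrative sketch only: validate a login against the Login table and
// decide which section the user may access. The connection string and the
// role encoding in Rid are assumptions, not the project's actual code.
using System.Data.SqlClient;

public static class LoginCheck
{
    public static string GetRole(string username, string password)
    {
        using (SqlConnection con = new SqlConnection(
            "Data Source=.;Initial Catalog=purchase;User ID=sa;Password=***"))
        {
            con.Open();
            SqlCommand cmd = new SqlCommand(
                "select Rid from Login where username = @u and pwd = @p", con);
            cmd.Parameters.AddWithValue("@u", username);
            cmd.Parameters.AddWithValue("@p", password);

            object rid = cmd.ExecuteScalar();
            if (rid == null) return "invalid";   // no matching user/password

            // Assumption: the Rid value encodes the role (e.g. "admin").
            return rid.ToString() == "admin" ? "administrator" : "user";
        }
    }
}
```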


Data Processing Section
This section can be accessed by providing the administrator password; only the administrator can process the purchase-related information.

Report Section
This section is developed using Crystal Reports as the report generation tool and SQL Server as back-end.

(b) User Characteristics
The user at the entry section can be a department employee, a vendor or the administrator. The employee can make requests for the products required, the administrator can manage the whole purchasing process, and the vendor can navigate through his account for enquiries, quotation generation, purchase orders, their dispatch, etc.

(c) General constraints
The back-end has to be either SQL Server 7.0 or 2000, and the system should run on the Windows operating system.

(3) Specific Requirements
(a) Input and Output
Requisition entry, enquiry entry, quotation entry and purchase order generation serve as input, and the items list, vendor information, enquiry details, quotation details and PO detail forms are the output of this software.

(b) Functional Requirements
There should be no manual entry in the database tables by directly accessing them, i.e. there should be security at the database server.

Only a valid user can input, update or delete records, and only the administrator can perform operations on the master database and the purchase module.

(c) External Interface Requirements
The software must provide a user-friendly platform to reduce the complexity of operation. The Online Purchase Management System should be capable of supporting a multi-user environment: the software is based on a client-server architecture, so one or more users can make entries in the system and view reports at the same time.
(d) Performance Constraints
The software is expected to hold lakhs (hundreds of thousands) of records, so it should be capable of generating reports and performing cumbersome calculations in seconds.
(e) Acceptance Criteria
Before accepting the system, the developer must demonstrate that the system works for the purchase department. The developer will have to show through test cases that all conditions are satisfied.

Software Requirement Specification

Software tools
Front-end tool: Microsoft ASP.NET 2.0
- User friendly
- Low-cost solution
- GUI features
- Better designing aspects
Back-end tool: Microsoft SQL Server 2000, whose features are:


- Security
- Portability
- Quality

Platform: Windows platforms such as 2000 Professional, XP and Vista.

Hardware specification:
- Processor: Intel Pentium or Celeron class
- Processor speed: 1.2 GHz or above
- RAM: 512 MB
- HDD: 40 GB
- FDD: 1.44 MB
- Monitor: 14" SVGA
- Printer: dot matrix, inkjet or laser
- Mouse: normal
- Keyboard: normal


2.7 Software Engineering Paradigm applied


Requirement Analysis Phase
During the requirement analysis phase, the development team analyses the requirements to be fulfilled by the Online Purchase Management System and identifies the probable approaches for meeting them. To identify the requirements of the application, the team studied existing purchase management portals. Finally, the team identified that the Purchase Management System should:
1. Enable vendors to register with the site, after validation has been performed on the data provided by the vendor.
2. Enable vendors to perform activities such as viewing enquiries, sending quotations and handling purchase orders.
3. Enable the departments to send their requisitions to the purchase department.
4. Enable the administrator to view the database and the different reports generated by the purchase department.
5. Enable the purchase manager to add to and view the database, to update records of purchases, vendors, etc.

Requirement Engineering Processes
1. Elicitation - determine the operational requirements (user needs and customer expectations).
2. Analysis - translate the operational requirements into technical specifications.
3. Documentation - record the operational requirements and technical specifications.
4. Verification - check that the specifications are complete, correct and consistent with needs and expectations, and generate acceptance test scenarios.
5. Requirements Management - control changes to the requirements.

What Is a Software Process?
A process is a way of doing something. It includes work activities and procedures for conducting those activities; the work activities transform input work products into output work products, and the procedures are supported by methods and tools, for example:

[Figure 4: A software process - requirements flow through the design process, supported by methods and tools, to produce design documents and test plans.]

Process Models for Software Development: Iterative Development

[Figure 5: The iterative development model - requirements and specifications feed an architectural design, which is partitioned into incremental builds, each undergoing incremental verification and validation.]

Iteration is the process by which the desired result is approached through repeated cycles. In software engineering, an iterative approach allows revision of, and addition to, the work products. Different types of iterative models support revision of:
1. Requirements
2. Design
3. Code

Analysis of the website
Analysis is a great starting point for developing a website. It identifies the strengths that should be permanently promoted on the website, and it can be used to overcome competitive weaknesses, such as limited resources, for example by establishing a twenty-four-hours-a-day, seven-days-a-week customer service center. The website also serves as a reliable central point for taking advantage of opportunities; a website with a mailing list can quickly notify its potential candidates.

Design Model
To solve the actual problem, a website developer or a team of developers must adopt a development strategy that encompasses process, methods and tools. A process model for the website is chosen based on the nature of the project and application.

Software Engineering Methodology Used


A modular approach is used for developing the proposed system. A system is considered modular if it consists of discrete components, so that each component can be implemented separately and a change to one component has minimal impact on other components. Every system is a hierarchy of components, and this system is no exception. To design such a hierarchy there are two approaches:
(1) Top-down
(2) Bottom-up
Both approaches have merits and demerits. For this system the top-down approach has been used. It starts by identifying the major components of the system, decomposing them into their lower-level components and iterating until the desired level of detail is achieved. Top-down design methods often result in some form of stepwise refinement: starting from an abstract design, in each step the design is refined to a more concrete level, until we reach a level where no more refinement is needed and the design can be implemented directly. A top-down approach is suitable only if the specifications of the system are clearly known and the system is being developed from scratch. A bottom-up approach starts by designing the most basic or primitive components and proceeds to higher-level components that use these lower-level components. The user of the existing system defined his general objectives for the software, but did not identify detailed input, processing or output requirements; so, I have chosen the PROTOTYPING approach to develop this software.

Prototyping


The basic idea of prototyping is that instead of freezing the requirements before any design or coding can proceed, a throwaway prototype is built to help understand the requirements. This prototype is developed based on the currently known requirements. The prototype obviously undergoes design, coding and testing, but each of these phases is not done very formally or thoroughly. By using the prototype, the client can get an actual feel of the system, because interaction with the prototype enables the client to better understand the requirements of the desired system.

[Figure: The prototype model - a requirement analysis, design, code and test cycle for the prototype feeds back into the requirement analysis, design, code and test cycle of the actual system.]

Because the system is complicated and large and there is no existing computerized system, prototyping is an attractive idea. In this situation, letting the client test the prototype provides valuable inputs which help in determining the requirements of the system. It is also an effective method of demonstrating the feasibility of a certain approach.

2.8 Data model


A data model is an abstract model that describes how data is represented and accessed. The term data model has two generally accepted meanings:
1. A data model theory, i.e. a formal description of how data may be structured and accessed.


2. A data model instance, i.e. the application of a data model theory to create a practical data model instance for some particular application.

Data Model Theory
A data model theory has three main components:
- The structural part: a collection of data structures which are used to create databases representing the entities or objects modeled by the database.
- The integrity part: a collection of rules governing the constraints placed on these data structures to ensure structural integrity.
- The manipulation part: a collection of operators which can be applied to the data structures, to update and query the data contained in the database.
For example, in the relational model, the structural part is based on a modified concept of the mathematical relation; the integrity part is expressed in first-order logic; and the manipulation part is expressed using the relational algebra, tuple calculus and domain calculus.

Data Model Instance
A data model instance is created by applying a data model theory. This is typically done to solve some business enterprise requirement. Business requirements are normally captured by a semantic logical data model, which is transformed into a physical data model instance from which a physical database is generated. For example, a data modeler may use a data modeling tool to create an entity-relationship model of the corporate data repository of some business enterprise. This model is transformed into a relational model, which in turn generates a relational database.


Entity-Relationship Model
The entity-relationship model, or entity-relationship diagram (ERD), is a data model or diagram for high-level descriptions of a conceptual data model, and it provides a graphical notation for representing such data models in the form of entity-relationship diagrams. Such models are typically used in the first stage of information-system design; they are used, for example, to describe the information needs and/or the type of information that is to be stored in the database during requirement analysis. The data modeling technique can, however, be used to describe any ontology (i.e. an overview and classification of the terms used and their relationships) for a certain universe of discourse (i.e. area of interest). In the case of the design of an information system that is based on a database, the conceptual data model is, at a later stage (usually called logical design), mapped to a logical data model, such as the relational model; this in turn is mapped to a physical model during physical design. Note that sometimes both of these phases are referred to as "physical design". There are a number of conventions for entity-relationship diagrams. The classical notation mainly relates to conceptual modeling; a range of other notations is more typically employed in logical and physical database design.


ER DIAGRAM

[The entity-relationship diagram of the Online Purchase Management System appears here.]


3. SYSTEM DESIGN
3.1 Modularisation details
The design of a system deals with transforming the requirements of the system into a form implementable using a programming language. We can broadly classify the various design activities into two parts:
- Preliminary (or high-level) design
- Detailed design

In preliminary design part we design the following items: 1. Different modules required to implement the design. 2. Control relationship among the identified modules. 3. Interface among different modules.

The design of this software aims for high cohesion, i.e. the elements within each module are strongly related, while interaction between different modules is minimized. Most of the modules are self-contained, and at the same time the modules are loosely coupled, i.e. inter-modular dependencies are few. Hence the software is loosely coupled and highly cohesive.


3.1.1 System Modules
Since we use the structured approach to develop the system, we divide the system into modules on the basis of the functions they perform. These modules are further divided into sub-modules so that the problem can be solved easily and accurately.

3.1.1.1 Module Division
The description of each module is given below:

1. Master Creation
This module is responsible for creating all the required masters to manage the overall processing. Various types of masters are created under this module, through the sub-modules listed below.

a. Vendor Entry
This sub-module is used to manage vendor records. We have to maintain a database of the various vendors with their names, addresses, phone numbers and other information. This module allows us to add, delete and modify vendor records, and we can print vendor records whenever we want. Vendor codes are generated automatically.
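Automatic vendor code generation can follow the same pattern this project uses for city codes in section 4.3 (select the highest existing code and increment it); a sketch, with the query adapted to the VendorMaster table and assuming purely numeric code values:

```csharp
// Sketch of automatic vendor code generation, modeled on the project's
// city-code snippet in section 4.3. Assumes VendorCode values are numeric
// strings; an open SqlConnection is supplied by the caller.
using System;
using System.Data.SqlClient;

public static class VendorCodes
{
    public static string NextCode(SqlConnection con)
    {
        SqlCommand cmd = new SqlCommand(
            "select top 1 VendorCode from VendorMaster order by VendorCode desc",
            con);
        object last = cmd.ExecuteScalar();   // null when the table is empty
        int next = (last == null) ? 1 : Convert.ToInt32(last) + 1;
        return next.ToString();
    }
}
```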

b. Item Entry
This sub-module is used to manage items. Items may be raw materials or products. We have to maintain various pieces of information about items, such as item code, item name, specification and reorder level, as well as the maximum level, minimum level, reorder level and reorder quantity. This module allows us to add new item details, delete old items, modify items and find items. We can also print item details.


c. Vendor Item Relationship
This module allows us to create the vendor-item relationship. A number of vendors may supply a number of items, and a particular item may be supplied by more than one vendor; we have to maintain this many-to-many relationship. The module provides a vendor list and an item list: we can select a vendor and then select the items which can be supplied by the currently selected vendor. This vendor-item relationship information may also be updated whenever a vendor provides us with quotation details.
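A many-to-many relationship is conventionally stored in a junction table holding one row per vendor-item pair. The table name VendorItem below is our assumption for illustration; the document describes the relationship but does not name its table in the section 3.3 design:

```csharp
// Sketch: record one vendor-item pair in an assumed VendorItem junction
// table (VendorCode, ItemCode). The table name is hypothetical; the codes
// are expected to exist in VendorMaster and ItemMaster respectively.
using System.Data.SqlClient;

public static class VendorItemLink
{
    public static void Link(SqlConnection con, string vendorCode, string itemCode)
    {
        SqlCommand cmd = new SqlCommand(
            "insert into VendorItem (VendorCode, ItemCode) values (@v, @i)", con);
        cmd.Parameters.AddWithValue("@v", vendorCode);
        cmd.Parameters.AddWithValue("@i", itemCode);
        cmd.ExecuteNonQuery();
    }
}
```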

2. Transactions
This module is responsible for managing all the required transactions, such as requisition entry, enquiry entry, quotation entry and purchase order generation.

a. Requisition Entry
This module is used to enter requisition details. Different types of requisitions from different departments come to the purchase department for the purchase of goods, and purchase orders to vendors are generated according to these requisitions. This module is only responsible for entering the requisition details supplied by the other departments. Requisitions can be added, deleted, modified and searched in this module. The module manages two tables at the same time: one is Requisition Master and the other is Requisition Detail, with many records in Requisition Detail per record in Requisition Master.
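Since one Requisition Master row owns many Requisition Detail rows, the two inserts belong in a single database transaction so a failure cannot leave a master without its details. A sketch, using the column names from the table design in section 3.3 (the method shape itself is illustrative, not the project's actual code):

```csharp
// Sketch: insert one RequisitionMaster row and its RequisitionDetail rows
// atomically. Column names follow section 3.3; everything else is assumed.
using System.Data.SqlClient;

public static class RequisitionEntry
{
    public static void Save(SqlConnection con, string reqNo, string deptCode,
                            string[] itemCodes, decimal[] quantities)
    {
        using (SqlTransaction tx = con.BeginTransaction())
        {
            SqlCommand master = new SqlCommand(
                "insert into RequisitionMaster (RequisitionNo, RequisitionDate, DeptCode) " +
                "values (@no, getdate(), @dept)", con, tx);
            master.Parameters.AddWithValue("@no", reqNo);
            master.Parameters.AddWithValue("@dept", deptCode);
            master.ExecuteNonQuery();

            // One detail row per requested item.
            for (int i = 0; i < itemCodes.Length; i++)
            {
                SqlCommand detail = new SqlCommand(
                    "insert into RequisitionDetail (RequisitionNo, ItemCode, Quantity) " +
                    "values (@no, @item, @qty)", con, tx);
                detail.Parameters.AddWithValue("@no", reqNo);
                detail.Parameters.AddWithValue("@item", itemCodes[i]);
                detail.Parameters.AddWithValue("@qty", quantities[i]);
                detail.ExecuteNonQuery();
            }
            tx.Commit();   // both tables updated, or neither
        }
    }
}
```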

b. Enquiry Entry
This module is used to enter enquiry details. Enquiries may be generated by the purchase department itself as well as from the requisitions of other departments. Each time, a new enquiry number is generated: each enquiry has a unique enquiry number, an enquiry date, etc., and details such as item name, quantity needed and specification. The purchase module maintains all enquiry records and allows the user to add, delete and modify enquiries. Enquiries are transferred to vendors for quotations, and the enquiry list or a specific enquiry may also be printed.

c. Quotation Entry & Approval
This module is used to enter the quotation details supplied by vendors. Enquiries are made to vendors for different items, and the vendors provide us with quotation details for the same items mentioned in the enquiry. Quotations may arrive as hard copies that must be entered into the tables. Quotations are examined by the management and finally approved on the basis of the information supplied in them, such as the rates of items, quality of items, delivery time and mode of payment. This module allows us to add new quotations, modify quotations and delete quotations. Quotations are managed in master and detail tables: Quotation Master has the enquiry number against which the quotation is generated, the quotation date, vendor code, payment terms, etc., and Quotation Detail has the item details, such as item names and rates. This information also helps us maintain the vendor-item relationship. We can print one or all quotations.
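Approval means comparing the quotations received against one enquiry. A sketch of one possible comparison query over the section 3.3 tables follows; picking the lowest quoted rate per item is just one criterion (the text above also names quality, delivery time and payment mode), so this is illustrative rather than the project's stated approval rule:

```csharp
// Sketch: for a given enquiry, find the lowest quoted rate per item across
// all quotations entered for it. Lowest rate is one approval criterion
// among several; this is not the project's complete approval logic.
using System.Data.SqlClient;

public static class QuotationCompare
{
    public static SqlDataReader LowestRates(SqlConnection con, string enquiryNo)
    {
        SqlCommand cmd = new SqlCommand(
            "select d.ItemCode, min(d.Rate) as BestRate " +
            "from QuotationDetail d " +
            "where d.EnquiryNo = @e " +
            "group by d.ItemCode", con);
        cmd.Parameters.AddWithValue("@e", enquiryNo);
        return cmd.ExecuteReader();   // caller iterates and closes the reader
    }
}
```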

d. Generation of Purchase Order
This module is used to generate purchase orders. Purchase orders are generated on the basis of the requisitions received from the various departments. Different requisitions may have different or the same items, so an item list is generated from the various requisitions by adding up the quantities of the same items. When the item list is ready, the vendors who can supply these items are selected. At a time, a purchase order is generated for only a single vendor. Two tables, called PO Master and PO Detail, are used to manage purchase orders: PO Master contains the PO number (which is generated automatically), PO date, vendor code, etc., and PO Detail contains the list of items to be purchased with the required quantities. Requisitions for which a PO is generated are closed automatically. The PO can also be produced in hard copy.

e. Requisition Close
This module is used to close requisitions for which a purchase order has been generated. Although requisitions are automatically closed when a PO is generated, in some cases requisitions have to be closed manually; this processing is called a short close. For this, a flag field such as RequisitionCompleted is used, which is set to true when a requisition is closed and remains false for pending requisitions. The user is provided with the list of pending requisitions and selects the requisitions which are to be closed.

f. Purchase Order Close
This module is used to close purchase orders against which goods have been received. A PO remains pending until the requested items are received from the vendor. When items are received in the store department, a Goods Receive Note (GRN) is generated and a copy of the GRN is sent to the purchase department. The GRN holds the PO number, vendor name and the list of items received with their quantities. On the basis of the GRN details, completed POs are short-closed. The short-close method is implemented using a flag field such as POCompleted, which is set to true when the PO is completed; before this, POs are called pending POs. The user closes pending POs from the list of pending POs.
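The short-close operation described above maps to a simple flag update; a sketch (the POCompleted column is named in the text but its definition is not listed in the section 3.3 tables, so its type here is an assumption):

```csharp
// Sketch of the short-close operation: mark a pending purchase order as
// completed by setting the POCompleted flag named in the text above.
// The bit-valued column is an assumption; it is not in the table design.
using System.Data.SqlClient;

public static class PoClose
{
    public static int ShortClose(SqlConnection con, string poNo)
    {
        SqlCommand cmd = new SqlCommand(
            "update POMaster set POCompleted = 1 where PurchaseOrderNo = @po", con);
        cmd.Parameters.AddWithValue("@po", poNo);
        return cmd.ExecuteNonQuery();   // returns 1 if the PO existed
    }
}
```

The same pattern serves the Requisition Close module, with RequisitionMaster and the RequisitionCompleted flag in place of the PO names.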

3.2 Data integrity and constraints


Pictorial representations of systems have long been acknowledged as being more effective than narrative: they are easier to grasp when being explained or investigated, it is easier to find a particular aspect of interest, and pictorial representations are less ambiguous than narrative. The DFD is a simple graphical notation that can be used to represent a system in terms of the input data to the system, the various processing carried out on these data, and the output data generated by the system. The level 0 DFD of the system is as follows:

[The level 0 and lower-level data flow diagrams of the system appear here.]

3.3 Database design/Procedural Design/Object Oriented Design


Detailed design is the most creative and challenging phase in the development life cycle of the project. In the detailed design of the system we design the database tables, the schema of the tables, the relationships between tables and the file organization of the application.

3.3.1 Design of Database Tables
The data used in the system are stored in various tables. The number of tables used and their structure are decided upon keeping in mind the logical relations in the data available. The database design specifies:
- The various tables to be used
- The data to store in each table
- The format of the fields and their types

We are using a SQL Server database. To create the database we first start SQL Server. [Screen shot: the SQL Server startup window appears here.]


After starting SQL Server, we first created the database purchase with the user sa. The initial size of the database is 3 MB.
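The database was created through the SQL Server tools, but the equivalent step can also be issued from code. The following sketch mirrors the 3 MB initial size mentioned above; the file name and connection string are placeholders, not taken from the project:

```csharp
// Illustrative sketch: create the 'purchase' database from C# instead of
// the SQL Server tools. The file path and connection string are assumed.
using System.Data.SqlClient;

public static class CreateDb
{
    public static void Main()
    {
        using (SqlConnection con = new SqlConnection(
            "Data Source=.;Initial Catalog=master;User ID=sa;Password=***"))
        {
            con.Open();
            SqlCommand cmd = new SqlCommand(
                "CREATE DATABASE purchase ON (NAME = purchase_data, " +
                "FILENAME = 'C:\\data\\purchase.mdf', SIZE = 3MB)", con);
            cmd.ExecuteNonQuery();
        }
    }
}
```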

DATABASE TABLES

Table Name: DepartmentMaster
Field name          Data type [size]    Constraints
DeptCode            Varchar(10)         Primary key
DeptName            Varchar(40)

Table Name: ItemMaster
Field name          Data type [size]    Constraints
ItemCode            Varchar(10)         Primary key
ItemName            Varchar(40)         Unique, not null
Specification       Varchar(40)
ItemWeight          Numeric(9,2)
WeightUnitCode      Varchar(10)
ExciseRate          Numeric(9,2)
Category            Varchar(10)
Purchase            Varchar(10)
Issue               Varchar(10)
Production          Varchar(10)
Sale                Varchar(10)
UnitCode            Varchar(10)
IssueUnitCode       Varchar(10)
ProductionUnitCode  Varchar(10)
SaleUnitCode        Varchar(10)
MaxRate             Numeric(9,2)
AvgRate             Numeric(9,2)
LastRate            Numeric(9,2)
MinRate             Numeric(9,2)
MaxLevel            Numeric(9,2)
MinLevel            Numeric(9,2)
ReorderLevel        Numeric(9,2)
ReorderQty          Numeric(9,2)
CurrBalance         Numeric(9,2)

Table Name: Login
Field name          Data type [size]    Constraints
Rid                 Varchar(10)         Primary key
username            Varchar(10)
pwd                 Varchar(10)
C_pwd               Varchar(10)
hq                  Varchar(50)
ha                  Varchar(50)

Table Name: New Registration
Field name          Data type [size]    Constraints
Vendorname          Varchar(25)         Not null
Compname            Varchar(25)
Address             Varchar(50)
Email               Varchar(30)
Phone_no            Varchar(15)
Mobile_no           Varchar(15)
Username            Varchar(15)
Pwd                 Varchar(20)
Cpwd                Varchar(20)
Hq                  Varchar(50)
Ha                  Varchar(50)

Table Name: VendorMaster
Field name          Data type [size]    Constraints
VendorCode          Varchar(10)         Primary key
VendorName          Varchar(40)         Not null
Address             Varchar(60)
CityCode            Varchar(10)
StateCode           Varchar(10)
CountryCode         Varchar(10)
Pincode             Numeric(10)
Phone               Numeric(10)
Fax                 Numeric(20)
Email               Varchar(40)
Url                 Varchar(50)
ContactPerson       Varchar(40)

Table Name: POMaster
Field name          Data type [size]    Constraints
PurchaseOrderNo     Varchar(10)         Primary key
PurchaseOrderDate   Datetime
VendorCode          Varchar(10)
RequisitionNo       Varchar(10)
Remark              Varchar(50)

Table Name: PODetail
Field name          Data type [size]    Constraints
PurchaseOrderNo     Varchar(10)
ItemCode            Varchar(10)
Quantity            Numeric(9,2)
UnitCode            Varchar(10)

Table Name: RequisitionMaster
Field name          Data type [size]    Constraints
RequisitionNo       Varchar(10)         Primary key
RequisitionDate     Datetime
DeptCode            Varchar(10)
CompCode            Varchar(10)
ReqType             Varchar(10)
IndentNo            Varchar(10)

Table Name: RequisitionDetail
Field name          Data type [size]    Constraints
RequisitionNo       Varchar(10)
ItemCode            Varchar(10)
Quantity            Numeric(9,2)
UnitCode            Varchar(10)
ScheduleDate        Datetime

Table Name: QuotationMaster
Field name          Data type [size]    Constraints
EnquiryNo           Varchar(10)
VendorCode          Varchar(10)
QuotationDate       Datetime
DispatchMode        Varchar(10)
PaymentTermsCode    Varchar(10)
DeliveryTerms       Varchar(30)
SpecialInstruction  Varchar(30)
Enclosure1          Varchar(20)
Enclosure2          Varchar(20)

Table Name: QuotationDetail
Field name          Data type [size]    Constraints
EnquiryNo           Varchar(10)
ItemCode            Varchar(10)
Qty                 Numeric(9,2)
UnitCode            Varchar(10)
Rate                Numeric(9,2)
Specification       Varchar(40)

Table Name: EnquiryMaster
Field name          Data type [size]    Constraints
EnquiryNo           Varchar(10)         Primary key
EnquiryDate         Datetime
RequisitionNo       Varchar(10)
VendorCode          Varchar(10)
DueDate             Datetime

Table Name: EnquiryDetail
Field name          Data type [size]    Constraints
EnquiryNo           Varchar(10)
ItemCode            Varchar(10)
Quantity            Numeric(9,2)
UnitCode            Varchar(10)
Specification       Varchar(40)
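As a sketch of how these definitions translate into T-SQL executed from C#, the following creates the requisition pair. The foreign key from RequisitionDetail to RequisitionMaster is our assumption, implied by the master-detail relationship described in section 3.1 but not stated as a constraint in the tables above:

```csharp
// Illustrative T-SQL for two of the tables above, run from C#. The
// REFERENCES clause is an assumption drawn from the master-detail
// description; the document's table design lists no such constraint.
using System.Data.SqlClient;

public static class CreateTables
{
    public static void Main()
    {
        string ddl =
            "CREATE TABLE RequisitionMaster (" +
            " RequisitionNo Varchar(10) PRIMARY KEY," +
            " RequisitionDate Datetime, DeptCode Varchar(10)," +
            " CompCode Varchar(10), ReqType Varchar(10), IndentNo Varchar(10));" +
            "CREATE TABLE RequisitionDetail (" +
            " RequisitionNo Varchar(10) REFERENCES RequisitionMaster(RequisitionNo)," +
            " ItemCode Varchar(10), Quantity Numeric(9,2)," +
            " UnitCode Varchar(10), ScheduleDate Datetime);";

        using (SqlConnection con = new SqlConnection(
            "Data Source=.;Initial Catalog=purchase;User ID=sa;Password=***"))
        {
            con.Open();
            new SqlCommand(ddl, con).ExecuteNonQuery();
        }
    }
}
```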

3.4 User Interface Design


The user interface portion of a software product is responsible for all interactions with the user. Almost every software product has a user interface, and the user interacts with a software product through it; it is the primary component of any software product that is directly relevant to the users. The user interface of our project has several characteristics:
- It is simple to learn.
- The time and effort required to initiate and execute different commands are minimal.
- Once users learn how to use the interface, their speed of recall of how to use the software is high.
- It is attractive to use.
- The commands supported by the interface are consistent.

SCREEN SHOTS

[Screen shots of the application's forms appear here.]

4. CODING

4.1 Complete Project Coding

4.3 Standardization of the coding /Code Efficiency


The process of optimization starts from the designing stage itself and continues until the deployment and distribution stage.

Optimizing speed
In order to optimize the speed of the application, the following techniques are used:
- Use of appropriate data types
- Assigning property values to variables
- Using early binding instead of late binding

Use of appropriate data types
The use of appropriate data types optimizes the execution speed of the application: too many implicit data type conversions slow down execution. The use of the variant data type has been avoided in the application, as it causes the application to run slowly.

Assigning property values to variables
Accessing a value from a variable is 10 to 20 times faster than accessing it from a property, because accessing a value from a property involves a call to an object, whereas a variable carries no such overhead. This is used to optimize the execution speed of the application. Unoptimized code would increment the value read from the data row directly:

k = Convert.ToString(Convert.ToInt32(dr["city_code"]) + 1);

Optimized code first copies the value into a variable:

c1.cmd.CommandText = "select top 1 city_code from city_master order by city_code desc";
c1.adp.Fill(c1.ds, "code");
b = dr["city_code"].ToString();
k = Convert.ToString(Convert.ToInt32(b) + 1);


Using early binding instead of late binding

A client interacts with a component using its properties and methods. In order to access the properties and methods of a component, the client needs to be bound to the component; the process of associating a client with a component is called binding. When early binding is implemented between a client's call and a component's method, the method called is determined at compile time, i.e. the call is associated with the appropriate method during compilation. Early binding provides:
- Performance speed
- Syntax checking at compile time
- Display of objects in the Object Browser window
- Provision of help in the Object Browser
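The contrast can be shown in a few lines of C#. The Calculator class and its Add method below are hypothetical names invented for the example; the late-binding half uses reflection, which is one standard way to defer method resolution to run time:

```csharp
// Illustrative sketch (not project code): early vs. late binding in C#.
// Calculator and Add are hypothetical names used only for this example.
using System;
using System.Reflection;

public class Calculator
{
    public int Add(int a, int b) { return a + b; }
}

public static class BindingDemo
{
    public static void Main()
    {
        // Early binding: the call to Add is resolved at compile time, so
        // the compiler checks the method name and argument types for us.
        Calculator calc = new Calculator();
        int sum = calc.Add(2, 3);

        // Late binding: the method is located by name at run time via
        // reflection; mistakes surface only when the call executes.
        object lateCalc = new Calculator();
        MethodInfo add = lateCalc.GetType().GetMethod("Add");
        object lateSum = add.Invoke(lateCalc, new object[] { 2, 3 });

        Console.WriteLine("{0} {1}", sum, lateSum);
    }
}
```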

Optimizing the display speed
In order to display a form frequently, the Hide and Show methods are used instead of the Unload and Load events, because loading and unloading a form involves memory overhead and is therefore slower.

Optimizing memory
The application has been optimized to occupy the least amount of memory while still giving good performance. In order to optimize memory, the data held in a string is reclaimed by setting it to "" (an empty string).

Compiling to native code
Native code compilation offers more options for performance tuning than are available with P-code compilation. We can use one of the following options when compiling the application:
1) Optimize for fast code
2) Optimize for small code
3) Favor Pentium Pro(tm)


Optimize for fast code
This is the best option when the application is being optimized for speed and there is large storage space on the disk. The compiler compiles the code for faster execution: it examines the program structure and changes some of it, resulting in optimized code but an enlarged executable file. This is the default compile option in Visual Studio 2005.

Optimize for small code
This is the best option when one is concerned about hard disk space rather than speed. The compiler creates the smallest possible compiled code, occupying less disk space but probably slower in execution.

Advanced optimization options
The following advanced optimization techniques were used:

Assume No Aliasing
Since aliasing provides a name that refers to a memory location already referred to by a different name, selecting this option allows the compiler to apply optimizations that could not otherwise be applied.

Remove Array Bounds Checks
Selecting this option can also optimize application speed. The Visual Studio compiler by default checks array indexes and their dimensions, and reports an error if an array index is out of bounds. As the (minimal) arrays used here are sure not to go out of bounds, choosing this option actually optimizes speed and thus gives faster code.

4.6 Validation checks


For correct data entry we provide different types of validation during the software development process. These checks act as a barrier against irregular entries made by the user or anyone else. The validations are described below:

Validation Checks - VENDOR ENTRY
- Vendor names should not be duplicated.
- The vendor code should be unique.
- All the necessary information, such as vendor name, address and contact number, should be specified.

Validation Checks - ITEM ENTRY
- Item names should not be duplicated.
- The item code should be unique.
- All the necessary information, such as item name, specification, category and levels, should be specified.
- The item category, such as product or raw material, should be mentioned.

Validation Checks - VENDOR ITEM RELATIONSHIP
- The vendor code and item code should be valid and exist in the corresponding tables.
- New item rates should be specified and reflected in the Item Master as the last rate.

Validation Checks - REQUISITION ENTRY
- The requisition number should be unique and in incremental order.
- The requisition date should be in (mm/dd/yyyy) format.
- The name of the department where the requisition was generated should be specified.
- The department code must be valid and exist in the department table.
- Each requisition must have at least one item detail.

Validation Checks - ENQUIRY ENTRY
- The enquiry number should be unique and in incremental order.
- All dates should be in (mm/dd/yyyy) format.
- The vendor code must be valid and exist in the Vendor Master.
- Each enquiry must have at least one item detail.
- Item details should be defined with the required specification.

Validation Checks - QUOTATION ENTRY AND APPROVAL
- The quotation date should not be earlier than the enquiry date.
- All dates should be in (mm/dd/yyyy) format.
- The enquiry number should be mentioned in the quotation detail.
- The quotation detail must have at least one item detail.
- The enquiry number must be valid and exist in the enquiry table.
- The quotation detail should have the delivery mode, dispatch mode and payment terms specified clearly.

Validation Checks - GENERATION OF PURCHASE ORDER
- The PO number should be unique and in incremental order.
- All dates should be in (mm/dd/yyyy) format.
- The vendor code must be valid and exist in the Vendor Master.
- Each PO must have at least one item detail.
- Item details should be defined with the required specification, quantity purchased and unit name.
- A PO should be generated for one vendor at a time.
- POs should be generated only for pending requisitions.

Validation Checks - REQUISITION CLOSE
- Only pending requisitions should be closed.

Validation Checks - PURCHASE ORDER CLOSE
- Only pending POs should be closed.
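Two patterns recur throughout the checks above: a strict (mm/dd/yyyy) date format and an "at least one item detail" rule. A sketch of both follows; the class and method names are illustrative, not the project's actual validators:

```csharp
// Sketch of two recurring validation checks from section 4.6. In .NET
// format strings, the month is written "MM" (capitalized), so the
// document's (mm/dd/yyyy) becomes "MM/dd/yyyy" here.
using System;
using System.Globalization;

public static class EntryValidation
{
    public static bool IsValidDate(string text)
    {
        DateTime parsed;
        return DateTime.TryParseExact(text, "MM/dd/yyyy",
            CultureInfo.InvariantCulture, DateTimeStyles.None, out parsed);
    }

    public static bool HasItemDetail(string[] itemCodes)
    {
        // Every requisition, enquiry and PO must carry at least one item.
        return itemCodes != null && itemCodes.Length > 0;
    }
}
```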


5. TESTING

5.1 Testing techniques and Testing strategies


Software testing is a critical element of software quality assurance and represents the ultimate review of specification, design and coding. The purpose of product testing is to verify and validate the various work products, viz. units, integrated units and the final product, to ensure that they meet their requirements.

5.1.1 Testing Objectives

Basically, testing is done for the following purposes:
1. Testing is a process of executing a program with the intent of finding an error.
2. A good test case is one that has a high probability of finding an as yet undiscovered error.
3. A successful test case is one that uncovers an as yet undiscovered error.
Our objective is to design test cases that systematically uncover different classes of errors, and to do so with a minimum amount of time and effort. This process has two parts:
- Planning: writing and reviewing unit, integration, functional, validation and acceptance test plans.
- Execution: executing these test plans; measuring, collecting data and verifying whether the product meets the quality criteria. The data collected are used to make appropriate changes in the plans related to development and testing.
The quality of a product can be achieved by ensuring that the product meets the requirements, by planning and conducting the following tests at various stages.


5.1.2 Types of Software Testing

The main types of software testing are: Component Testing Starting from the bottom the first test level is Component Testing, sometimes called Unit Testing. It involves checking that each feature specified in the Component Design has been implemented in the component. In theory an independent tester should do this, but in practice the developer usually does it, as they are the only people who understand how a component works. The problem with a component is that it performs only a small part of the functionality of a system, and it relies on co-operating with other parts of the system, which may not have been built yet. To overcome this, the developer either builds, or uses special software to trick the component into believing it is working in a fully functional system. Interface Testing As the components are constructed and tested they are then linked together to check if they work with each other. It is fact that two components that have passed all their tests, when connected to each other produce one new component full of faults. These tests can be done by specialists, or by the developers. Interface testing is not focused on what the components are doing but on how they communicate with each other, as specified in the System Design. The System Design defines relationship between components, and this involves stating: 1).What a component can expect from another component in terms of services. 2). How these services will be asked for. 3). How they will be given. 4). How to handle non standard conditions, i.e. errors. 5). Tests are constructed to deal with each of these. 53

The tests are organized to check all the interfaces, until all the components have been built and interfaced to each other producing the whole system. System Testing Once the entire system has been built then it has to be tested against the System Specification to check if it delivers the features required. It is still developer focused, although specialist developers known as system testers are normally employed to do it. In essence System testing is not about checking the individual parts of the design, but about checking the system as a whole. In effect it is one giant component. System testing can involve a number of specialist types of test to see if all the functional and non-functional requirements have been met. In addition to functional requirements these may include the following types of testing for the non-functional requirements: 1). Performance- Are the performance criteria met? 2). Volume- Can large volumes of information be handled? 3). Stress- Can peak volumes of information be handled? 4). Documentation- Is the documentation usable for the system? 5). Robustness- Does the system remain stable under adverse circumstances? There are many others, the needs for which are dictated by how the system is supposed to perform. Acceptance Testing Acceptance testing checks the system against the Requirements. It is similar to system testing in that the whole system is checked but the important difference is the change in focus: System testing checks that the system that was specified has been delivered. Acceptance testing checks that the system delivers what was requested. The customer and not the developer should always do acceptance testing. The customer knows what is required from the system to achieve value in the 54

business and is the only person qualified to make that judgment. The forms of tests may follow those in system testing, but at all times they are informed by the business needs. Release Testing Even if a system meets all its requirements, there is still a case to be answered that it will benefit the business. Release testing is about seeing if the new or changed system will work in the existing business environment. Mainly this means the technical environment, and checks concerns such as: 1). Does it affect any other systems running on the hardware? 2). Is it compatible with other system? 3). Does it have acceptable performance under load? These tests are usually run by the computer operations team in a business. It would appear obvious that the operation team should be involved right from the start of a project to give their opinion of a new system may have. Test Case Design Test case design focuses on a set of techniques for the creation of test cases that meet overall testing objectives. In test case design phase, the engineer creates a series of test cases that are intended to demolish the software that has been built. Any software product can be tested in one of two ways: 1). Knowing the specific function that a product has been designed to perform, tests can be conducted that demonstrate each function is fully operational, at the same time searching for errors in each function. This approach is known as Black Box Testing. 2). Knowing the internal workings of a product, tests can be conducted to ensure that internal operation performs according to specifications and all internal components have been adequately exercised. This approach is known as White Box Testing. 55

Black box testing is designed to uncover errors. It is used to demonstrate that software functions are operational; that input is properly accepted and output is correctly produced; and that the integrity of external information is maintained. A black box test examines some fundamental aspects of a system with little regard for the internal logical structure of the software. White box testing of software is predicated on close examination of procedural details. Logical paths through the software are tested by providing test cases that exercise specific sets of conditions and/or loops. The state of the program may be examined at various points to determine whether the expected or asserted status corresponds to the actual status. A small illustration follows.
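As a small illustration of the black box approach, the sketch below exercises a validation routine purely through its inputs and outputs, with no reference to its internal structure. The RequisitionRules class and its 1-to-999 quantity rule are assumptions invented for this example, not part of the actual system.

using System;

class RequisitionRules
{
    // Hypothetical rule: a requisition quantity must lie between 1 and 999.
    public static bool ValidateQuantity(int qty)
    {
        return qty >= 1 && qty <= 999;
    }
}

class BlackBoxTests
{
    static void Main()
    {
        // Black box cases are chosen from the specification (boundaries and
        // representative values), not from the code's internal structure.
        Check(RequisitionRules.ValidateQuantity(1), "lower boundary accepted");
        Check(RequisitionRules.ValidateQuantity(999), "upper boundary accepted");
        Check(!RequisitionRules.ValidateQuantity(0), "below range rejected");
        Check(!RequisitionRules.ValidateQuantity(1000), "above range rejected");
        Console.WriteLine("All black box cases passed.");
    }

    static void Check(bool condition, string name)
    {
        if (!condition) throw new Exception("Failed: " + name);
    }
}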

5.1.3 Testing the Purchase Management System

The testing phase is a very important phase in software development, so it was fully kept in mind while developing this software. For this software, testing has been done in the following areas and manner:

5.1.3.1) Functional Testing
According to the needs of the software, the following testing plans have been carried out on a sample of test data. Hypothetical data is used to test the system before implementation. Some temporary user ids are created to check the validity and authenticity of the users. Various constraints are checked for their working. A demo case will be taken with dummy data for new users.

5.1.3.2) Security Testing
User ids and passwords are checked and verified for secure login and access. It will be demonstrated that two different login sessions have different permissions on the menu items. In case a user forgets his password, the administrator has the right to change the password or allocate a new one. A minimal sketch of such a credential check follows.
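The following is a minimal sketch of how such a credential check might be written. The user_master table, its columns, and the connection string are illustrative assumptions, not necessarily the names used in the actual system; storing a salted hash rather than the raw password follows the guideline that passwords should never be kept in clear text.

using System;
using System.Data.SqlClient;
using System.Security.Cryptography;
using System.Text;

class LoginCheck
{
    // Hash the supplied password so the raw text is never stored or compared.
    static string Hash(string password, string salt)
    {
        byte[] data = Encoding.UTF8.GetBytes(salt + password);
        byte[] digest = SHA256.Create().ComputeHash(data);
        return Convert.ToBase64String(digest);
    }

    // Returns true when the stored hash matches the hash of the supplied password.
    public static bool Verify(string userId, string password)
    {
        using (SqlConnection con = new SqlConnection(
            "Data Source=.;Initial Catalog=PMS;Integrated Security=SSPI"))
        {
            con.Open();
            SqlCommand cmd = new SqlCommand(
                "select password_salt, password_hash from user_master where user_id = @id", con);
            cmd.Parameters.AddWithValue("@id", userId);
            using (SqlDataReader dr = cmd.ExecuteReader())
            {
                if (!dr.Read()) return false; // unknown user
                string salt = dr.GetString(0);
                string stored = dr.GetString(1);
                return stored == Hash(password, salt);
            }
        }
    }
}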

5.1.3.3) Performance Testing

Based on field conditions, the following tests for fine-tuning can be carried out at a later date:

Peak load testing
Storage testing
Performance time testing
Recovery testing

The testing team will take over the project after the initial unit testing, which will mark the completion of the project.

5.2 Debugging and Code improvement


Debugging
The purpose of debugging is to locate and fix the offending code responsible for a symptom violating a known specification. Debugging typically happens during three activities in software development, and the level of granularity of the analysis required for locating the defect differs in these three. The first is during the coding process, when the programmer translates the design into executable code. During this process the errors made by the programmer in writing the code can lead to defects that need to be quickly detected and fixed before the code goes to the next stages of development.

Most often, the developer also performs unit testing to expose any defects at the module or component level. The second place for debugging is during the later stages of testing, involving multiple components or a complete system, when unexpected behavior such as wrong return codes or abnormal program termination (abends) may be found. A certain amount of debugging of the test execution is necessary to conclude that the program under test is the cause of the unexpected behavior and not the result of a bad test case due to an incorrect specification, inappropriate data, or changes in functional specification between different versions of the system. Once the defect is confirmed, debugging of the program follows, and the misbehaving component and the required fix are determined. The third place for debugging is in production or deployment, when the software under test faces real operational conditions. Some undesirable aspects of software behavior, such as inadequate performance under a severe workload or unsatisfactory recovery from a failure, get exposed at this stage, and the offending code needs to be found and fixed before large-scale deployment. This process may also be called problem determination, due to the enlarged scope of the analysis required before the defect can be localized.

Code Improvement
The process of optimization starts from the designing stage itself and continues till the deployment and distribution stage.

Optimizing speed
In order to optimize the speed of the application the following techniques are used:
Use of appropriate data types
Assigning property values to variables
Using early binding instead of late binding

Use of appropriate data types
The use of appropriate data types optimizes the execution speed of the application. Too many implicit data type conversions slow down execution. The variant data type has been avoided in the application, as it causes the application to run slowly.

Assigning property values to variables
Accessing a value from a variable is 10 to 20 times faster than accessing it from a property, because accessing a value from a property involves a call to an object; there is no such overhead for a variable. This is used to optimize the execution speed of the application. An unoptimized version of the code would have been:

k = Convert.ToString(Convert.ToInt32(k) + 1);

The optimized code is as follows:

c1.cmd.CommandText = "select top 1 city_code from city_master order by city_code desc";
c1.adp.Fill(c1.ds, "code");
// Take the first fetched row and read its value once into a local variable.
DataRow dr = c1.ds.Tables["code"].Rows[0];
string b = dr["city_code"].ToString();
k = Convert.ToString(Convert.ToInt32(b) + 1);

Using Early Binding instead of Late Binding
A client interacts with a component using its properties and methods. In order to access the properties and methods of a component, the client needs to be bound to the component. The process of associating a client with a component is binding. When you implement early binding between a client's call and a component method, the method called is determined at compile time, i.e. the call is associated with the appropriate method during compilation. Early binding provides:
Performance (speed)
Syntax checking at compile time
Display of objects in the Object Browser window
Provision of help in the Object Browser
A short sketch contrasting the two binding styles follows.
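As an illustration of the difference, the sketch below calls the same method first through early binding (a typed call resolved at compile time) and then through late binding (resolved at run time via reflection). The Calculator class is a made-up stand-in for a component, not part of the actual system.

using System;
using System.Reflection;

class Calculator
{
    public int Add(int a, int b) { return a + b; }
}

class BindingDemo
{
    static void Main()
    {
        // Early binding: the compiler verifies Add exists and resolves the call.
        Calculator calc = new Calculator();
        int early = calc.Add(2, 3);

        // Late binding: the method is looked up by name at run time,
        // so a misspelled name is only caught when the program executes.
        object obj = new Calculator();
        MethodInfo m = obj.GetType().GetMethod("Add");
        int late = (int)m.Invoke(obj, new object[] { 2, 3 });

        Console.WriteLine("early={0}, late={1}", early, late);
    }
}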


Optimizing the Display Speed
In order to display a form frequently, the Hide and Show methods are used instead of the Unload and Load events. Loading and unloading a form involves memory overhead and is therefore slower.

Optimizing the Memory
The application has been optimized to occupy the least amount of memory and still give good performance. In order to optimize memory, the data held in a string is reclaimed by setting the string to an empty string ("").

Compiling to Native Code
Native code compilation offers more options for performance tuning than are available with P-code compilation. We can use one of the following options for compiling the application:
1) Optimize for fast code
2) Optimize for small code
3) Favor Pentium Pro(tm)

Optimize for fast code
This is the best option when the application is optimized for speed and there is ample storage space on the disk. The compiler compiles the code for faster execution. It examines the program structure and changes some of it, resulting in optimized code but an enlarged executable file. This is the default compile option in Visual Studio 2005.

Optimize for small code
This is the best option when one is concentrating on hard disk space and not speed. The compiler creates the smallest possible compiled code, occupying less disk space but probably slower in execution.

Advanced Optimization Options
The following advanced optimization techniques were used:


Assume No Aliasing
Aliasing provides a name that refers to a memory location already referred to by a different name. Selecting this option allows the compiler to apply optimizations that it could not otherwise apply.

Remove Array Bounds Checks
Selecting this option can also optimize application speed. The compiler by default checks array indexes against their dimensions and reports an error if an index is out of bounds. As the arrays used here are sure not to go out of bounds, choosing this option optimizes speed and thus gives faster code.


6. System Security measures


Security
Prompting the user for a user id and password in our application is a potential security threat, so credential information transferred from the browser to the server is encrypted. Cookies are an easy and useful way to keep user-specific information available. However, because cookies are sent to the browser's computer, they are vulnerable to spoofing or other malicious use, so we follow these guidelines:
Do not store any critical information in cookies. For example, do not store a user's password in a cookie, even temporarily.
Avoid permanent cookies if possible.
Consider encrypting information in cookies.
Set expiration dates on cookies to the shortest practical time we can.
A sketch of these guidelines in code follows.
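The sketch below shows how these guidelines might look in ASP.NET code. The cookie name and the Protect helper are illustrative assumptions; Protect merely stands in for whatever encryption routine the application actually uses.

using System;
using System.Text;
using System.Web;

public class CookieHelper
{
    // Issues a short-lived cookie that follows the guidelines above.
    public static void IssueSessionCookie(HttpResponse response, string userId)
    {
        HttpCookie cookie = new HttpCookie("pms_user");

        // Store an encoded token, never the password or other critical data.
        cookie.Value = Protect(userId);

        // Keep the lifetime as short as is practical; no permanent cookie.
        cookie.Expires = DateTime.Now.AddMinutes(20);

        // Keep the cookie out of reach of client-side script.
        cookie.HttpOnly = true;

        response.Cookies.Add(cookie);
    }

    // Stand-in for the application's encryption routine. Base64 is only an
    // encoding, not encryption; it is used here just to keep the sketch
    // self-contained.
    private static string Protect(string value)
    {
        return Convert.ToBase64String(Encoding.UTF8.GetBytes(value));
    }
}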

6.1 Database/data security


A Database Security Strategy
Much attention has been focused on network attacks by crackers and how to stop them, but the vulnerability of data inside the database is somewhat overlooked. Databases are far too critical to be left unsecured or incorrectly secured. Most companies implement only perimeter-based security solutions, even though the greatest threats are from internal sources, and information is more often the target of an attack than network resources. The best security practices protect sensitive data as it is transferred over the network (including internal networks) and when it is at rest. One option for accomplishing this protection is to selectively parse data after the secure communication is terminated and encrypt sensitive data elements at the SSL/Web layer. Doing so allows enterprises to choose, at a very granular level (usernames, passwords, and so on), the sensitive data to secure throughout the enterprise.


Application-layer encryption and mature database-layer encryption solutions allow enterprises to selectively encrypt granular data into a format that can easily be passed between applications and databases without changing the data. The focus here is on database-layer encryption.

Data encryption
The sooner data encryption occurs, the more secure the information is. Due to distributed business logic in application and database environments, organizations must be able to encrypt and decrypt data at different points in the network and at different system layers, including the database layer. Encryption performed by the DBMS can protect data at rest, but you must decide whether you also require protection for data while it is moving between the applications and the database and between different applications and data stores. Sending sensitive information over the Internet or within your corporate network as clear text defeats the point of encrypting the text in the database to provide data privacy. A sketch of application-layer encryption before storage follows.
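As a minimal sketch of application-layer encryption, the snippet below encrypts a sensitive value before it is written to the database. The key and IV parameters are placeholders for illustration only; a real deployment would manage keys securely rather than passing them around in code.

using System;
using System.IO;
using System.Security.Cryptography;
using System.Text;

public class ColumnCipher
{
    // Encrypt a sensitive column value before storing it in the database.
    public static byte[] Encrypt(string plainText, byte[] key, byte[] iv)
    {
        using (RijndaelManaged aes = new RijndaelManaged())
        using (ICryptoTransform enc = aes.CreateEncryptor(key, iv))
        using (MemoryStream ms = new MemoryStream())
        {
            using (CryptoStream cs = new CryptoStream(ms, enc, CryptoStreamMode.Write))
            {
                byte[] data = Encoding.UTF8.GetBytes(plainText);
                cs.Write(data, 0, data.Length);
            }
            return ms.ToArray();
        }
    }
}

The returned byte array can then be bound to a varbinary parameter when the row is written, so the clear text never reaches the database.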

6.2 Creation of User profiles and access rights


Determining user profiles and their privilege domains contributes to the creation of a personalized software experience. Effective software must present only those functions and content that are relevant to a given user and within the user's domain of privilege. These must also reflect the specific grains relevant to the user. Application personalization requires the establishment of a three-dimensional framework inclusive of the following:
1. User groups and hierarchies
2. Privilege domain
3. Content domain


Application personalization is an intersection of this three-dimensional framework.

User Groups and Hierarchies
The main purpose of developing user groups and hierarchies is to avoid the repetitive task of allocating certain sets of privilege and content access to each individual user. Establishing such groupings allows allocating a set of privileges to all users within a given group with a single stroke of a software command. Similarly, through inheritance, multiple user groups within the hierarchy may be allocated a set of privileges with a single click.

Privilege Domain
These privileges are essentially functions that may be performed within a dashboard software program. Therefore, a privilege is simply a software function, and each privilege domain is a unique collection of those functions. Often, a set of privileges is collectively referred to as a role; for example, a list of privileges could be grouped into three sets, with each set associated to a role.

Content Domain
The user groups are defined and the roles have been assigned, but still unanswered is the question: what data and KPIs does a user see on a dashboard?

Answering this question leads us to the issue of content domain, the parameters of which would define the KPI granularity, the reports, and the alerts for each dashboard user. Managing content domain involves two aspects: (1) security and (2) relevance. Security refers to the restriction of information delivery only to those with the privilege to access certain information. Information is inherently confidential, and every organization has its boundaries regarding who may access what information. The security framework must be created during a dashboard deployment, determining the permissions and restrictions on the content domain of each user. Relevance refers to the filtering of the most relevant content to a given dashboard user. From all of the permitted information for a given user, an effective dashboard must present the most relevant content, with flexibility for the user to access more information as needed. A sketch of a simple role-to-privilege mapping appears below.
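The sketch below shows one minimal way to model groups, roles, and privilege checks in code. The role and privilege names are invented for illustration and are not taken from the actual system.

using System;
using System.Collections.Generic;

public class PrivilegeDomain
{
    // Each role maps to the set of functions (privileges) it may perform.
    private static Dictionary<string, List<string>> roles =
        new Dictionary<string, List<string>>();

    static PrivilegeDomain()
    {
        // Illustrative roles; real roles would be loaded from the database.
        roles["Administrator"] = new List<string>(new string[] {
            "ApproveRequisition", "SendEnquiry", "ApproveQuotation", "ClosePO" });
        roles["DepartmentUser"] = new List<string>(new string[] {
            "EnterRequisition", "ViewOwnRequisitions" });
    }

    // A user inherits every privilege of every role assigned to them.
    public static bool HasPrivilege(IEnumerable<string> userRoles, string privilege)
    {
        foreach (string role in userRoles)
        {
            List<string> privs;
            if (roles.TryGetValue(role, out privs) && privs.Contains(privilege))
                return true;
        }
        return false;
    }
}

With this model, HasPrivilege(new string[] { "DepartmentUser" }, "EnterRequisition") returns true, while the same check for "ApproveQuotation" fails.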


7. Cost Estimation of the Project


For a given set of requirements it is desirable to know how much it will cost to develop the software to satisfy the given requirements, and how much time development will take. These estimates are needed before development is initiated. The primary reason for cost and schedule estimation is to enable the client or developer to perform a cost-benefit analysis and to support project monitoring and control. A more practical use of these estimates is in bidding for software projects, where the developers must give cost estimates to a potential client for the development contract. For a software development project, detailed and accurate cost and schedule estimates are essential prerequisites for managing the project. Otherwise, even simple questions such as "Is the project late?", "Are there cost overruns?", and "When is the project likely to complete?" cannot be answered. Cost and schedule estimates are also required to determine the staffing level for a project at different phases. It can be safely said that cost and schedule estimates are fundamental to any form of management and are generally always required for a project. Cost in a project is due to the requirements for software, hardware, and human resources. Hardware resources are such things as the computer time, terminal time, and memory required for the project, whereas software resources include the tools and compilers needed during development. The bulk of the cost of software development is due to the human resources needed, and most cost estimation procedures focus on this aspect. Most cost estimates are determined in terms of person-months (PM). By properly including the overheads in the rupee cost of a person-month, besides the direct cost of the person-month, most costs for a project can be incorporated by using PM as the basic measure. Estimates can be based on the subjective opinion of some person or determined through the use of models. The costs associated with the system are the expenses, outlays or losses arising from developing and using the system, while the benefits are the advantages received from installing and using it. Costs and benefits can be classified as follows:


a) Tangible or intangible
Costs that are known to exist but whose financial value cannot be exactly measured are referred to as intangible costs. The estimate is only an approximation, and it is difficult to fix exact intangible costs. For example, employee morale problems caused by installing a new system are an intangible cost; how much the morale of an employee has been affected cannot be exactly measured in terms of financial value.

b) Fixed or variable
Some costs and benefits remain constant regardless of how a system is used. Fixed costs are considered sunk costs: once encountered, they will not recur. For example, the purchase of equipment for a computer center is a fixed cost, as it remains constant whether the equipment is used extensively or not; similarly the insurance, the purchase of software, etc. In contrast, variable costs are incurred on a regular basis. They are generally proportional to work volume and continue as long as the system is in operation. For example, the cost of forms varies in proportion to the amount of processing or the length of the reports desired.

c) Direct or indirect
Direct costs are those which are directly associated with a system; they are applied directly to the operation. For example, the purchase of a box of floppies for Rs 500/- is a direct cost because we can associate the floppy box with the money spent. Direct benefits also can be specifically attributed to a given project. For example, a new system that can process 30 percent more transactions per day is a direct benefit. Indirect costs are not directly associated with a specific activity in the system; they are often referred to as overhead expenses.


For example, the cost of space to install a system, maintenance of the computer center, and heat, light, and air-conditioning are all tangible costs, but it is difficult to calculate the proportion of each attributable to a specific activity such as a report.

Estimating the cost of the project is a difficult task, but it can be done by various methods. Here COCOMO (the Constructive Cost Model) is used. The model has the following hierarchy:

Model 1: The basic COCOMO model computes software development effort and cost as a function of program size expressed in estimated lines of code.

Model 2: The intermediate COCOMO model computes software development effort as a function of program size and a set of cost drivers that include subjective assessments of product, hardware, personnel and project attributes.

Model 3: The advanced COCOMO model incorporates all characteristics of the intermediate version with an assessment of the cost drivers' impact on each step (analysis, design, etc.) of the software engineering process.

The COCOMO model is defined for three classes of software projects:

1) Organic Mode: Relatively small, simple projects in which small teams with good application experience work to a set of less than rigid requirements.
2) Semidetached Mode: An intermediate (in size and complexity) software project in which teams with mixed experience levels must meet a mix of rigid and less than rigid requirements.
3) Embedded Mode: A software project that must be developed within a set of tight hardware, software and operational constraints.

The basic COCOMO equations take the form:

E = ab (KLOC)^bb
D = cb (E)^db

where E is the effort applied in person-months, D is the development time in chronological months, and KLOC is the estimated number of delivered lines of code for the project (expressed in thousands). The coefficients ab and cb and the exponents bb and db were taken as follows.


This project is an organic mode project, so:

ab = 2.4
bb = 1.05
cb = 2.5
db = 0.38

The number of lines of code in the project is LOC = 8000, so KLOC = 8000/1000 = 8.

E = 2.4 (KLOC)^1.05 = 2.4 (8)^1.05 = 21.30, i.e. approximately 21 person-months

Now we calculate D, the development time in chronological months:

D = 2.5 (E)^0.38 = 2.5 (21)^0.38 = 7.95, i.e. approximately 8 months

To compute the recommended staffing we use the effort and duration estimated above:

N = E / D = 21 / 8, i.e. approximately 3 persons

where N is the recommended number of people for the project. The same computation is sketched in code below.
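The basic COCOMO computation above can be reproduced with a few lines of code; this is a minimal sketch using the organic-mode coefficients quoted above.

using System;

class Cocomo
{
    static void Main()
    {
        // Organic-mode coefficients of the basic COCOMO model.
        double ab = 2.4, bb = 1.05, cb = 2.5, db = 0.38;

        double kloc = 8.0; // 8000 lines of code

        double effort = ab * Math.Pow(kloc, bb);      // person-months
        double duration = cb * Math.Pow(effort, db);  // chronological months
        double people = effort / duration;            // recommended staffing

        Console.WriteLine("E = {0:F2} person-months", effort);
        Console.WriteLine("D = {0:F2} months", duration);
        Console.WriteLine("N = {0:F1} persons", people);
    }
}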


8. Reports


9. PERT Chart, Gantt Chart


9.1 PERT Chart
The Program Evaluation and Review Technique (PERT) chart is mainly used for high-risk projects with various estimation parameters. For each module in a project, duration is estimated as follows:
1. Time taken to complete a project or module under normal conditions, tnormal.
2. Time taken to complete a project or module with minimum time, tmin.
3. Time taken to complete a project or module with maximum time, tmax.
4. Time taken to complete a project from previous related history, thistory.
An average of tnormal, tmin, tmax and thistory is taken, depending upon the project.

[PERT chart: the network covers user requirement and analysis, SRS and design, buying hardware, programming, installation, writing manuals, training, alpha testing, beta testing, and the user test, with estimated durations marked on the connecting edges.]

Figure 56: PERT chart for the Online Purchase Management System

9.2 Gantt Chart


A Gantt chart is also known as a timeline chart. A Gantt chart can be developed for the entire project, or a separate chart can be developed for each function. A tabular form is maintained where rows indicate the tasks with milestones and columns indicate duration (weeks/months). The horizontal bars that span the columns indicate the duration of each task.

[Gantt chart: the tasks Requirement Gathering, Design, Test Cases, Coding, Testing, and Build are plotted against the months June through September.]

Figure 57 : Gantt Chart


10. Future scope and further enhancement of the Project


10.1 Further Scope
Testing a system proposal for its workability, its impact on the organization, its ability to meet users' needs, and its effective use of resources focuses on the following three major questions:
What are the user's demonstrable needs, and how does the candidate system meet them?
What resources are available for the given system? Is the problem worth solving?
What is the likely impact of the system on the organization?
Each of these questions has to be answered carefully. They revolve around investigation and evaluation of the problem: identification and description of candidate systems, specification of the performance and cost of each system, and the final selection of the best system.

End User Support
1. The proposed system is developed in ASP.NET and SQL Server.
2. If the organization adds users, it just has to add more machines and install the software, which is in the form of an exe, on them.

Security
Security features are implemented. There is no unauthorized access to the package, as security is implemented through login and password. Last but not least, one of the most important advantages of the Purchase Management System is that it can be used in any government or public organization to process and manage the working of its purchase department, with slight modifications.


There is no doubt that there always remains some scope for improvement. The important thing is that the system developed should be flexible enough to accommodate any future enhancements. This system can accommodate such enhancements without rewriting the existing code.


10.2 Further Enhancement of the Project


Everything that is made leaves something to be added to make it better. The project Online Purchase Management System also falls in the same domain. Although an attempt has been made to develop a robust and fault-free system, enough flexibility has still been provided for further enhancements and modifications. As mentioned earlier, the designed forms are typically reflections of the developer, so I strongly believe that enhancement of the project can be achieved through design changes and coding changes. At the same time, since no one can claim to be a master of the technology, there is always some scope for technical modifications in the project that may reduce code redundancy and minimize storage space. Since data is retrieved from the tables and everything is based on the coding system, if the coding system is changed then the system needs to be redesigned. The number of queries can always be increased when needed by the user just by modifying the code a little; full concentration has been kept on the design of the system so that it can be easily modified. The design of the system can be changed in the sense of the flow of control so that the coding can be decreased to a considerable level. The developed sub-modules have all the related features, but improvements can still be made. The developed package is flexible enough to incorporate modifications or enhancements with little alteration. The Purchase Management System can easily be incorporated into an ERP system, as it is in itself a module separate from the other modules. In future, web-enabled features can also be included in the software so that the information can be retrieved globally.


11. Bibliography
Black Book on ASP.NET
Microsoft SQL Server 2000 in 21 Days by Richard Waymire and Rick Sawtell
Software Engineering by Roger S. Pressman
Software Engineering: An Integrated Approach by Pankaj Jalote

Referenced Sites
www.msdn.microsoft.com
www.w3schools.com
www.vb.netcode.com
www.microsoft.com


12. Appendices
12.1 Introduction to Visual Studio.net

Visual Studio .NET is a complete set of development tools for building ASP Web applications, XML Web services, desktop applications, and mobile applications. Visual Basic .NET and Visual C++ .NET all use the same integrated development environment (IDE), which allows them to share tools and facilitates the creation of mixed-language solutions. In addition, these languages leverage the functionality of the .NET Framework, which provides access to key technologies that simplify the development of ASP Web applications and XML Web services. The architecture is explained from bottom to top in the following discussion:

VB | C++ | C# | JScript
Common Language Specification
ASP.NET: Web Services and Web Forms | Windows Forms
ADO.NET: Data and XML
Base Classes
Common Language Runtime


1. At the bottom of the architecture is the Common Language Runtime. The common language runtime loads and executes code that targets the runtime. This code is therefore called managed code.
2. The .NET Framework provides a rich set of class libraries. These include base classes, like networking and input/output classes, a data library for data access, and classes for use by programming tools, such as debugging services.
3. ADO.NET is Microsoft's ActiveX Data Objects (ADO) model for the .NET Framework. ADO.NET is intended specifically for developing web applications.
4. The fourth layer of the framework consists of the Windows application model and, in parallel, the Web application model. The Web application model, shown in the diagram as ASP.NET, includes Web Forms and Web Services. ASP.NET comes with built-in Web Forms controls, which are responsible for generating the user interface. They mirror typical HTML widgets like text boxes or buttons.
5. One of the obvious themes of .NET is unification and interoperability between various programming languages. In order to achieve this, certain rules must be laid down and all the languages must follow these rules.
6. The CLR and the .NET Framework in general are designed in such a way that code written in one language can seamlessly be used by another language. Hence ASP.NET can be programmed in any of the .NET compatible languages, whether VB.NET, C#, Managed C++ or JScript.NET.

The .NET Framework
The .NET Framework is a multi-language environment for building, deploying, and running XML Web services and applications. It consists of three main parts:


Common Language Runtime
Despite its name, the runtime actually has a role in both a component's development time and its run time. While a component is running, the runtime is responsible for managing memory allocation, starting up and stopping threads and processes, and enforcing security policy, as well as satisfying any dependencies that the component might have on other components. The Common Language Runtime is the execution engine for .NET Framework applications. It provides a number of services, including the following:
code management (loading and execution);
application memory isolation;
verification of type safety;
conversion of IL (platform-independent code generated by compilers) to native, platform-dependent code;
access to metadata (enhanced type information);
managing memory for managed objects;
enforcement of code access security;
exception handling, including cross-language exceptions;
interoperation between managed code, COM objects, and preexisting DLLs (unmanaged code and data);
automation of object layout;
support for developer services (profiling, debugging, and so on).

Unified programming classes
The Framework provides developers with a unified, object-oriented, hierarchical, and extensible set of class libraries (APIs). Developers use the Windows Foundation classes.

The .NET Framework class library
The .NET Framework includes classes, interfaces, and value types that are used in the development process and provide access to system functionality. To facilitate interoperability between languages, the .NET Framework types

are Common Language Specification (CLS) compliant and can therefore be used from any programming language where the compiler conforms to the CLS. The .NET Framework types are the foundation on which .NET applications, components, and controls are built. The .NET Framework includes types that perform the following functions:

represent base data types and exceptions;
encapsulate data structures;
perform I/O operations;
access information about loaded types via reflection;
invoke .NET Framework security checks;
provide data access, rich client-side GUI, and server-controlled, client-side GUI.

The .NET Framework provides a rich set of interfaces, as well as abstract and concrete (non-abstract) classes. You can use the concrete classes as is or, in many cases, derive your own classes from them, as well as from abstract classes. To use the functionality of an interface, you can either create a class that implements the interface or derive a class from one of the .NET Framework classes that implements the interface. Microsoft, with the help of Hewlett-Packard and Intel, supplied the OS-independent subset of the .NET class library to the ECMA standardization board.

Objectives of the .NET Framework
The .NET Framework is designed to fulfill the following objectives:
To provide a consistent object-oriented programming environment whether object code is stored and executed locally, executed locally but Internet-distributed, or executed remotely.
To provide a code-execution environment that minimizes software deployment and versioning conflicts.
To provide a code-execution environment that guarantees safe execution of code, including code created by an unknown or semi-trusted third party.
To provide a code-execution environment that eliminates the performance problems of scripted or interpreted environments.
To make the developer experience consistent across widely varying types of applications, such as Windows-based applications and Web-based applications.
To build all communication on industry standards to ensure that code based on the .NET Framework can integrate with any other code.

Server Application Development
Server-side applications in the managed world are implemented through runtime hosts. Unmanaged applications host the common language runtime, which allows your custom managed code to control the behavior of the server. This model provides all the features of the common language runtime and class library while gaining the performance and scalability of the host server.

ADO.NET
ADO.NET is a set of libraries included with the Microsoft .NET Framework that help you communicate with various data stores from .NET applications. The ADO.NET libraries include classes for connecting to a data source, submitting queries, and processing results. You can also use ADO.NET as a robust, hierarchical, disconnected data cache to work with data offline. The central disconnected object, the DataSet, allows you to sort, search, filter, store pending changes, and navigate through hierarchical data. The DataSet also includes a number of features that bridge the gap between traditional data access and XML development. Developers can now work with XML data through traditional data access interfaces and vice versa. Microsoft Visual Studio .NET includes a number of data access features you can use to build data access applications. Many of these features can save you time during the development process by generating large amounts of tedious code for you. Other features improve the performance of the applications you build by storing metadata and updating logic in your code rather than fetching this information at run time. Believe it or not, many of Visual Studio .NET's data access features accomplish both tasks. A short example of the connected and disconnected objects working together follows.
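As a minimal sketch of this model, the snippet below opens a connection, fills a disconnected DataSet through a SqlDataAdapter, and reads the rows offline. The connection string and the vendor_master table are illustrative assumptions, not necessarily the names used in the actual system.

using System;
using System.Data;
using System.Data.SqlClient;

class AdoNetDemo
{
    static void Main()
    {
        string connStr = "Data Source=.;Initial Catalog=PMS;Integrated Security=SSPI";

        DataSet ds = new DataSet();
        using (SqlConnection con = new SqlConnection(connStr))
        {
            // The adapter opens and closes the connection as needed
            // and fills the disconnected DataSet cache.
            SqlDataAdapter adp = new SqlDataAdapter(
                "select vendor_code, vendor_name from vendor_master", con);
            adp.Fill(ds, "vendors");
        }

        // Work with the data offline: the connection is already closed.
        foreach (DataRow row in ds.Tables["vendors"].Rows)
        {
            Console.WriteLine("{0}: {1}", row["vendor_code"], row["vendor_name"]);
        }
    }
}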

The ADO.NET Object Model
Now that you understand the purpose of ADO.NET and where it fits into the overall Visual Studio .NET architecture, it is time to take a closer look at the technology. Here we take a brief look at the ADO.NET object model and see how it differs from past Microsoft data access technologies. ADO.NET is designed to help developers build efficient multi-tiered database applications across intranets and the Internet, and the ADO.NET object model provides the means. The object model divides into two halves: connected objects, which communicate directly with your database to manage the connection and transactions as well as to retrieve data from and submit changes to your database, and disconnected objects, which allow a user to work with data offline.

.NET Data Providers
A .NET data provider is a collection of classes designed to allow you to communicate with a particular type of data store. The .NET Framework includes two such providers, the SQL Client .NET Data Provider and the OLE DB .NET Data Provider. The OLE DB .NET Data Provider lets you communicate with various data stores through OLE DB providers. The SQL Client .NET Data Provider is designed solely to communicate with SQL Server databases, version 7 and later. Each .NET data provider implements the same base classes (Connection, Command, DataReader, Parameter, and Transaction), although their actual names depend on the provider. For example, the SQL Client .NET Data Provider has a SqlConnection object, and the OLE DB .NET Data Provider includes an OleDbConnection object. Regardless of which .NET data provider you use, the provider's Connection object implements the same basic features through the same base interfaces. To open a connection to your data store, you create an instance of the provider's connection object, set the object's ConnectionString property, and then call its Open method. Each .NET data provider has its own namespace. The two providers included in the .NET Framework are subsets of the System.Data namespace, where the disconnected objects reside. The OLE DB .NET Data Provider resides in the System.Data.OleDb namespace, and the SQL Client .NET Data Provider resides in System.Data.SqlClient.

Namespaces
A namespace is a logical grouping of objects. The .NET Framework is large, so to make developing applications with the .NET Framework a little easier, Microsoft has divided the objects into different namespaces. The most important reason for using namespaces is to prevent name collisions in assemblies. With different namespaces, programmers working on different components combined into a single solution can use the same names for different items. Since these names are separated, they do not interfere with each other at compile time. A more practical reason for namespaces is that grouping objects can make them easier to locate.

Crystal Reports
Crystal Reports for Visual Studio .NET is the standard reporting tool for Visual Studio .NET: it brings the ability to create interactive, presentation-quality content, which has been the strength of Crystal Reports for years, to the .NET platform. Crystal Reports for Visual Studio .NET is an integrated component of the Visual Studio .NET development environment.

12.2 Microsoft SQL Server
Microsoft SQL Server 2000 is designed to work effectively as:

A central database on a server shared by many users who connect to it over a network. The number of users can range from a handful in one workgroup, to thousands of employees in a large enterprise, to hundreds of thousands of Web users.

A desktop database that services only applications running on the same desktop.

Server Database Systems
Server-based systems are constructed so that a database on a central computer, known as a server, is shared among multiple users. Users access the server through an application:

In a multi-tier system, such as Windows DNA, the client application logic is run in two or more locations:

A thin client is run on the user's local computer and is focused on displaying results to the user.
The business logic is located in server applications running on a server.
Thin clients request functions from the server application, which is itself a multithreaded application capable of working with many concurrent users. The server application is the one that opens connections to the database server. This is a typical scenario for an Internet application. For example, a multithreaded server application can run on a Microsoft Internet Information Services (IIS) server and service thousands of thin clients running on the Internet or an intranet. The server application uses a pool of connections to communicate with one or more instances of SQL Server 2000. The instances of SQL Server 2000 can be on the same computer as IIS, or they can be on separate servers in the network.

In a two-tier client/server system, users run an application on their local computer, known as a client application, which connects over a network to an instance of SQL Server 2000 running on a server computer. The client application runs both business logic and the code to display output to the user, so this is sometimes referred to as a thick client.

Advantages of a Server Database System
Having data stored and managed in a central location offers several advantages:

Each data item is stored in a central location where all users can work with it.
Business and security rules can be defined one time on the server and enforced equally among all users.
A relational database server optimizes network traffic by returning only the data an application needs.
Hardware costs can be minimized.
Maintenance tasks such as backing up and restoring data are simplified because they can focus on the central server.

Advantages of SQL Server 2000 as a Database Server
Microsoft SQL Server 2000 is capable of supplying the database services needed by extremely large systems. Large servers may have thousands of users connected to an instance of SQL Server 2000 at the same time. SQL Server 2000 has full protection for these environments, with safeguards that prevent problems such as having multiple users trying to update the same piece of data at the same time. SQL Server 2000 also allocates the available resources effectively, such as memory, network bandwidth, and disk I/O, among the multiple users. Extremely large Internet sites can partition their data across multiple servers, spreading the processing load across many computers and allowing the site to serve thousands of concurrent users. Multiple instances of SQL Server 2000 can be run on a single computer. For example, an organization that provides database services to many other organizations can run a separate instance of SQL Server 2000 for each customer organization, all on one computer. This isolates the data for each customer organization, while allowing the service organization to reduce costs by only having to administer one server computer. SQL Server 2000 applications can run on the same computer as SQL Server 2000. The application connects to SQL Server 2000 using Windows Interprocess Communications (IPC) components, such as shared memory, instead of a network.


This allows SQL Server 2000 to be used on small systems where an application must store its data locally.

[Illustration: an instance of SQL Server 2000 operating as the database server for both a large Web site and a legacy client/server system.]

The largest Web sites and enterprise-level data processing systems often generate more database processing than can be supported on a single computer. In these large systems, the database services are supplied by a group of database servers that form a database services tier. SQL Server 2000 supports a mechanism that can be used to partition data across a group of autonomous servers. Although each server is administered individually, the servers cooperate to spread the database-processing load across the group.

What's New in Microsoft SQL Server 2000
Microsoft SQL Server 2000 extends the performance, reliability, quality, and ease of use of Microsoft SQL Server version 7.0. Microsoft SQL Server 2000 includes several new features that make it an excellent database platform for large-scale online transaction processing (OLTP), data warehousing, and e-commerce applications. The OLAP Services feature available in SQL Server version 7.0 is now called SQL Server 2000 Analysis Services. The term OLAP Services has been replaced with the term Analysis Services. Analysis Services also includes a new data mining component. The Repository component available in SQL Server version 7.0 is now called Microsoft SQL Server 2000 Meta Data Services. References to the component now use the term Meta Data Services. The term repository is used only in reference to the repository engine within Meta Data Services.

Relational Database Enhancements
Microsoft SQL Server 2000 introduces several server improvements and new features:
1. XML Support
2. Federated Database Servers

3. User-Defined Functions
4. Indexed Views
5. New Data Types
6. INSTEAD OF and AFTER Triggers
7. Cascading Referential Integrity Constraints
8. Full-Text Search Enhancements
9. Multiple Instances of SQL Server
10. Index Enhancements
11. Failover Clustering Enhancements
12. Net-Library Enhancements
13. 64-GB Memory Support
14. Distributed Query Enhancements
15. Updatable Distributed Partitioned Views
16. Kerberos and Security Delegation
17. Backup and Restore Enhancements
18. Scalability Enhancements for Utility Operations

Client Components
Clients do not access Microsoft SQL Server 2000 directly; instead, clients use applications written to access the data in SQL Server. SQL Server 2000 supports two main classes of applications:
Relational database applications that send Transact-SQL statements to the database engine; results are returned as relational result sets.
Internet applications that send either Transact-SQL statements or XPath queries to the database engine; results are returned as XML documents.

Relational Database APIs
SQL Server 2000 provides native support for two main classes of database APIs:

OLE DB: SQL Server 2000 includes a native OLE DB provider. The provider supports applications written using OLE DB, or other APIs that use OLE DB, such as ActiveX Data Objects (ADO). Through the native provider, SQL Server 2000 also supports objects or components using OLE DB, such as ActiveX, ADO, or Windows DNA applications.

ODBC: SQL Server 2000 includes a native ODBC driver. The driver supports applications or components written using ODBC, or other APIs using ODBC, such as DAO, RDO, and the Microsoft Foundation Classes (MFC) database classes.

Additional SQL Server API Support
SQL Server 2000 also supports:

DB-Library
Embedded SQL

Client Communications
The Microsoft OLE DB Provider for SQL Server 2000, the SQL Server 2000 ODBC driver, and DB-Library are each implemented as a DLL that communicates with SQL Server 2000 through a component called a client Net-Library.

MS DTC Service

The Microsoft Distributed Transaction Coordinator (MS DTC) is a transaction manager that allows client applications to include several different sources of data in one transaction. MS DTC coordinates committing the distributed transaction across all the servers enlisted in the transaction. An installation of Microsoft SQL Server can participate in a distributed transaction by:

Calling stored procedures on remote servers running SQL Server.

Automatically or explicitly promoting the local transaction to a distributed transaction and enlisting remote servers in the transaction.

Using Data Types
Objects that contain data have an associated data type that defines the kind of data (character, integer, binary, and so on) the object can contain. The following objects have data types:


Columns in tables and views.
Parameters in stored procedures.
Variables.
Transact-SQL functions that return one or more data values of a specific data type.
Stored procedures that have a return code, which always has an integer data type.

Assigning a data type to an object defines four attributes of the object: the kind of data it contains, the length or size of the stored value, and, for numeric types, the precision and scale of the number. Transact-SQL has these base data types:

bigint        binary     bit        char             cursor
datetime      decimal    float      image            int
money         nchar      ntext      nvarchar         real
smalldatetime smallint   smallmoney text             timestamp
tinyint       varbinary  varchar    uniqueidentifier

Constraints

Constraints allow you to define the way Microsoft SQL Server 2000 automatically enforces the integrity of a database. Constraints define rules regarding the values allowed in columns and are the standard mechanism for enforcing integrity.

Classes of Constraints
SQL Server 2000 supports five classes of constraints.

NOT NULL specifies that the column does not accept NULL values.
CHECK constraints enforce domain integrity by limiting the values that can be placed in a column.
UNIQUE constraints enforce the uniqueness of the values in a set of columns.
PRIMARY KEY constraints identify the column or set of columns whose values uniquely identify a row in a table.
FOREIGN KEY constraints identify the relationships between tables.

When a row referenced by FOREIGN KEY constraints is deleted, the following options control what happens to the referencing rows:
NO ACTION specifies that the deletion fails with an error.
CASCADE specifies that all the rows with foreign keys pointing to the deleted row are also deleted.
A sketch showing these constraint classes in a table definition follows.
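The sketch below shows all five constraint classes, plus an ON DELETE clause, in a single table definition executed from C#. The table and column names are illustrative, not the actual schema of the system, and the referenced vendor_master table is assumed to exist already.

using System.Data.SqlClient;

class CreateItemTable
{
    static void Main()
    {
        string ddl =
            "create table item_master (" +
            " item_code int not null primary key," +      // NOT NULL and PRIMARY KEY
            " item_name varchar(50) not null unique," +   // UNIQUE
            " unit_price money check (unit_price > 0)," + // CHECK
            " vendor_code int foreign key" +              // FOREIGN KEY with an
            "   references vendor_master(vendor_code)" +  // ON DELETE option
            "   on delete no action" +
            ")";

        using (SqlConnection con = new SqlConnection(
            "Data Source=.;Initial Catalog=PMS;Integrated Security=SSPI"))
        {
            con.Open();
            new SqlCommand(ddl, con).ExecuteNonQuery();
        }
    }
}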

SQL Server Enterprise Manager
SQL Server Enterprise Manager is the primary administrative tool for Microsoft SQL Server 2000 and provides a Microsoft Management Console (MMC) compliant user interface that allows users to:
Define groups of servers running SQL Server.
Register individual servers in a group.
Configure all SQL Server options for each registered server.
Create and administer all SQL Server databases, objects, logins, users, and permissions in each registered server.
Define and execute all SQL Server administrative tasks on each registered server.
Design and test SQL statements, batches, and scripts interactively by invoking SQL Query Analyzer.
Invoke the various wizards defined for SQL Server.

SQL Server Query Analyzer
SQL Server Query Analyzer is a graphical user interface for designing and testing Transact-SQL statements, batches, and scripts interactively. SQL Server Query Analyzer offers the following features:
A free-form text editor for keying in Transact-SQL statements.
Color coding of Transact-SQL syntax to improve the readability of complex statements.
Results presented in either a grid or a free-form text window.
A graphical diagram of the showplan information, showing the logical steps built into the execution plan of a Transact-SQL statement.
An index tuning wizard to analyze a Transact-SQL statement and the tables it references to see whether adding additional indexes will improve the performance of the query.


13. Glossary
ANALYSIS: Analysis means breaking a problem into successively manageable parts for individual study.

AUTOMATED SYSTEMS: These are man-made systems that interact with or are controlled by one or more computers.

CONVERSION: Conversion is the task of changing over from the old system to the new system.

DATABASE: A store of integrated data capable of being directly addressed for multiple uses; it is organized so that various files can be accessed through a single reference based on the relationship among records in the file rather than their physical location.

DATA COUPLING: A form of coupling in which one module communicates information to another in the form of elementary parameters.

DATA FLOW DIAGRAM: A graphic representation of data movement, processes, and files (data stores) used in support of an information system.

DATA INTEGRITY: The extent to which the data used for processing is reliable, accurate, and free from error.

DATA STRUCTURE: A logically related set of data that can be decomposed into lower-level data elements; a group of elements handled as a unit.

DOCUMENTATION: A means of communication; a written record of a phase of a project. It establishes design and performance criteria for phases of the project.

END USER: End-user is widely used by system analysts to refer to people who are not professional information system specialists but who use computers to perform their jobs.


CENTRAL FUNCTION: Central functions are the main work of the system; they transform the major inputs into the major outputs.

COHESION: Strength within a module; the degree of relationship between elements within a module.

CONSTANT DATA: This refers to data that are the same for every entry.

COUPLING: The strength of the relation between modules; the degree of dependence of one module on another. Specifically, a measure of the chance that a defect in one module will appear as a defect in the other, or the chance that a change to one module will necessitate a change to the other.

CONTEXT DIAGRAM: This is the most general diagram, really a bird's-eye view of data movement in the system.

CONTROL: In a system, the element or component that governs the pattern of activities of the system.

OVERHEAD: Allocated costs that include maintenance expenses and heat, light, and power costs that are neither direct nor indirect; costs that are tagged to the general administration of the business.

PARALLEL RUN: Putting the new system into operation in conjunction with continued operation of the old system.

PERT: Program Evaluation and Review Technique. A flow system model used to manipulate various values as a basis for determining the critical path.

PARADIGM: A very clear and typical example of something.


RELIABILITY: Dependability or level of confidence. In systems work, the need to gather dependable information for use in making decisions about the system being studied.

SECURITY: The protection of data or hardware against accidental or intentional damage from a defined threat.

UNIT TESTING: This involves the tests carried out on the modules and programs that make up the system.

VALIDATION: Checking the quality of software in both simulated and live environments.

EXTERNAL ENTITY: Organizations, other information systems, departments or people which represent a source or destination of transactions or data.

FEASIBILITY STUDY: An important outcome of the preliminary investigation is the determination that the system requested is feasible, which is done through the feasibility study.

FEEDBACK: The part of a closed-loop system that automatically brings back information about the condition being controlled.

FILE: A collection of related records organized for a particular purpose.

FLEXIBILITY: A measure of the degree to which a system, as is, can be used in a variety of ways.

GANTT CHART: A static system model used for scheduling; portrays output performance against time.

MODEL: A logical or mathematical representation of a system that encompasses features of interest to the user.

