
FACULTY OF INDUSTRIAL TECHNOLOGY

Department of Industrial and Manufacturing Engineering
FINAL YEAR PROJECT
DEVELOPMENT OF A MODULAR PRODUCT DESIGN SOFTWARE PROGRAM
WHICH AIDS IN PRODUCT DEVELOPMENT
CASE STUDY: ZIMPLOW LIMITED

By
CYNTHIA MATHE
STUDENT NUMBER

N008 1572H

SUPERVISOR

MR N. GWANGWAVA

Final year project submitted in partial fulfillment of the Bachelor of Engineering (Industrial & Manufacturing) Honors Degree, NUST
DEDICATION
I would like to dedicate this project to Mellisa and Mitchell who have always been there to support
me in the degree I am pursuing.

ACKNOWLEDGEMENTS
I would like to express my gratitude and appreciation to the Mealie Brand staff for the support and constructive criticism they gave when I was sharing ideas with them, and for their guidance throughout my stay there. For the foundation I have in engineering principles I thank the Industrial and Manufacturing Engineering lecturers for enlightening me. Finally, I thank the Almighty God.

ABSTRACT
The project serves to encourage the use of scientifically justifiable means of product development which are systematic and less rigorous in a research and development environment. In this project, the author makes use of a Visual Basic program together with an Oracle database to come up with an integrated tool which will aid in product development. Software modules are to be linked in such a way as to obtain customer requirements, analyse the data by means of an automated Quality Function Deployment (QFD), cluster parts according to dependencies, and give feedback in the form of a summarised report on the different stages of the analysis. This will be achieved through the use of an internet-based application to gather customers' views, for example through a questionnaire that can be filled in online and accessed from the company's in-house application system for further analysis. The main thrust is the incorporation of modular clustering techniques, for example Modular Function Deployment and design structure matrices.

TABLE OF CONTENTS
DEDICATION
ACKNOWLEDGEMENTS
ABSTRACT
TABLE OF CONTENTS
CHAPTER 1: INTRODUCTORY CHAPTER
1.0 INTRODUCTION
1.1 AIM
1.2 OBJECTIVES
1.3 DEFINITION OF CRITICAL TERMS
1.4 SCOPE
1.5 BACKGROUND
1.6 NEED JUSTIFICATION
1.7 METHODOLOGY
1.8 TIME SCALE
1.9 CONCLUSION

CHAPTER 1: INTRODUCTORY CHAPTER


1.0 Introduction
The project will look at coming up with software comprising modules which will tap customer requirements, analyse those requirements and interpret them technically, then finally cluster them according to multi-component relationships. This chapter serves to give a skeletal view of what should be accomplished in the project.

1.1 Aim
To design a software program that supports modular product design in a new product research
and development (R&D) environment

1.2 Objectives
- Design a database model for a research and development (R&D) enterprise with a focus on the modular product design approach.
- Implement the physical database model on a database management system (DBMS).
- Design the Graphical User Interface (GUI) of the program.
- Create a Visual Basic program featuring the following software modules:
  o Internet-based customer voice gathering and relationships
  o Quality Function Deployment (QFD) and analysis
  o Product module clustering using the Design Structure Matrix (DSM)
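As a rough illustration of the first two objectives, the following sketch shows what a fragment of the R&D database model might look like. All table and column names here are hypothetical, the final implementation targets Oracle with a Visual Basic front end, and Python's bundled SQLite is used only to keep the sketch self-contained:

```python
import sqlite3

# Hypothetical schema fragment for the R&D database model; the real
# system would be implemented on Oracle rather than SQLite.
schema = """
CREATE TABLE customer (
    customer_id   INTEGER PRIMARY KEY,
    name          TEXT NOT NULL
);
CREATE TABLE requirement (
    requirement_id INTEGER PRIMARY KEY,
    customer_id    INTEGER REFERENCES customer(customer_id),
    description    TEXT NOT NULL,
    importance     INTEGER CHECK (importance BETWEEN 1 AND 5)
);
CREATE TABLE module (
    module_id   INTEGER PRIMARY KEY,
    name        TEXT NOT NULL
);
CREATE TABLE component (
    component_id INTEGER PRIMARY KEY,
    module_id    INTEGER REFERENCES module(module_id),
    name         TEXT NOT NULL
);
"""

conn = sqlite3.connect(":memory:")
conn.executescript(schema)
tables = [r[0] for r in conn.execute(
    "SELECT name FROM sqlite_master WHERE type='table' ORDER BY name")]
print(tables)  # → ['component', 'customer', 'module', 'requirement']
```

The same four entities (customers, their requirements, product modules, and the components grouped into them) underpin the QFD and DSM analyses described later.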
1.3 Definition of critical terms
Software program - Written procedures or rules, and associated documentation, pertaining to the operation of a computer system and stored in read or write memory (Pressman, 2001).
Modular products -Refer to products, assemblies and components that fulfil various functions
through the combination of distinct building blocks (Pahl and Beitz, 1988).
Product development - Product Development is a creation of, innovation of, enhancing the
utility of or continuous improvement of earlier characteristics (like design, service, etc.) of an
existing product or developing an entirely new kind of product to satisfy the end-user's

requirements (Akrani, 2000).


Product architecture - Product architecture is the scheme by which the function of a product is allocated to physical components (Ulrich, 1995).

1.4 Scope
The project focuses on database modelling in connection with an application which will be placed online to interact with customers, whose responses will then be analysed in the company's in-house application system. Design of the corresponding Visual Basic program to carry out the in-house analysis will be done. The Design Structure Matrix will be the only clustering technique used in coming up with the modularisation tool. The software will be used by the Research and Development department in liaising with customers and forecasting any new designs, design changes, revisions or updates.

1.5 Background
Mealie Brand has held a monopoly in the manufacture of animal-drawn implements mainly targeted at subsistence farming. It distributes its products both to the local market and to the export market within Southern Africa. Although the needs of subsistence farmers have always been met by the standard plough, cultivator, harrows, etc., the company decided to introduce a Research and Development department to obtain new knowledge applicable to the company's business needs, which will eventually result in new or improved products, processes, systems, or services that can increase the company's sales and profits. Globally it has been apparent for at least a century that future economic progress will be driven by the invention and application of new technologies. R&D is one category of spending that develops and drives these new technologies, hence the need to concentrate on it.
An area that merits consideration is the use of various tools for organisation and planning of
R&D activities. The in-house Research and Development department at Mealie Brand is
fairly new, and currently there is no software to analyse newly obtained knowledge to aid in the product development process. There are no highly analytical prioritisation methods to map the way forward. Decisions on whether to execute, that is, to transfer results to operations, are based on a series of meetings with the heads of the departments concerned, and this takes a long time.

Traditionally, scientific and technological knowledge and skills concerning the product were
the prime source. Nowadays, managerial competencies and the ability to work for and with
clients and suppliers are becoming more important. To remain competitive, research and development departments should create the competencies that enable them to create value for their clients.

1.6 Need justification


Companies have to successfully implement strategies to design and develop products to
satisfy a wide variety of customer requirements. There is a need to promote commonality,
compatibility, standardization, or modularization among different products or product lines.
This modular product design program is a way of systematically coming up with the orientation of a product. A modular product means that development time is reduced, because once the design is split into modules, design teams can work in parallel on the different modules. The demand for better products is volatile and challenging to manage, and the rapid rate of innovation causes short product life cycles; the short production lead time associated with modular products therefore enhances a firm's ability to respond by introducing new products. With significantly shortened product life cycles, manufacturers have found that they can no longer capture market share and gain higher profits by producing large volumes of a standard product for a mass market.

In this era of globalisation and intense competition, the long-term health of companies is tied to their ability to innovate successfully and rapidly with the customer in mind; it is therefore worthwhile to have tools which aid in that. The solution for local companies lies in grasping globalisation and embracing it wholly, and in being innovative rather than seeking to optimise yesterday's poor solutions. Keeping the customer in mind calls for maintaining strong customer relations and, especially, knowing what customers want through such tools as QFD. The software has provisions to store pre-designed product modules, so that should a customer require a product consisting of existing modules, development consists of simply refining and assembling what is there, thus reducing the costs of development and production. Information can be analysed quickly and shared across departments through the software; furthermore, production managers are able to establish the costs inherent, or locked, in a particular design before committing any resources to it.

Mass customization is the new paradigm that replaces mass production, which is no longer suitable for today's turbulent markets, growing product variety, and opportunities for e-commerce. The emphasis is on fast and easy production to eliminate delays at any point. In designing products for mass customisation, one approach amongst others is modular customisation, whereby product modules are literally building blocks from which various combinations can be assembled to customise a product. The software program can help in coming up with these product building blocks.

1.7 Methodology
The research methodology will be as follows:
- A review of literature from relevant sites, various journals and conference proceedings will be conducted.
- Communication with customers via questionnaires on an internet-based application will be carried out to establish the current requirements on product properties, desired functions or features.
- A review of matrix representation techniques, for example the Design Structure Matrix and Modular Function Deployment, will be conducted.
- Modelling of the information application system structure will be done.

1.8 Time scale


The schedule of the project can be summarised by means of a Gantt chart running from October 2012 to April 2013, covering the following activities: Literature Review, Company Audit, Design, Analysis of Results, and Dissertation Documentation.

1.9 Conclusion
Systematic procedures need to be adhered to in pinpointing possible flaws and opportunities for improvement, hence the need for close scrutiny in continuous improvement, product design and product development. This chapter has given a summary of what the project is set to achieve.

CHAPTER TWO: LITERATURE REVIEW


2.0 Introduction
This chapter explores the facts and methodologies essential for a theoretical appreciation of product development. Much attention will be paid to product architecture and modularisation techniques, particularly the application of design structure matrix concepts in clustering. An analysis of Quality Function Deployment and of database design will also be done.
2.1 Product architecture
A product can be described by both its functional and physical elements. The functional
elements are the individual operations and transformations that contribute to the overall
performance of the product. The physical elements are the parts, components, and subassemblies that ultimately implement the product's functions (Ulrich and Eppinger, 2000). The assignment of the functional elements of a product to the physical building blocks of the product, and the definition of the interfaces between those building blocks, constitute the product architecture (Ulrich and Eppinger, 2000). The product architecture, as part of product complexity, is discussed by Ulrich and Tung, and four different types are identified:

- Modular design means that one function is allocated to one module.
- Function distribution means that one function is mapped to several modules. This distribution of a function over several modules results in an integrated design on that level.
- Function sharing means that several functions are allocated to one module. Again, function sharing increases the level of integration.
- Integrated design means that several functions are allocated to several modules. Functions are distributed and shared, thereby further increasing the level of integration.

Figure 1: Different types of product architectures


The area of interest is the modular architecture. A modular architecture, also called modular
design, allows a design change to be made to one module without generally requiring a
change to other modules for the product to function correctly. The opposite is an integrated
architecture. In this, functional elements are implemented by using more than one module; a
single module can also implement several functional elements. The interactions between the
modules are ill-defined and may be incidental to the primary function of the products.
Another part of the degree of modularity is the characteristics of the interfaces. A modular
architecture has well-defined, standardized, and decoupled interfaces. Two components are coupled if a change made to one component requires a change to the other component in order for the overall product to work correctly. An integrated product architecture has no restrictions on either the definition of interfaces or coupling.

2.2 Product modularisation


Product modularization is about grouping a number of components into modules, and
defining interfaces between modules. This should be done in such a way that design decisions
in one module are isolated from those in other modules. In a modularized product there is
loose coupling (Benassi,1998) between the modules; incidental interactions between a
module and the rest of the product have been minimized. If this is achieved, the product
modularization has the potential to give a number of benefits.
Modular product architectures are generated through the application of a pre-defined method. An approach to modularity includes the method by which the architecture is derived, but covers more than the method itself; the method proper is the way the data is captured and processed, which is a narrower concept. The figure below shows, at a very high level, what product architecture approaches have in common.

Figure 2: Architecture work


2.2.1 Model of reality

All approaches to modularity have to build a model of reality that captures the aspects of the product that have implications for the architecture (where the interfaces are required, for example). Although some of the details differ between approaches, there are similarities.
Step one
On the far left, we see an icon that tries to represent the view of reality: this may be a view of the requirements of the product. Whatever the source of data, team representatives from the Engineering or Marketing functions in the company gather data about the product, its usage, the customers, etc.
Step two
Sift through the data to determine what is important in the project. Some data will be
discarded at this point, because it is out of scope or does not fit into the representation of the
product, be it a matrix, a drawing, a flowchart etc. What is left may be a list of customer
requirements, product properties, desired functions or features, cost data for concept
selections.
Step three
Some choice is typically made about the way this data should be represented. One or several
representations may be available, including matrices, flowcharts, product sketches.
Step four
If the objective is to make predictions about the best possible modules, it usually becomes necessary to select some type of pre-defined representation. One option is a matrix representation, which has computer algorithm support for generating modules; another is a function structure diagram (a type of flowchart). When diagrams are used, the work may be conducted on paper, and module generation may be manual, using a set of pre-defined heuristic rules for what constitutes good modules. Computer algorithms operating on matrix representations include such methods as the Design Structure Matrix (DSM) and Modular Function Deployment (MFD).
Step five
Depending on the algorithm, the output may be a sorted matrix or a dendrogram, as shown in
the graphic. The computer-generated output is analyzed by the team members, and decisions
are made about modules. Typically, there are many iterations of changing data and re-sorting before the output is satisfactory. Once the output is deemed useful, it is documented in some form and goes to detailed design.

In this project step four is of paramount importance and thus an analysis of product
modularisation methods is necessary.

2.2.2 Product modularisation methods


A literature survey resulted in six different methods for modularization. The results of the analysis show that the modularization methods are useful for products with a simple product architecture, that is, where one function is allocated to one physical module. However, for higher degrees of product complexity, several functions allocated to several physical modules, or large physical variation of the variants, the methods seem insufficient.
Table 1: Modularisation methods found in the literature survey

The analysis, however, must consider what the core of the modularization method is. The selected methods will be evaluated in terms of three different phases:
- decomposition of the product,
- the integration phase, and
- the evaluation phase.

2.2.2.1 (DSM) Design Structure Matrix (Integration Analysis of Product Decompositions)
The method includes three steps: decomposition, documentation of interactions between the
elements, and finally clustering the elements into chunks.
1. The first step is decomposition of the overall product concept in terms of functional
and/or physical elements.

2. The next step involves documenting the interactions between elements. Pimmler and
Eppinger (1994) distinguish four types of interactions: spatial, energy, information,
and material. These interactions should be scored on a five-point scale (from −2 to +2) based on the relative need for each interaction. The interaction between two elements can thus be seen as a vector of four scores.
3. Clustering the elements into chunks is the final step in the method. The interactions
between the elements are documented in a DSM with each of the elements on each
axis. Then a clustering algorithm is used to reorder the rows and columns in the
matrix to cluster the positive elements closer to the diagonal. This results in a block
matrix in which the blocks on the diagonal correspond to the resulting architectural
clusters. There are several algorithms that can be used for clustering the elements, or
the clustering can be done with regard to the most important interaction in the design.
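The clustering in step 3 can be sketched as follows. This is a minimal illustration only: it groups components that are connected through non-zero interactions, a simple stand-in for the more sophisticated reordering algorithms discussed in section 2.3.1, the five-component DSM is hypothetical, and Python is used for brevity rather than the Visual Basic of the final program:

```python
import numpy as np

def cluster_dsm(dsm):
    """Group components of a symmetric DSM into clusters.

    Components land in the same cluster when they are connected through
    non-zero interactions (a simple stand-in for the clustering
    algorithms surveyed in section 2.3.1).
    """
    n = len(dsm)
    visited = [False] * n
    clusters = []
    for start in range(n):
        if visited[start]:
            continue
        stack, cluster = [start], []
        visited[start] = True
        while stack:
            i = stack.pop()
            cluster.append(i)
            for j in range(n):
                if not visited[j] and dsm[i][j] > 0:
                    visited[j] = True
                    stack.append(j)
        clusters.append(sorted(cluster))
    return clusters

# Hypothetical 5-component product: components 0-1-2 interact,
# components 3-4 interact, and the two groups are independent.
dsm = np.array([
    [0, 2, 1, 0, 0],
    [2, 0, 2, 0, 0],
    [1, 2, 0, 0, 0],
    [0, 0, 0, 0, 2],
    [0, 0, 0, 2, 0],
])
print(cluster_dsm(dsm))  # → [[0, 1, 2], [3, 4]]
```

The two returned groups correspond to the blocks that would appear on the diagonal of the reordered matrix.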
2.2.2.2 (AD) Axiomatic Design [Suh, 1990]
The axiom that is the basis for the method stipulates that functions in a product should be independent of each other. The AD method is rather simple in its design. The tool in the method is the design equation

{FR} = [DM]{DP}    (2.2.2.2.1)
Where
{FR} is a vector representing the functional requirements, which are derived from decomposition of the product's functions.
{DP} is a vector representing the design parameters, which are the physical realization of the {FR}.
[DM] is the design matrix that contains the product information needed by the designer.
The formation of the equation should be carried out using the following two axioms:
Axiom 1: The independence axiom
In an acceptable design, the DPs and the FRs are related in such a way that a specific DP can
be adjusted to satisfy its corresponding FR without affecting other functional requirements.
Axiom 2: The information axiom
The best design is a functionally uncoupled design that has the minimum information content.
The solution of the equation is the best design.
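The independence axiom can be illustrated by inspecting the structure of [DM]: a diagonal matrix is an uncoupled design, a triangular matrix is a decoupled design (acceptable if the DPs are adjusted in sequence), and anything else is coupled. A minimal sketch, in Python for brevity, with hypothetical two-FR, two-DP design matrices:

```python
import numpy as np

def classify_design(dm):
    """Classify a design matrix [DM] per Suh's independence axiom.

    Diagonal -> uncoupled (the ideal); triangular -> decoupled
    (acceptable if DPs are adjusted in order); otherwise coupled.
    """
    dm = np.asarray(dm, dtype=float)
    off_diag = dm - np.diag(np.diag(dm))
    if not off_diag.any():
        return "uncoupled"
    # Triangular: no entries strictly above OR strictly below the diagonal.
    if not np.triu(dm, 1).any() or not np.tril(dm, -1).any():
        return "decoupled"
    return "coupled"

print(classify_design([[1, 0], [0, 1]]))  # → uncoupled
print(classify_design([[1, 0], [1, 1]]))  # → decoupled
print(classify_design([[1, 1], [1, 1]]))  # → coupled
```

An uncoupled [DM] lets each DP be tuned to satisfy its FR independently, which is exactly what Axiom 1 demands.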
2.2.2.3 (MFD) Modular Function Deployment [Erixon, 1998]

This method contains five steps, and, in the description of these steps, several tools to be used
in the modularization process are given.
1. The first step, clarifying the customer's requirements, is realized by the use of Quality Function Deployment (QFD). However, the QFD matrix has been changed by the addition of a column for modularity on the design requirement axis. This is done in order to emphasize the aim of modularizing the product. Erixon [1998] further argues that this first step should be completed by a cross-functional team. The use of QFD yields the most important product properties, based on the customer's requirements.
2. The second step in the method is to create sub-functions and then find the technical solutions that fulfil these functions. Further, the best technical solution has to be chosen; this selection is done by using a selection matrix, e.g., that of Pugh [1981]. In that matrix all suggested technical solutions for a function are evaluated and rated, preferably by comparing each technical solution with the one used in a present product, to determine whether it is better than, equal to, or worse than that one.
3. Further, another matrix is used to support the concept generation. This matrix is called the Module Identification Matrix (MIM) and has the product's sub-functions on one axis and the module drivers on the other. The MIM matrix in this method is largely the same as the suitability matrix in the MPM method of Huang and Kusiak [1998]. Module drivers to be used in the matrix are the company-specific criteria for modularization. The method can be said to have a rather broad focus in the evaluation, in the sense that the criteria represent different phases along the entire product life cycle. Sub-functions with some, or a few similar, module drivers are to be considered the most appropriate to be integrated or grouped into a module. These can be found by following the rows, row by row, in the matrix. Using sub-functions instead of technical solutions is a way to move away from the existing form of the product.


4. Then there is a step in the method that aims at evaluating concepts, and in particular the module interfaces. To support the evaluation, the method assists the user with a set of different rules and formulas, e.g., the axiomatic method of Suh (1990), or the assortment complexity of Boothroyd and Dewhurst (1987). The aim of this evaluation is to examine whether the use of the method has resulted in a modularized product that decreases, for example, product development costs and time (Erixon, 1998).
5. The last step is to improve the modules on different levels, e.g., product range, product, and part. Erixon suggests, for this purpose, using tools such as Activity-Based Costing and Design-for-Assembly. Erixon argues that this method should not be considered as fully replacing the detailed development work that has to be done for each of the modules.
2.2.2.4 ( MPM) Modeling the Product Modularity [Huang and Kusiak, 1998]
The MPM method uses a different approach for creating the modular structure than the other
methods that have been presented so far. In this method, matrix algebra is used for structuring
both the interactions between the parts in the product, and the suitability of the interactions.
Suitability is determined on the basis of module drivers, or strategic reasons for components
to be parts of modules.
1. The first step in the method is to specify the upper limit on the number of components in a module, and then the interaction matrix is created with columns and rows corresponding to component numbers. An entry of 1 indicates that there is an interaction between two components, and a blank indicates no interaction. If a component exists in different designs, the number of designs is shown instead of the 1.
2. The next step is to use an algorithm for triangularization of the interaction matrix.
Then the suitability matrix should be rearranged so that the sequence of columns and
rows is the same as in the interaction matrix.
3. After that, the suitability matrix should be combined with the interaction matrix.
Using algebra, and the conditions and axioms stated in the method, gives the suitable
modules. The components that should be included in a module are found along the
diagonal in the matrix.
In this modularization method, the evaluation is included in the integration phase, the phase in which the selection of the modules is made on the basis of the suitability of the interaction between two components or modules.
2.2.2.5(MPD) Modular Product Development [Pahl and Beitz, 1996]
This method consists of six steps, and supports the user with guidelines about how each step
should be accomplished. It starts with clarifying the task for the product to be designed, and
ends with preparing production documents for the product.
1. The first step, clarifying the task, focuses on ascertaining what variety of tasks the
product should be able to perform.
2. The second step is an economic optimization of the modules; the economic consequences of adding variety should especially be highlighted. It is suggested that all common functions be used as a base part, with variety then created by functional add-ons to the base part.
3. Establishing function structures is the next step in the method, namely, dividing the required overall functions into smaller sub-functions. All functions must be subdivided into a minimum of similar and recurrent sub-functions, which should be logically and physically compatible with each other. All variants should be included in
the sub-functions and, based on their characteristics, divided into the following groups: basic, auxiliary, special, adaptive, and customer. The basic sub-functions are the ones that perform the product's essential functions.
4. Searching for solution principles and concept variants is the step in the method where the functional structure is realized into technical solutions. Pahl and Beitz [1996] stress that the use of similarity between energetic and physical working principles is preferable in the function modules. When all sub-functions are realized in solutions, the selecting and evaluating phase starts. If several concept variants have been found during the previous steps, each must now be evaluated with the help of technical and economic criteria, to ensure that the most favourable solution concept is selected.
5. Once the solution concept has been selected, the individual modules must be designed in accordance both with their functions and with the production requirements. This is done in a step called preparing dimensioned layouts.


6. Preparing production documents is the last step in this method. Here, production
documents are created and structured in such a way that execution of orders can be
based on simple combinations. Setting of part numbers also belongs to this step.
2.2.2.6 (FPD) Fractal Product Design [Kahmeyer, Warnecke, and Sheider, 1994]
Although nominally about fractal product design, this is a modularization method. According to Kahmeyer, Warnecke, and Sheider, a fractal is defined as an independent module with precisely defined functionality. To use the method is to work through five defined steps:
1. Start by deciding upon a relevant product range to analyze. Then both the product and the functional structures of the product range are analyzed. This product analysis is completed with a definition of representations for the two structures.
2. The next step is a conceptual design of the product fractals.
3. This is followed by conceptual design of the fractal interfaces. Kahmeyer, Warnecke, and Sheider describe these steps in their method as developing alternative fractal product structures, as well as alternative standardized interfaces for the product fractals.
4. The results of application of the method, the fractals and their interfaces, are then
assessed and validated by using criteria such as function, quality, manufacturing,
assembly, disassembly, and recycling.

5. Finally, the fractals are redesigned and optimized by using classical redesign tools such as Design for X and FMEA; QFD is also mentioned.

2.3 Modern Design Structure Matrix


The use of the design structure matrix, or dependency structure matrix (DSM), was described by Steward in the 1980s (Browning, 1998; Ulrich, 2000) for the analysis of the structure of a system's design. Browning noted the advantages of dependency matrix-based analysis: "Their utility in these applications stems from their ability to represent complex relationships between the components of a system in a compact, visual, and analytically advantageous format." Eppinger (1994) provides an overview of the basic DSM described by Steward to represent the dependence of tasks on one another.
The DSM analysis methodology is described below:
1. The interface DSM is developed to identify the dependence of components on one another. The entry in a square of the component DSM is based on values identified according to set ranks. The dependence between two components is assumed to be associative, and the resulting matrix is therefore symmetric about the diagonal.

Figure 3: Dependence matrix


2. The elements in the interface DSM are clustered. Following the creation of the initial DSM, the components are reordered in the matrix. The criterion used for regrouping is to place the squares representing high interface dependency as close to the diagonal as possible. Using this strategy, components that have a high degree of interdependence tend to form clusters. Since these DSMs were of moderate size, it was found acceptable to perform this activity by inspection using a spreadsheet.

Figure 4: Clustered dependence matrix


The first step shows the completion of component interface dependency identification. The second
matrix shows that in step 2, the components have been reordered and a set of three distinct clusters
can be seen:

- the brake subsystem,
- the power subsystem, and
- the monitor and control subsystem.
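The reordering in step 2 can be sketched as follows. The component names and dependency values are hypothetical stand-ins for those in Figures 3 and 4, and Python is used for brevity rather than the Visual Basic of the final program; applying the same permutation to rows and columns exposes the clusters as blocks on the diagonal:

```python
import numpy as np

# Hypothetical interface DSM for five components, entered in an
# arbitrary order that hides the clusters (symmetric, 0/1 dependencies).
labels = ["brake pedal", "battery", "brake hose", "wiring", "brake pump"]
dsm = np.array([
    [0, 0, 1, 0, 1],   # brake pedal
    [0, 0, 0, 1, 0],   # battery
    [1, 0, 0, 0, 1],   # brake hose
    [0, 1, 0, 0, 0],   # wiring
    [1, 0, 1, 0, 0],   # brake pump
])

# Reordering rows and columns with the same permutation moves the
# high-dependency squares toward the diagonal, exposing two blocks.
order = [0, 2, 4, 1, 3]               # brake parts first, power parts second
reordered = dsm[np.ix_(order, order)]
print([labels[i] for i in order])
print(reordered)                       # two diagonal blocks are now visible
```

In the reordered matrix, all dependencies fall inside a 3x3 brake block and a 2x2 power block, mirroring the subsystems listed above.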

2.3.1 Design Structure Matrix Clustering Algorithms

Several algorithms have been developed for the manipulation of DSMs.
2.3.1.1 Idicula-Gutierrez-Thebeau Algorithm (IGTA)
2.3.1.2 Fernandez clustering algorithm
In Fernandez's approach, each element is placed in an individual set, and bids are evaluated from all the other sets (clusters). If any cluster is able to make a bid that is better than the current base case, the element is moved inside that cluster. The objective function is therefore a trade-off between the costs of being inside a cluster and the overall system benefit.

2.3.1.3 Whitfield et al. genetic algorithms

Whitfield et al. used genetic algorithms to form product modules. Their algorithm was built upon the same concepts introduced by Fernandez and as such suffers from similar weaknesses.
2.4 Quality function deployment (QFD)
QFD is a customer-driven planning process to guide the design, manufacturing, and
marketing of goods. At the strategic level, QFD presents a challenge and an opportunity for
top management to break out of its traditional narrow focus on results, which can only be
measured after the fact, and to view the broader process of how results are obtained.
Under QFD, all operations of a company are driven by the voice of the customer, rather than by edicts of top management or the opinions or desires of design engineers. QFD benefits companies through improved communication and teamwork across all constituencies in the production process, such as between marketing, design, manufacturing, purchasing and suppliers. Product objectives are better understood and interpreted during the production process. Use of QFD determines the causes of customer dissatisfaction, making it a useful tool for competitive analysis of product quality by top management. Productivity as well as quality improvements generally follow QFD. More significantly, QFD reduces the time for new product development, and allows companies to simulate the effects of new design ideas and concepts (Gwangwava and Mhlanga, 2011).
QFD is a technique that translates customer requirements, expressed in the customer's language, into an action plan. While listening to customers has always been good business practice, QFD formalizes the somewhat arbitrary practice of just listening to, and then trying to meet, some of the customers' needs by creating a ranking of the most important customer requirements. It also ensures these get heard by all the functions within the firm who need to know them.
2.4.1 Objectives of using QFD in the project
To define the design and specifications for the product, meeting the highest possible level of customer requirements and satisfaction.
To ensure consistency between customer requirements and the product's measurable characteristics, such as dimensions and features of rooms and finish materials used in the construction work.
To ensure consistency between the design phase and the actual fabrication. QFD can minimize the problems that are usually detected in the interaction between the design and construction phases (including feasibility problems and rework).
To optimize the integration of customers' perceptions with variables that can affect the return on investment, such as construction cost, speed of sales, schedule and cash flow.
To reduce the time taken to deliver quality features throughout product development.
2.4.2 QFD matrices and functions
QFD is a series of interconnecting matrices often called the House of Quality because the completed matrices resemble a house. An excellent description of how to build such a house is given in Hauser and Clausing (1988). Each segment of the matrix is important in assessing:

customer requirements;
the actions over which a business has control; and
the relationship between these two.

A small team is formed to work on a new product opportunity or enhancement. A typical team consists of members from the marketing, engineering and production departments. In addition, customers are represented on the team. The team holds a series of discussions with customers to explore needs and priorities. From this information, solutions are identified to meet these needs.
Matrices convey information about the following issues:

Customer requirements: identified and documented in the customers' own language.
Relative importance of each customer requirement: not all customer requirements are of equal importance to the customer. Assigning a relative weight to each of them reflects the relative importance of the requirements.
Business parameters: the business parameters that might be used to satisfy the customer requirements are listed across the ceiling of the house of quality. The parameters are written in the language of the business.
Relationship matrix: relationships between the customer requirements and the business parameters are developed in the body of the matrix. The relationships are usually specified as strongly related, moderately related, weakly related or not related, and the matrix is developed using a symbol for each.
Computed ranking of business parameters: the results of fundamental computations integrating the information previously identified are presented in this area of the house of quality. The business parameters are ranked such that those with the highest score will have the greatest impact on the most important customer requirements and will address the greatest number of customer requirements.
Competitors' positions: in a competitive market, a realistic assessment of both the competition and your own capability is important.
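The computed ranking of business parameters can be sketched as an importance-weighted sum of the relationship scores. The 9/3/1 scoring for strong/moderate/weak relationships is a common QFD convention; the data below are hypothetical.

```python
# Sketch of the house-of-quality ranking computation (hypothetical data;
# the 9/3/1 scale for strong/moderate/weak is a common QFD convention).

def rank_parameters(importance, relationships):
    """importance: one weight per customer requirement.
    relationships: one row per requirement, one column per business
    parameter, each cell holding 9, 3, 1 or 0."""
    scores = [0] * len(relationships[0])
    for weight, row in zip(importance, relationships):
        for p, rel in enumerate(row):
            scores[p] += weight * rel
    total = sum(scores)
    # relative weight of each parameter, as a percentage
    relative = [round(100 * s / total, 1) for s in scores]
    return scores, relative

importance = [5, 3, 1]      # three customer requirements, weighted 5/3/1
relationships = [           # three requirements x two business parameters
    [9, 1],
    [3, 3],
    [0, 9],
]
scores, relative = rank_parameters(importance, relationships)
```

The parameter with the highest score is the one that most strongly serves the most important customer requirements, which is exactly the ranking described above.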

2.4.3 Developing the QFD matrix


The QFD matrix used in this case was based on the House of Quality, developed by Hauser and Clausing in 1988.
1. Implementing the perception of the focus group in the QFD matrix
In order to obtain the list of WHATs (customer requirements, Figure 5) that will be applied to the product design, the focus group is interviewed after receiving information about other products of a similar nature. The focus group can evaluate different aspects of the current design and compare it with the products of the competition. For this purpose the group can use drawings and basic information from other products (specifications, dimensions of components, finishing materials, etc.). After benchmarking the current product against the competition's products, it is possible to establish the degree of importance of the design solutions (Figure 5).
Obtaining information from customers
For this case, two techniques are used for gathering information on customer needs and desires for the product. The first is interviews with salespeople (retailers) who have a strong relationship with buyers and users. The second is the Focus Group approach, using mid-sized and small-sized groups and obtaining information through questionnaires and benchmarking between different projects in order to find out likes, dislikes, trends and opinions about similar current and other products.
The illustration is based on construction work, the house being the final product.

Degree: 1 2 3 4 5 (worst to best)


Figure 5: Customer Needs (WHATs), Prioritized Needs and Analyzed Competitive Benchmarking

2. Develop the technical requirements for meeting the buyers' and users' needs.

To reach this objective, a brainstorming session is held with the members of the design team. Brainstorming helps to determine the improvement levels and technical details for the design. All information developed in this phase is organized in the Technical Requirements Table (HOWs), as shown in Figure 6 below. The design team should consider the movement of target values for improvement or optimization of the design features.
By using QFD the design team can also evaluate some details of the layout and features during the development of the design. By using the roof of the QFD matrix (Figure 6), it is possible to examine the correlation among technical requirements. The roof of the House of Quality helps identify the interactions among the technical requirements and provides early recognition of positively and negatively correlated features within the technical solutions defined by the design team. The design team then determines the technical solution directions and assigns a relative importance and weight to these solutions.

Figure 6: Develop Technical Requirements (HOWs) and Correlation Matrix: Identify Technical Interactions
3. Determine Target Values
Determine target values (Figure 3) for the technical solutions agreed on.

Figure 3: Target Values


To complement the information for the target values definition, a technical analysis is developed as shown in the Technical Requirements: Relative Weight chart (Figure 7).

Figure 7: Technical Requirements: Relative Weight


After obtaining the final results of the importance weight and the relative weight of the technical requirements, it is possible for the design team to prioritize and implement the new layout solutions and new features in the specification and design of the apartment unit. In the new design, for example, it was necessary to increase or decrease some of the areas or shapes of compartments and to eliminate or add new specific solutions.
2.5 Database design
Many details and features are involved in the design of any database. Once gathered, these details and features, together with the purpose of the database, are formatted into a database structure using a predetermined database model.
2.5.1 Database
A database is a mechanism that is used to store information, or data. Information is something that we all use on a daily basis for a variety of reasons. With a database, users should be able to store data in an organized manner. Once the data is stored, it should be easy to retrieve information, and criteria can be used to retrieve it. The way the data is stored in the database determines how easy it is to search for information based on multiple criteria. Data should also be easy to add to the database, modify, and remove (Stephens and Plew, 2001).
There are many purposes and types of databases. Relational databases are among the most
common. For example, customer relationship management (CRM) databases, which manage
sales leads, customer records, management and billing, are relational databases.
A database is useful for automating as much work as possible to enhance manual processes. Some of the most common uses for a database include:

Tracking of long-term statistics and trends
Automating manual processes to eliminate paper shuffling
Managing different types of transactions performed by an individual or business
Maintaining historic information
2.5.2 Traditional methodology of design

Various methodologies might be used when designing a database. A design methodology is the thought process and steps taken during the design of a database system. Methodologies that are used during database design are driven mainly by the knowledge and experience of developers, as well as by the automated design (AD) products that are available.
Most methodologies used today stem from the traditional method. The three primary phases involved in the traditional method are as follows:
Requirements analysis
Data modelling
Normalization
1. Requirements analysis
During the requirement-analysis phase, research is conducted in order to capture all the
business needs as related to the proposed database system. Interviews are conducted by the
development team in order to gather the information that will be used to design the database
system. After all appropriate parties have been interviewed, the development team is
responsible for considering all the issues and requirements brought forward by the customer,
the end user, and perhaps management in order to begin formulating a basic model of the
actual processes involved in the daily operations of the business. The parties interviewed will
help the development team determine the categories of data, business processes, rules, and
other information important to begin modelling the system.
2. Data Modelling
The data modelling phase involves the creation of the logical data model that will be used to define the physical database structures, or the physical data model. During the first parts of the data modelling phase, processes and entities that were previously defined are defined in more detail. The basic ERD is detailed to include attributes within each entity. Attributes might be assigned different properties that dictate the specific type of data to be stored. Process models are used to determine how processes access entities within the organization. During logical modelling, relationships that were established between entities are refined if necessary, and business rules are integrated into the model.

Figure 8: Overview of database design


3. Normalisation phase
The normalization process is used to eliminate (or reduce as much as possible)
redundant data. During the normalization process, large tables with many columns are
divided, or split, into smaller tables with a smaller number of columns. The main benefit of
normalization is to promote overall data consistency between tables and data accuracy
through the reduction of redundant information that is stored. In essence, data only needs to
be modified in one place if an occurrence of the data is only stored one time.
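As a sketch, the table-splitting described above can be demonstrated with SQLite through Python's standard library (the table and column names below are hypothetical, chosen only to illustrate the idea):

```python
import sqlite3

# Illustrative normalization: a flat orders table repeats the customer's
# details on every row; splitting it into two tables removes the
# redundancy (all names here are hypothetical).
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE orders_flat (
        order_id INTEGER, customer_name TEXT, customer_city TEXT, item TEXT);
    INSERT INTO orders_flat VALUES
        (1, 'Moyo',  'Bulawayo', 'Plough'),
        (2, 'Moyo',  'Bulawayo', 'Harrow'),
        (3, 'Ncube', 'Harare',   'Plough');

    -- normalized: customer details are stored once, referenced by id
    CREATE TABLE customer (
        customer_id INTEGER PRIMARY KEY, name TEXT, city TEXT);
    CREATE TABLE orders (
        order_id INTEGER PRIMARY KEY,
        customer_id INTEGER REFERENCES customer(customer_id),
        item TEXT);
""")
# populate the normalized tables from the flat one
con.execute("""INSERT INTO customer (name, city)
               SELECT DISTINCT customer_name, customer_city
               FROM orders_flat""")
con.execute("""INSERT INTO orders
               SELECT f.order_id, c.customer_id, f.item
               FROM orders_flat f JOIN customer c
                 ON c.name = f.customer_name AND c.city = f.customer_city""")
# each customer now appears exactly once
rows = con.execute("SELECT COUNT(*) FROM customer").fetchone()[0]
```

After the split, a customer's city exists in one row only, so correcting it requires exactly one update, which is the consistency benefit the normalization process aims at.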
2.5.3 Database models
After commitment to the design effort for a database system, the database model to be used
must be established.
The following database models (types) are discussed in this section:

Flat-file database model
Hierarchical database model
Network database model
Relational database model
Object-oriented (OO) database model
Object-relational (OR) database model
2.5.3.1 Flat-file database model


Before vendors such as Oracle and Microsoft started developing database management
systems that run on a computer, many companies that were using computers stored their data
in flat files on a host computer. The use of flat files to store data was predominant in the
mainframe era. A flat-file database consists of one or more readable files, normally stored in a
text format. Information in these files is stored as fields, the fields having either a constant
length or a variable length separated by some character (delimiter).
Following is an overview of the drawbacks of a flat-file database:

Flat files do not promote a structure in which data can easily be related.
It is difficult to manage data effectively and to ensure accuracy.
It is usually necessary to store redundant data, which causes more work to accurately maintain the data.
The physical location of the data field within the file must be known.
A program must be developed to manage the data.

2.5.3.2 Hierarchical database model


A hierarchical database is a step above that of a flat-file database, mainly because of the
ability to establish and maintain relationships between groups of data. The architecture of a
hierarchical database is based on the concept of parent/child relationships. In a hierarchical
database, a root table, or parent table, resides at the top of the structure, which points to child
tables containing related data. The structure of a hierarchical database model appears as an
inverted tree, as shown in Figure 9

Figure 9: Hierarchical database model

Drawbacks of the hierarchical model:

Users must be very familiar with the database structure.
Redundant data is stored.

2.5.3.3 Network database model


Improvements were made to the hierarchical database model in order to derive the network
model. As in the hierarchical model, tables are related to one another. One of the main
advantages of the network model is the capability of parent tables to share relationships with
child tables. This means that a child table can have multiple parent tables. Additionally, a user
can access data by starting with any table in the structure, navigating either up or down in the
tree. The user is not required to access a root table first to get to child tables.
The relationship between tables in the network model is called a set structure, where one
table is the owner and another table is a member. This is the same basic concept as the
parent/child relationship discussed earlier. Set structures can represent a one-to-many
relationship between tables. Application programs that access the network database use set
structures to navigate to different parts of the database; therefore if a set structure is modified,
application programs that access the database must also be modified. Figure 10 illustrates set structures.

Figure 10: Set structures for the network database model


The drawbacks of the network database model are as follows:

The structure of the database is not easily modified.
Changes to the database structure definitely affect application programs that access the database.
The user has to understand the structure of the database.

2.5.3.4 Relational database

The relational database model is the most popular database model used today. Many
improvements have been made to prior database models that simplify data management, data
retrieval, and change propagation management. Data is easier to manage, mainly through the
use of integrity constraints. The retrieval of data is also a refined process, allowing the user to
visualize the database through relational table structures and to ask for specific data without a
detailed knowledge of the database layout. Changes are also easier to propagate, thanks to
features such as integrity constraints and the benefits that normalization (reduction of data
redundancy) provides.

Figure 11: Relational database model


2.5.3.4.1 Types of relational databases
There are two types of relational databases, each of which is associated with particular uses; the type used depends on the required uses of the data. The two types are the Online Transactional Processing database and the Online Analytical Processing database.
1. Online Transactional Processing (OLTP)
A transactional, or OLTP, database is one that is used to process data on a regular basis. A good example of a transactional database is one for class scheduling and student registrations. Say that a university offers a couple of hundred classes. Each class has at least one professor and can have anywhere between 10 and 300 students. Students are continually registering for and dropping classes. Classes are added, removed, modified and scheduled. All of this data is dynamic and requires a great deal of input from the end user. Imagine the paperwork involved and the staff required in this situation without the use of a database.
2. Online Analytical Processing (OLAP)
An OLAP database is one whose main purpose is to supply end users with data in response to queries that are submitted. Typically, the only transactional activity that occurs in an OLAP

database concerns bulk data loads. OLAP data is used to make intelligent business decisions
based on summarized data, company performance data, and trends. The two main types of
OLAP databases are Decision Support Systems (DSS) and Data Warehouses. Both types of
databases are normally fed from one or more OLTP databases, and are used to make decisions
about the operations of an organization. A data warehouse differs from a DSS in that it
contains massive volumes of data collected from all parts of an organization; hence the name
warehouse. Data warehouses must be specially designed to accommodate the large amounts
of data storage required and enable acceptable performance during data
retrievals. Historic information can be maintained. Historic data is usually related to and
often a part of a transactional database. Historic data may also be a significant part of an
OLAP database. For companies that desire to keep data for years, it is usually not necessary
to store all data online. Doing so will increase the overall amount of data, which means that
more information will have to be read when retrieving and modifying information. Historic
information is typically stored offline, perhaps on a dedicated server, disk drive, or tape
device. For example, in the infrequent event that a user needs to access corporate data from
three years ago, the data can be restored from tape long enough for the appropriate data to be
retrieved and used.
2.5.3.4.2 Relational Database Objects
Various types of objects can be found in a relational database. Some of the most common objects found in a relational database include:

Table: a table is the primary object used to store data in a relational database. When data is queried and accessed for modification, it is usually found in a table. A table is defined by columns. One occurrence of all columns in a table is called a row of data.
View: a view is a virtual table, in that it looks like and acts like a table. A view is defined based on the structure and data of a table. A view can be queried and sometimes updated.
Constraint: a constraint is an object used to place rules on data. Constraints are used to control the allowed data in a column. Constraints are created at the column level and are also used to enforce referential integrity (parent and child table relationships).
Index: an index is an object that is used to speed the process of data retrieval on a table. For example, an index might be created on a customer's name if users tend to search for customers by name. The customer names would be stored alphabetically in the index. The rows in the index would point to the corresponding rows in the table, much like an index in a book points to a particular page.
Trigger: a trigger is a stored unit of programming code in the database that is fired based on an event that occurs in the database. When a trigger is fired, data might be modified based on other data that is accessed or modified. Triggers are useful for maintaining redundant data.
Procedure: a procedure is a program that is stored in the database. A procedure is executed at the database level. Procedures are typically used to manage data and for batch processing.

The first four objects deal with the definition of the database, whereas the last two deal with methods for accessing database objects. Objects in a relational database provide users with a logical representation of data, such that the physical location of the data is immaterial to the user.
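Most of these objects can be sketched in SQLite syntax from Python's standard library (the table and column names are hypothetical; stored procedures are omitted because SQLite does not support them):

```python
import sqlite3

# Sketch of common relational objects in SQLite syntax
# (all names here are hypothetical).
con = sqlite3.connect(":memory:")
con.executescript("""
    -- table with a constraint controlling the allowed data
    CREATE TABLE customer (
        customer_id INTEGER PRIMARY KEY,
        name TEXT NOT NULL,
        balance REAL CHECK (balance >= 0));

    -- index to speed retrieval of customers by name
    CREATE INDEX idx_customer_name ON customer(name);

    -- view: a virtual table defined over the base table
    CREATE VIEW rich_customer AS
        SELECT name, balance FROM customer WHERE balance > 100;

    -- trigger: fired by an event (here, an insert) in the database
    CREATE TABLE audit (message TEXT);
    CREATE TRIGGER trg_new_customer AFTER INSERT ON customer
    BEGIN
        INSERT INTO audit VALUES ('added ' || NEW.name);
    END;
""")
con.execute("INSERT INTO customer (name, balance) VALUES ('Moyo', 250)")
con.execute("INSERT INTO customer (name, balance) VALUES ('Ncube', 50)")
rich = [r[0] for r in con.execute("SELECT name FROM rich_customer")]
audit = [r[0] for r in con.execute("SELECT message FROM audit")]
```

Querying the view returns only the rows satisfying its definition, and the trigger has recorded both inserts in the audit table without any extra application code.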
Drawbacks of the relational database model are as follows:

Different groups of information, or tables, must in many cases be joined to retrieve data.
Users must be familiar with the relationships between tables.
Users must learn SQL.

2.5.3.5 Object-Oriented (OO) Database Model


During the last few years, object-oriented programming has become popular with languages
such as C++, Visual Basic, and Java. An OO programming language allows the programmer
to work with objects to define an application that interacts with a relational database (since
most companies now use the relational database model). For example, elements within a
program or database application are visually represented as objects. These objects have
assigned properties, which can be modified, and can also be inherited from other objects.
Related types of objects are assigned various properties that can be adjusted to define the
particular object and determine how the object will act. With these OO programming tools,
applications are now easier to develop and maintain. Many mundane programming tasks can
be automated by an OO programming tool, thus reducing the amount of time it takes to
develop an application, increasing overall productivity. A problem with the relational
database as OO programming technology advances is that developers must understand both
the relational database language (SQL) as well as the OO programming language (Java, for
example) that is to be used to design the application. It is important for developers to

understand relational database concepts in order for the application to access the data. It can
be confusing for the developer to switch modes of thinking between relational and OO.

Figure 12: Object oriented database model

Drawbacks of the object-oriented model are as follows:

Users must learn OO concepts, because the OO database does not work with traditional programming methods.
Standards have not been completely established for the evolving database model.
Stability is a concern because OO databases have not been around for long.

2.5.3.6 Object-Relational (OR) Database Model


Although some rough seams exist between the object-oriented and relational models, the
object-relational model was developed with the objective of combining the concepts of the
relational database model with object-oriented programming style. The OR model is
supposed to represent the best of both worlds (relational and OO), although the OR model is
still early in development. Vendors are implementing OR concepts into their relational
databases, and the International Organization for Standardization (ISO) has integrated OR concepts into the new SQL standard, referred to as SQL3 (also known as SQL99).
Drawbacks of the object-relational model are as follows:

The user must understand both object-oriented and relational concepts.
Some vendors that have implemented OR concepts do not support object inheritance.

2.5.4 Database management system (DBMS)


A database management system (DBMS) is a complex set of software programs that controls the organization, storage, management, and retrieval of data in a database. DBMSs are categorized according to their data structures or types; a DBMS is also sometimes known as a database manager. It is a set of prewritten programs that are used to store, update and retrieve a database (Okereke, 2002).
Vendors design DBMSs based on a particular database model. A DBMS should have the following characteristics:

Data is stored on a hardware device and should persist after being accessed; this is referred to as data persistence.
Access methods include the creation of new data, modification of existing data, and deletion of data.
Multiple users should be allowed to access data simultaneously, called concurrency.
Transactions are managed, allowing for the manipulation of data and the ability to save a group of work.
A query language should be available to retrieve data based on criteria supplied by the user.
Data should be recoverable from a failure. If data is lost, the DBMS should have the capability to recover the data to any given state.
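Transaction management and recovery can be sketched with SQLite: a group of work that fails part-way is rolled back, leaving the data as it was (the account names below are hypothetical):

```python
import sqlite3

# Illustrative transaction management: a failed group of work is
# rolled back, leaving the data recoverable (names are hypothetical).
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE account (name TEXT PRIMARY KEY, balance REAL)")
con.execute("INSERT INTO account VALUES ('savings', 100), ('checking', 20)")
con.commit()

try:
    with con:  # one transaction: both updates succeed or neither does
        con.execute(
            "UPDATE account SET balance = balance - 50 "
            "WHERE name = 'savings'")
        # simulate a failure before the matching credit is applied
        raise RuntimeError("crash mid-transaction")
except RuntimeError:
    pass  # sqlite3 rolled the partial debit back on the exception

saved = con.execute(
    "SELECT balance FROM account WHERE name = 'savings'").fetchone()[0]
# savings is unchanged: the half-finished transaction was undone
```

The `with con:` block commits on normal exit and rolls back on an exception, so the debit never becomes visible without its matching credit, which is the atomicity a DBMS is expected to provide.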

2.5.5 DBMS Benefits

Improved strategic use of corporate data
Reduced complexity of the organization's information systems environment
Reduced data redundancy and inconsistency
Enhanced data integrity
Application-data independence
Improved security
Reduced application development and maintenance costs
Improved flexibility of information systems
Increased access and availability of data and information
Logical and physical data independence
Avoidance of concurrent-access anomalies
Support for transaction atomicity
Central control of the system through the DBA

An example of a database management approach in a banking information system is shown below.

Figure 13: Database management approach in banking

Note how the savings, checking, and instalment loan programs use a database management
system to share a customer database. Note also that the DBMS allows a user to make a direct,
ad hoc interrogation of the database without using application programs.
2.5.6 Database environments
A database environment is a habitat, if you will, in which the database for a business resides.
Within this database environment, users have means for accessing the data. Users might come
from within the database environment, or might originate from outside the environment.
Users perform all different types of tasks, and their needs vary as they are mining for data,
modifying data, or attempting to create new data. Also within the environment, certain users
might be either physically or logically restrained from accessing the data.

1. Mainframe environment
The traditional environment for earlier database systems was the mainframe environment.
The mainframe environment consisted mainly of a powerful mainframe computer that
allowed multiple user connections. Multiple dumb terminals are networked to the mainframe
computer, allowing the user to communicate with the mainframe. The terminals are basically extensions of the mainframe; they are not independent computers. The term "dumb terminal" implies that these terminals do no thinking of their own; they rely on the mainframe computer to perform all processing.
2. Client/Server Environment
The client/server environment involves a main computer, called a server, and one or more
personal computers that are networked to the server. The database resides on the server, a
separate entity from the personal computer. Each user who requires access to the database on
the server should have their own PC. Because the PC is a separate computer system, an
application is developed and installed on the PC through which the user can access the
database on the server. The application on the client passes requests for data or transactions
over the network directly to the database on the host server. Information is passed over the
network to the database using open database connectivity (ODBC) or other vendor specific
networking software. One of the problems in the client/server environment is that when a
new version of the application is developed, the application must be reinstalled and
reconfigured on each client machine, which can be quite tedious and very time-consuming.

Figure 14: Client/server environment

3. Internet Computing Environment


Internet computing is very similar to client/server computing. As with the client/server
environment, a server, a network, and one or more PCs are involved. Internet computing is
unique because of its reliance on the Internet. In a client/server environment, a user might be
restricted to access systems that are within the corporate intranet. In many cases, client
machines can still access databases outside of the corporate intranet, but additional
customized software might be involved. One aspect of Internet computing that makes it so
powerful is the transparency of the application to the end user. In the Internet computing
environment, the application need only be installed on one server, called a Web server. A user
must have an Internet connection and a supported Web browser installed on the PC. The Web
browser is used to connect to the destination URL of the Web server. The Web server, in turn,
accesses the database in a fashion supported by the application, and returns the requested
information to the user's Web browser. The results are displayed on the user's PC by the Web browser. End-user application setup and maintenance is simplified in the Internet computing environment because there is nothing to install, configure, or maintain on the user's PC. The
application need only be installed, configured, and modified on the Web server, reducing the
risk of inconsistent configurations and incompatible versions of software between client and
server machines. When changes are made to the application, changes are made in one
location; for example, on the Web server.

Figure 15: Internet computing environment

CHAPTER THREE: COMPANY AUDIT AND RESEARCH METHODS


3.0 Introduction
In this chapter, the author gives a detailed audit of the operations at Zimplow Ltd, particularly the research and development department. This information is intended for use in coming up with a questionnaire which reflects the modular product design approach.

3.1 Company background


Zimplow Limited is a group of companies with three divisions namely Mealie Brand, CT
Bolts and Tussburg. The company headquarters are at Mealie brand in Bulawayo. Zimplow
(Mealie Brand) was originally established in Bulawayo, Zimbabwe in 1939 by
Czechoslovakian refugees. Zimplow commenced business in that year as the Rhodesian
Plough and Machinery Company Limited (Rhoplow). Rhoplow as it was named then became
a public company in 1947/48. It was formed to ensure that animal drawn implement supplies
to small scale farmers were unaffected by the Second World War. Up to that time, all
agricultural implements were imported from Europe. Rhoplow was then sold to Rothmans at
Independence in 1980 and became Zimplow.
The manufacturing plant is situated on a six-hectare site in Bulawayo, Zimbabwe's second largest city. Zimplow has grown from a small manufacturing base into Southern Africa's largest producer of animal drawn traction implements. The company employs more
than 300 people and markets its products throughout the region through experienced and
professional agents.
The following have all contributed to Zimplow's growth:
The merger with Bulawayo Steel Products.
The use of Japanese manufacturing procedures and cellular manufacturing.
Its acquisition of C.T Bolts and Tassburg.
A strong continual improvement focus by the management team.
An excellent team which is made up of outstanding performers.
The attainment and maintenance of ISO 9002 Certification in May 1999 and the
subsequent attainment of ISO 9001: 2008 in June 2010.

3.2 Company Mission and Vision


3.2.1Mission Statement
To uplift the livelihood of our communities by availing quality, affordable and reliable steel
products on time every time to the mining, farming, construction and manufacturing sectors
through workmanship that is conscious of its contribution to employee development, society and
environmental sustainability.

3.2.2 Vision
To have a local presence in all African countries south of the Sahara by 2020 (36 by 2020).

3.2.3 Focus Goal


To increase our dominance in the existing local and export markets and penetrate the West
African market.

3.3 Products and Markets


Mealie Brand products are predominantly marketed in sub-Saharan Africa. The company
actively markets its products in South Africa, Lesotho, Swaziland, Namibia, Zambia,
Tanzania, the DRC, Mozambique and East Africa. The major competitors are China and India.
Currently, exports account for 56% of the company's turnover in volume terms. Some 51% of
the company's raw materials are locally produced, while 49% are imported. However, the
company enjoys the largest share of the local market, with a number of retailers
distributing its products throughout the country; these include Farm & City, P.M
Manufacturing, various Indian shops and Mohamed Mussa (Harare).

3.4 Company departments


The company strives to achieve its set goals through the interaction of various departments
working with unified effort. These include Accounts, Sales, Human Resources, Engineering,
Production, and Stores and Purchasing, among others. For the purpose of this project,
attention will be paid to the few departments (particularly Production) regarded as
relevant by the author.

3.4.1 Sales/Marketing department
This department is responsible for marketing the company's products through various
strategies of demand planning and demand management. Demand planning is done through
customer-awareness programmes in the form of field days in targeted areas. On the field
days, the use of the company's different products is demonstrated to customers, helping
them gain extensive knowledge about the products. Demand management, on the other hand,
is achieved through promotions, advertising and special discounts under stated conditions.
The company markets its products in both local and export markets, with Zambia being one
of the best customers on the export market.
It is this department that drives the production system, in the sense that it analyses
sales trends and sets demand forecasts that are then used as a guideline by the Production
department. It also issues Production with daily sales updates, pending orders and
stock-on-hand updates for planning purposes. Such information can also be used by the
procurement department as an input to material requirements planning (MRP).
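The chain from sales history to a demand forecast can be illustrated with a simple moving-average calculation, one of the most basic forecasting methods a sales department might apply before handing figures to production planning or MRP. The sketch below is purely illustrative and is not part of Zimplow's systems or of the Visual Basic tool developed in this project; the function name and sales figures are hypothetical, and Python is used here only for clarity.

```python
# Illustrative sketch only: a simple moving-average demand forecast
# of the kind that could turn a monthly sales history into a planning
# figure for the Production department. All names and numbers are
# hypothetical examples, not company data.

def moving_average_forecast(monthly_sales, window=3):
    """Forecast next month's demand as the mean of the last `window` months."""
    if len(monthly_sales) < window:
        raise ValueError("need at least `window` months of sales history")
    recent = monthly_sales[-window:]
    return sum(recent) / window

# Hypothetical monthly sales of one implement, in units
sales = [120, 135, 128, 140, 150, 145]
forecast = moving_average_forecast(sales, window=3)
print(forecast)  # mean of the last three months: (140 + 150 + 145) / 3
```

In practice a planner would also weight recent months more heavily or adjust for seasonality (field-day promotions, planting seasons), but the principle is the same: historical sales drive the forecast, and the forecast drives production and material requirements.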