

Adaptive thinking
- improving carefully selected ideas by slight modification.

Metastable state
- properties are not only a function of the product's current condition, but also of its processing history.

Derjaguin–Landau–Verwey–Overbeek (DLVO) theory
- explains why the ocean's salinity causes rivers to form deltas.

The following are included in rheology and mixing:
- Newtonian flow
- Shear thinning
- High viscosity products

Ingredient improvements
-It is often useful for ingredients to have properties that are strong functions of temperature or
pH, because this allows the product’s activity to be triggered.

Micelle structures minimize the chemical potential of soap.

Creaminess – it seems close to the geometric average of thickness and smoothness.


 Three main reasons why Chemists and Engineers dislike Lawyers:
o Money
o Laws
o Truth

Patents and Trade Secrets

 The intellectual property generated in product design is conveniently split into patents and trade
secrets.
 Patent - a contract between the inventor and the government.
 If the inventor convinces the government that the invention is new, then the government gives
the inventor exclusive rights to the invention for a considerable time.
 Patents are valuable because they grant a period when the inventor can earn higher profits and
hence more easily recover development costs.
 On average, the first product of its type to reach the market captures about two thirds of the
sales. The price earned during this original period can be considerably higher when the product
is protected by a patent.
 Patents can be international.
 Trade secrets - nonpublic information used in manufacturing a product.
- may be a special catalyst or important steps in activating the catalyst.
- may not contain any new information, just particular information.
- an example is the PIN on a money card.
- not legal property.
 Two forms of vulnerability of products with trade secrets:
o When an employee changes jobs.
o When one of our competitors independently discovers the secrets and patents it.
 In making the decision as to whether to seek a patent or keep a trade secret, some companies
have tried a third way.
1. They present the invention as a poster at some minor technical meeting that they expect will be
poorly attended.
2. The company then keeps a careful, notarized record of what was in the poster,
including the trade secrets.
3. The chances that the secret will be noticed by a competitor are remote.
4. If a competitor does in the future discover and patent it, the original discoverer can
then refer back to the poster.
5. The earlier disclosure invalidates the patent and avoids license fees.

What can be patented?

 The United States recognizes three kinds of patents:


o Utility patents
o Design patents
o Living plant patents
Requirements for Patents

 Useful
 Novel
 Nonobvious

 If the inventor has disclosed the invention in a printed publication, he has 1 year from the date of
disclosure to apply for a patent within the United States.
 Nonobvious means that the differences between the new product and earlier products must be
sufficient that they are not obvious to one having ordinary skill in the art to which the invention
pertains.
 Conditions set by US Supreme Court to consider an invention nonobvious:
o The scope and content of the prior art
o The differences between the prior art and the claimed invention
o The level of ordinary skill in the field of the invention
 US patent laws divide the inventive process into two steps: conception and reduction to practice.
 Conception – formation by the inventor of a definite idea of the complete invention.
 Actual reduction to practice requires construction of a device or preparation of a composition. The
inventor must demonstrate that the invention fulfills its intended purpose; this actual reduction
to practice often means obtaining missing information by experiment.
 If a patent is infringed, the patent holder is entitled to the payment of a reasonable license fee. If
it can be shown that the infringement was willful, damages of two to three times this amount may
be awarded.
 Crucial points to remember about intellectual property law:
o It is complex.
o Intellectual property law has little to do with justice.

 Going backward from the target molecule to simple precursors is exactly what Stuart Warren's
"disconnection approach" to organic synthesis is designed to achieve.

Utility patents
- most common and important for chemical products.
- granted for any useful, new, and nonobvious composition of matter, article of manufacture, or process.
- most are complex documents.
- have a term of 20 years from the filing date of the earliest US patent application from which the patent claims priority.
Design patents
- involve ornamental features of an article of manufacture.
- granted for 14 years for any new, original, and nonobvious ornamental design for an article of manufacture.
- simple and inexpensive.


FINAL SPECIFICATION
Strategy for final specification:
1. Define the product structure
2. Rank the product’s most important attributes
3. Review any chemical triggers

Central Product Attributes


1. Structural Attributes
- Product’s physical properties (strength, elasticity, etc.)
- More important for devices than for chemical products
2. Equilibrium Changes
- Chemical products will show major changes in equilibrium as a consequence of altered
temperature, pH, etc.
3. Key Rate Processes
- Rate of any chemical reaction
- Rates of heat transfer, fluid flow, or diffusion

Newtonian flow – does not often occur in microstructured products.
Non-Newtonian flow – the behavior of many microstructured products, such as paint and many cleaning
fluids.
- depends upon the colloid chemistry of the particular product.
High viscosity products – use laminar flow for mixing in commercial quantities.
Low viscosity products – use turbulent flow for mixing in commercial quantities.
Risk, Reliability, and Safety
Hazard: the potential for human, property, or environmental damage. Ex.: a cracked steering
linkage or fuel line, or a loose step.

Unsafe condition: another term for hazard. It is a condition which if not corrected can reasonably
be expected to result in failure and/or injury. It is not a function of probability but rather of the
consequence of the causal factors.

Risk: the likelihood, expressed either as a probability or as a frequency, of a hazard's
materializing. Risk is part of our individual experience and that of society as a whole.
The list of risks in our highly complex technological society is endless.

Reliability: a measure of the capability of a part or a system to operate without failure in the service
environment. It is always expressed as a probability; e.g., a reliability of 0.999 implies a
probability of failure of 1 part in every 1000.

Safety: is relative protection from exposure to hazards. A thing is safe if its risks are judged to be
acceptable.

FASTENER QUALITY ACT: enacted in response to failures in defense equipment caused by
companies selling substandard fasteners with phony inspection certificates. The resulting
regulation is 200 pages long, and is reported to cost the automotive industry $300 million
per year in compliance.

STANDARDS: ensure that society receives a minimum level of safety and performance.
They are a set of rules that tell what must be done in particular situations.

EPA Standard AP-50: sets the maximum annual average concentration of sulfur dioxide at 80
microgram/m3 and the maximum 24-h average at 365 microgram/m3.

 Mandatory standards: issued by the governmental agencies, and violations are treated
like criminal acts for which fines and/or imprisonment may be imposed
 Voluntary standards: prepared by a committee of interested parties (industry suppliers
and users, government, and the general public), usually under the sponsorship of a technical
society or a trade association.
 Design (specification) standards: specify the acceptable levels of technical details, such
as minimum flow rate and minimum yield strength.
 Performance standards: specify the minimum performance characteristics without
specifying the individual technical details. It can include design specification as examples
of current state-of-the-art methods of meeting the performance criteria
Benefit-cost analysis: the most common approach in risk assessment. A difficulty is that
benefits are counted in lives saved while costs are in monetary terms.

Willingness to pay: method in which people are asked directly what they would be willing to pay
to avoid danger or harm.

Conventional engineering design: disregards the fact that materials properties, the dimensions of
the components, and the externally applied loads are stochastic in nature.

TYPICAL APPROACHES FOR INCORPORATING PROBABILISTIC EFFECTS IN DESIGN
 Use of a factor of safety
 Use of the absolute worst case design
 Use of probability in design

Time period of constant failure rate: a period in which failures can be considered to occur at
random from random overloads or random flaws. These failures follow no predictable pattern.

Wearout period of accelerating failure rate: period during which materials and components
begin to age and wear rapidly.

Mean life: the average life of the N0 components put on test or in service, measured over the entire
life curve out to wearout.

Mean time to failure (MTTF): the sum of the survival time (up time) for all of the components
divided by the number of failures. It is applied to any period in the life of the component.

Mean time between failures (MTBF): the mean time between two successive component
failures. MTBF is similar to MTTF, but is applied for components or systems that are repaired.
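
As an illustration of the MTTF definition above, here is a minimal sketch; the uptime figures and function name are hypothetical:

```python
# Minimal sketch of the MTTF definition: total survival (up) time across
# all components divided by the number of failures. Values are hypothetical.
def mean_time_to_failure(up_times_h, n_failures):
    return sum(up_times_h) / n_failures

# Five components accumulate uptime; three of them failed during the period.
print(mean_time_to_failure([1200, 950, 1400, 800, 1100], 3))  # ~1816.7 h
```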

Redundant system: a system in which the components are arranged to give parallel reliability; that is
to say, there is more than one mechanism for the system functions to be carried out.
Full Active Redundancy System: all but one component may fail before the system fails.

Partial Active Redundancy System: certain components can fail without causing system failure
but more than one component must remain operating to keep the system operating.
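
A minimal sketch of these two redundancy cases, assuming independent components with known reliabilities (the numeric values are hypothetical):

```python
import math

def full_active_redundancy(reliabilities):
    # System survives if at least one component survives: R = 1 - prod(1 - Ri)
    q = 1.0
    for r in reliabilities:
        q *= (1.0 - r)
    return 1.0 - q

def partial_active_redundancy(r, n, k):
    # k-out-of-n identical, independent components must keep operating
    return sum(math.comb(n, i) * r**i * (1 - r)**(n - i) for i in range(k, n + 1))

print(full_active_redundancy([0.9, 0.9, 0.9]))  # 0.999
print(partial_active_redundancy(0.9, 3, 2))     # 2-out-of-3 system: 0.972
```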

DESIGN FOR RELIABILITY

 Fail-Safe Approach
To identify the weak spot in the system or component.
 One-Horse Shay
To design all components to have equal life so that the system will fall apart at the
end of its useful lifetime just as the legendary one-horse shay did.
 Absolute Worst-Case Approach
It is a very conservative approach and it often leads to overdesign.

CAUSES OF UNRELIABILITY
1. Design Mistakes.
a. Incomplete information on loads and environmental conditions
b. Erroneous calculations
c. Poor selection of materials
2. Manufacturing Defects.
a. Poor surface finish or sharp edges that lead to fatigue cracks
b. Decarburization or quench cracks
c. Manufacturing errors caused by the production work force
3. Maintenance.
4. Exceeding design limits.
5. Environmental Factors.

MINIMIZING FAILURE
1. Margin of Safety
Variability in strength has a major impact on the probability of failure, so that
failure can be reduced with no change in the mean value if the variability of the
strength can be reduced.
2. Derating
The reliability of equipment is increased if the maximum operating conditions are
derated below their nameplate values.
3. Redundancy
The existence of parallel paths may result in load sharing, so that each component is
derated and has its life increased.
A standby unit wears out much more slowly than the operating unit does.
4. Durability
This requires the decision to spend more money on high performance materials so
as to increase service life and reduce maintenance costs.
5. Damage Tolerance
When a crack occurs, it will be detected soon enough after its occurrence so that
the probability of encountering loads in excess of the residual strength is very
remote.
6. Ease of Inspection
If the structure cannot readily be inspected for cracks, then the stress level must be lowered until
the initial crack cannot grow to a critical size during the life of the structure.

7. Simplicity
It reduces the chance for error and increases the reliability. The simpler the
equipment needed to meet the performance requirements, the better the design.
8. Specificity
The greater the degree of specificity, the greater the inherent reliability of design.

Different kinds of events

Output event – event that should be developed or analyzed further to determine how it can occur
Independent Event – event that does not depend on other components within the system for its
occurrence.
Normal Event – event that is expected to occur during system operation.

Undeveloped Event – event that has not been developed further because of lack of information or
it is not of sufficient consequence.

DEFECTS AND FAILURE MODES


Four broad categories of failures of engineering designs and systems:

 Hardware failure- failure of a component to function as designed
 Software failure- failure of the computer software to function as designed
 Human failure- failure of human operators to follow instructions or respond adequately to
emergency situations
 Organizational failure- failure of the organization to properly support the system

FAILURE MODES
Four general causes of specific modes of failure of engineering components:

1. Excessive elastic deformation
2. Excessive plastic deformation
3. Fracture
4. Loss of required part geometry through corrosion and wear

ROBUST AND QUALITY DESIGN


Robust design
- a system of design tools that reduces product or process variability while
simultaneously guiding the performance toward an optimal setting.
Robust product
- provides customer satisfaction even when subjected to extreme conditions on the
manufacturing floor or in the service environment.
DEFINITION OF QUALITY:
- ability of a product or service to satisfy a stated or implied need.
- free from defects or deficiencies.
QUALITY CONTROL AND ASSURANCE

Quality control refers to the action throughout the engineering and manufacturing of a
product to prevent and detect product deficiencies and product safety hazards.

The American Society for Quality (ASQ) defines quality as the totality of features and
characteristics of a product or service that bear on its ability to satisfy a given need.
In a narrower sense, “quality control” refers to the statistical techniques employed in
sampling production and monitoring the variability of the product.

Quality assurance refers to those systematic actions vital to providing satisfactory
confidence that an item or service will fulfill defined requirements.
Design factors and disturbance factors. These are the input parameters that affect the quality of the
product or process.
Design factors. These are parameters that can be specified freely by the designer. It is the designer's
responsibility to select the optimum levels of the design factors.
Disturbance factors. These are parameters that are either inherently uncontrollable or
impractical to control.

LINEAR PROGRAMMING is the most widely applied optimization technique, especially in business
and manufacturing situations.
NONLINEAR PROGRAMMING is used in most mechanical design problems.
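
A minimal linear programming sketch, assuming SciPy is available; the product-mix numbers are hypothetical, and since linprog minimizes, the profit is negated:

```python
from scipy.optimize import linprog

c = [-40, -30]            # maximize 40*x1 + 30*x2 (profit per unit of each product)
A_ub = [[1, 1],           # machine hours: x1 + x2 <= 100
        [2, 1]]           # labor hours:  2*x1 + x2 <= 150
b_ub = [100, 150]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
print(res.x, -res.fun)    # optimal mix (50, 50) and maximum profit 3500
```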

QUALITY ASSURANCE

Quality assurance is concerned with all corporate activities that affect customer
satisfaction with the quality of the product. There must be a quality assurance
department with sufficient independence from manufacturing to act to maintain
quality. This group is responsible for interpreting national and international codes and
standards in terms of each purchase order and for developing written rules of operating
practice.

There must also be procedures for maintaining the identity and traceability of materials
and semi-finished parts through the various stages of processing. Definite policies and
procedures for dealing with defective materials and parts must be in place. There must
be a way to decide when parts should be scrapped, reworked, or downgraded to a lower
quality level.

Quality control is not something that can be in place and then forgotten. There must
be procedures for training, qualifying and certifying inspectors and other QC
personnel. Funds must be available for updating inspection and laboratory equipment
and for the frequent calibration of instruments.
ISO 9000

An important aspect of quality assurance is the audit of the quality system against
written standards. The most prevalent quality standard is ISO 9000, with its companion
standards, issued by the International Organization for Standardization (ISO).
ISO 9000 has become required of companies doing business in the European
Community, and since it is a worldwide marketplace, companies around the world
have been scrambling to become ISO 9000 certified.

Formula for the Risk Priority Number (RPN):

RPN = (severity of failure) x (occurrence of failure) x (detection rating)
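
A direct translation of the formula; the 1-10 ratings shown are hypothetical values for a single failure mode in an FMEA worksheet:

```python
def risk_priority_number(severity, occurrence, detection):
    # RPN = (severity of failure) x (occurrence of failure) x (detection rating)
    return severity * occurrence * detection

print(risk_priority_number(severity=8, occurrence=3, detection=5))  # 120
```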

QUALITY IMPROVEMENT

Four basic costs are associated with quality.

 Prevention - those costs incurred in planning, implementing, and maintaining
a quality system. Included are the extra expenses in design and manufacturing
to ensure the highest quality product.
 Appraisal - costs incurred in determining the degree of conformance to the
quality requirements. The cost of inspection is the major contributor.
 Internal failure - costs incurred when materials, parts and components fail to
meet the quality requirements for shipping to the customer. These parts are
either scrapped or reworked.
 External failure - costs incurred when products fail to meet customer
expectations. These result in warranty claims, ill will, or product liability suits.

Simply collecting statistics on defective parts and weeding them out of the
assembly line is not sufficient for quality improvement and cost reduction. A
proactive effort must be made to determine the root cause of the problem so
that permanent corrections can be made. Two commonly used techniques in
this area of problem solving are the Pareto diagram and cause-and-effect
analysis.

Pareto diagram

In 1897 an Italian economist, Vilfredo Pareto, studied the distribution of wealth
in Italy and found that a large percentage of the wealth was concentrated in
about 10 percent of the population. This was published and became known as
Pareto's law. In 1954, Joseph Juran generalized Pareto's law as the "80/20
rule": for example, 80 percent of sales are generated by 20 percent of the
customers, and 80 percent of product defects are caused by 20 percent of the parts.
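
A minimal sketch of the calculation behind a Pareto diagram, using hypothetical defect counts; sorting the causes and accumulating percentages exposes the few causes that dominate:

```python
# Hypothetical defect counts per cause; two causes account for ~80% of defects.
defects = {"scratches": 120, "misalignment": 45, "porosity": 20,
           "burrs": 10, "discoloration": 5}

total = sum(defects.values())
cumulative = 0
for cause, count in sorted(defects.items(), key=lambda kv: kv[1], reverse=True):
    cumulative += count
    print(f"{cause:15s} {count:4d}  {100 * cumulative / total:5.1f}% cumulative")
```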

Cause-and-Effect Analysis

Cause-and-effect analysis uses the "fishbone" or Ishikawa
diagram to identify possible causes of a problem. Poor quality is
associated with four categories of causes: operator (man), machine, method, and
material.
The cause-and-effect diagram provides an orderly, step-by-step
approach to improving manufacturing quality.
Variability of the input and output parameters in Taguchi methodology.
1. Output variability
a. Variational noise. The short-term unit-to-unit variation due to the manufacturing
process.
b. Inner noise. The long-term change in product characteristics over time due to
deterioration and wear.
2. Input variability
a. Tolerances. (Design factor variability) The normal variability in design factors.
b. Outer noise. It represents the variability of the disturbance factors that contribute
to output variability (e.g., temperature, humidity, dust, vibration, and human error).
ROBUST DESIGN
Robust Design. The systematic approach to finding optimum values of design factors which leads
to economical design with low variability. Taguchi achieves this goal by first performing
parameter design, and then, if conditions still are not optimum, by performing tolerance design.
Parameter design. The process of identifying the settings of the design parameters or process
variables that reduce the sensitivity of the design to sources of variation. An accurate modeling of
the mean response is not as important as finding the factor levels that optimize robustness. Thus,
once the variance has been reduced the mean response should be easily adjusted by using a suitable
design parameter known as the signal factor. An important tenet of robust design is that a design
found optimum in laboratory experiments should also be optimum under manufacturing and
service conditions. Also, since product designs are often broken down into subsystems for design
purposes, it is vital that the robustness of a subsystem not be affected by changes in another
subsystem. Therefore, interactions among control factors are highly undesirable.
12.6.1 Parameter Design
Parameter design makes heavy use of statistically planned experiments. Two-and three-level
orthogonal arrays are most often used.
Orthogonal arrays are common fractional factorial designs. These arrays have the pairwise
balancing property that every setting of a design parameter occurs with every setting of all other
design parameters the same number of times. They keep this balancing property while minimizing
the number of test runs.
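
As an illustration of the pairwise balancing property, a sketch using the standard L4(2^3) orthogonal array (four runs, three two-level parameters):

```python
from itertools import combinations, product
from collections import Counter

L4 = [(1, 1, 1),          # each row is a test run, each column a design parameter
      (1, 2, 2),
      (2, 1, 2),
      (2, 2, 1)]

# Every pair of columns contains each combination of levels exactly once.
for c1, c2 in combinations(range(3), 2):
    counts = Counter((row[c1], row[c2]) for row in L4)
    assert all(counts[pair] == 1 for pair in product((1, 2), repeat=2))
print("L4 is pairwise balanced")
```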
Two parts of Taguchi Type Parameter Design of Experiments
1. Design Parameter Matrix. It specifies the test settings of the design parameter.
2. Noise Matrix.
12.7 Optimization Methods
Optimization is the process of maximizing a desired quantity or minimizing an undesired one.
Optimization theory is the body of mathematics that deals with the properties of maxima and
minima and how to find the maxima and the minima numerically. In the typical design
optimization situation, the designer has created a general configuration for which the numerical
values of the independent variables have not been fixed.
Typical objective functions could be cost, weight, reliability, and producibility, or a combination of
these. Inevitably, the objective function is subject to certain constraints.
Constraints. These arise from physical laws and limitations or from compatibility conditions on the
individual variables.
Functional constraints. Denoted by ψ, and also called the equality constraints, these specify relations
that must exist between the variables.
Regional constraints. Denoted by φ, and also called the inequality constraints, these are imposed by
specific details of the problem.
Specifications. These are the points of interaction with other parts of the system. They often result
from an arbitrary decision to carry out a suboptimization of the system.
A common problem in design optimization is that there often is more than one design characteristic
that is of value to the user. In formulating the optimization problem, one predominant characteristic
is chosen as the objective function and the other characteristics are reduced to the status of
constraints.
Soft Specifications. These are negotiable specifications, and they should be considered as target values
until the design progresses to such a point that it is possible to determine the penalty that is being
paid in trade-offs to achieve the specifications.
Insightful description of Optimization Methods
1. Optimization by evolution. There is a close parallel between technological evolution and
biological evolution. Most designs in the past have been optimized by an attempt to
improve upon an existing similar design. Survival of the resulting variations depends on
the natural selection of user acceptance.
2. Optimization by intuition. The art of engineering is the ability to make good decisions
without being able to formulate a justification. Intuition is knowing what to do without
knowing why one does it. The gift of intuition seems to be closely related to the unconscious
mind.
3. Optimization by trial-and-error modeling. This refers to the usual situation in modern
engineering design, where it is recognized that the first feasible design is not necessarily
the best. Therefore, the design model is exercised for a few iterations in the hope of finding
an improved design. However, this mode of operation is NOT TRUE OPTIMIZATION.
Some refer to satisficing, as opposed to optimizing, to mean a technically acceptable job
done rapidly and presumably economically. Such a design should not be called an optimal
design.
4. Optimization by numerical algorithm. This is the area of current active development, in which
mathematically based strategies are used to search for an optimum. The chief types of numerical
algorithms are listed in the accompanying table.
12.7.2 SEARCH METHODS
SEVERAL CLASSES OF SEARCH PROBLEMS
1. Deterministic search - the search is assumed to be free from appreciable experimental error.
2. Stochastic search - the existence of random error must be considered.
*Linear programming and dynamic programming are techniques that deal well with situations of
this class.
UNIFORM SEARCH
The trial points are spaced equally over the allowable range of values. Each point is
evaluated in turn in an exhaustive search.
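
A minimal sketch of a uniform search over one variable; the objective function is hypothetical:

```python
def uniform_search(f, lo, hi, n_trials):
    # Trial points spaced equally over the range; every point is evaluated.
    best_x, best_y = None, float("-inf")
    for i in range(1, n_trials + 1):
        x = lo + (hi - lo) * i / (n_trials + 1)
        y = f(x)
        if y > best_y:
            best_x, best_y = x, y
    return best_x, best_y

# Best of 19 equally spaced trials on a function peaked at x = 3.2.
print(uniform_search(lambda x: -(x - 3.2) ** 2, 0.0, 10.0, 19))  # (3.0, -0.04)
```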
UNIFORM DICHOTOMOUS SEARCH
In this procedure, experiments are performed in pairs to establish whether the function is
increasing or decreasing. Since the search procedure is uniform, the experiments are spaced evenly
over the entire range of values.
SEQUENTIAL DICHOTOMOUS SEARCH
Two preceding examples were simultaneous search techniques in which all experiments
were planned in advance.
FIBONACCI SEARCH
It is a very efficient sequential technique. It is based on the use of the Fiboncci number
series, which is named after a thirteenth-century mathematician.
GOLDEN SECTION SEARCH
If the function turns out to be very steep near the maximum, we should have chosen more
trials. The golden section search is a good compromise: although it is slightly less efficient than the
Fibonacci search, it does not require an advance decision on the number of trials.
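
A sketch of the golden section search for a unimodal maximum; the interval of uncertainty shrinks by the factor 0.618 per trial, and the number of trials need not be fixed in advance (the objective is hypothetical):

```python
def golden_section_max(f, lo, hi, tol=1e-5):
    g = (5 ** 0.5 - 1) / 2          # 0.6180..., the golden ratio conjugate
    x1, x2 = hi - g * (hi - lo), lo + g * (hi - lo)
    f1, f2 = f(x1), f(x2)
    while hi - lo > tol:
        if f1 < f2:                 # maximum lies in [x1, hi]
            lo, x1, f1 = x1, x2, f2
            x2 = lo + g * (hi - lo)
            f2 = f(x2)
        else:                       # maximum lies in [lo, x2]
            hi, x2, f2 = x2, x1, f1
            x1 = hi - g * (hi - lo)
            f1 = f(x1)
    return (lo + hi) / 2

print(golden_section_max(lambda x: -(x - 3.2) ** 2, 0.0, 10.0))  # ~3.2
```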
COMPARISON OF METHODS.
A measure of the efficiency of a search technique is the reduction ratio which is defined as
the ratio of the original interval of uncertainty to the interval remaining after n trials.
12.7.3 MULTIVARIATE SEARCH METHOD
When the objective function is a function of two or more variables, the geometric
representation is a RESPONSE SURFACE. It usually is convenient to deal with contour lines
produced by the intersection of planes of constant y with the response surface.
LATTICE SEARCH
It is the analog of the single-variable exhaustive search: a two-dimensional grid (lattice) is
superimposed over the contours. In the absence of special knowledge about the location of the
maximum, the starting point is placed near the center of the region (point 1).
UNIVARIATE SEARCH
It is a one-at-a-time method. All of the variables are kept constant except one, and it is
varied to obtain an optimum in the objective function. That optimal value is then substituted into
the function and the function is optimized with respect to another variable. The objective function
is optimized with respect to each variable in sequence, and the optimal value of each variable is
substituted into the function for the optimization of the succeeding variables.
STEEPEST ASCENT
The path of steepest ascent (or descent) up the response surface follows the gradient vector.
In practice the gradient cannot be followed exactly; we change the direction of the
search toward the direction of maximum slope, but we must do so in finite straight segments.
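
A sketch of steepest ascent in finite straight segments; the response surface, step size, and finite-difference gradient estimate are all hypothetical choices:

```python
def grad(f, x, h=1e-6):
    # Central-difference estimate of the gradient vector.
    return [(f([xi + (h if j == i else 0) for j, xi in enumerate(x)]) -
             f([xi - (h if j == i else 0) for j, xi in enumerate(x)])) / (2 * h)
            for i in range(len(x))]

def steepest_ascent(f, x, step=0.1, iters=200):
    # Repeatedly take a finite straight step along the local gradient.
    for _ in range(iters):
        g = grad(f, x)
        x = [xi + step * gi for xi, gi in zip(x, g)]
    return x

# Hypothetical response surface with its maximum at (2, -1).
f = lambda x: -(x[0] - 2) ** 2 - (x[1] + 1) ** 2
print(steepest_ascent(f, [0.0, 0.0]))  # approaches [2.0, -1.0]
```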
12.7.5 OTHER OPTIMIZATION METHODS
MONOTONICITY ANALYSIS
It is an optimization technique that may be applied to design problems with monotonic
properties, where the objective function and constraints successively increase or decrease with
respect to some design variable. It can often show the designer which constraints are active at the
optimum.
ACTIVE CONSTRAINTS - a design requirement that has a direct impact on the location of
the optimum. This information can be used to identify the improvements that could be achieved if
the feasible domain were modified, which would point out directions for technological
improvement.
DYNAMIC PROGRAMMING
It is a mathematical technique that is well suited for the optimization of staged processes.
The word "dynamic" in the name of this technique has no relationship to the usual use of the word
to denote change with respect to time. It is related to the calculus of variations and is not related
to linear and nonlinear programming methods.
It was developed by Richard Bellman in the 1950s. It is a well-developed optimization
method.
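
A minimal dynamic programming sketch for a staged process, in the spirit of the classic stagecoach formulation; the cost tables are hypothetical, and backward recursion finds the cheapest route stage by stage:

```python
from functools import lru_cache

# costs[k][i][j]: cost of moving from state i in stage k to state j in stage k+1
costs = [
    [[2, 4], [3, 1]],   # stage 0 -> stage 1
    [[5, 1], [2, 6]],   # stage 1 -> stage 2
    [[3], [4]],         # stage 2 -> final state
]

@lru_cache(maxsize=None)
def cost_to_go(stage, state):
    # Backward recursion: optimal remaining cost from (stage, state) to the end.
    if stage == len(costs):
        return 0.0
    return min(c + cost_to_go(stage + 1, j)
               for j, c in enumerate(costs[stage][state]))

print(min(cost_to_go(0, s) for s in range(2)))  # cheapest route: 6
```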
JOHNSON’S METHOD
A method of optimum design that is especially suited to the nonlinear problems found in
the design of mechanical elements such as gears, roller bearings, and hydrodynamic journal bearings
has been developed by R.C. Johnson.
It often requires significant effort to reduce the system equations to a form suitable for an
optimization study. However, the benefit of the method is that it gives considerable insight into
the nature of, and possible solutions to, the problem.
GENETIC ALGORITHMS
These are based on the laws of evolution. A genetic algorithm is a stochastic optimization method
that involves reducing a large number of possible designs to a set of genetic codes, expressed as strings
of 1s and 0s. It follows a survival-of-the-fittest strategy, where members of the design population
compete.
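
A minimal genetic algorithm sketch: designs are encoded as strings of 1s and 0s, and fitter members survive and recombine. The fitness function ("one-max", maximize the number of 1s) is a standard toy problem, not from the text:

```python
import random

BITS, POP, GENS, MUT = 20, 30, 60, 0.02

def fitness(bits):               # toy "one-max" objective: count the 1s
    return sum(bits)

def tournament(pop):             # survival of the fittest: better of two random members
    a, b = random.sample(pop, 2)
    return a if fitness(a) >= fitness(b) else b

pop = [[random.randint(0, 1) for _ in range(BITS)] for _ in range(POP)]
for _ in range(GENS):
    new_pop = []
    for _ in range(POP):
        p1, p2 = tournament(pop), tournament(pop)
        cut = random.randrange(1, BITS)                                 # one-point crossover
        child = p1[:cut] + p2[cut:]
        child = [1 - b if random.random() < MUT else b for b in child]  # point mutation
        new_pop.append(child)
    pop = new_pop

print(max(fitness(m) for m in pop))   # approaches BITS as the population evolves
```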
SIMULATED ANNEALING
It is a stochastic method that addresses the category of combinatorial optimization
problems. The method has been used to automate digital circuit layout in 2-D and it is being used
to study the layout of mechanical and electromechanical components in the product architecture.
TEMPERATURE - in simulated annealing, the probability of accepting a worse design is an
exponential function of a parameter called the temperature.
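
A simulated annealing sketch: a worse design is accepted with probability exp(-delta/T), the exponential function of the temperature noted above. The 1-D objective, step size, and cooling schedule are hypothetical:

```python
import math, random

def anneal(f, x, T=1.0, cooling=0.995, steps=5000):
    fx = f(x)
    for _ in range(steps):
        x_new = x + random.uniform(-0.5, 0.5)   # random neighboring design
        f_new = f(x_new)
        # Always accept improvements; accept worse moves with exp(-delta/T).
        if f_new < fx or random.random() < math.exp(-(f_new - fx) / T):
            x, fx = x_new, f_new
        T *= cooling                            # gradually "cool" the system
    return x, fx

print(anneal(lambda x: (x - 3.2) ** 2, x=0.0))  # near (3.2, 0.0)
```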
12.8 EVALUATION CONSIDERATIONS IN OPTIMIZATION
Every design has a few parameters that dominate its evaluation. In consumer products, it usually is
cost; in aircraft, it is weight; and in implantable medical devices, it is power consumption. The
strategy is to optimize these “bottleneck factors” first.
12.8.2 SENSITIVITY ANALYSIS
With this, we find which design parameters are most critical to the performance of the
design, and what the critical ranges of those parameters are. For some problems, the sensitivity
analysis can be done analytically, while in others it must be done numerically.
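
A numerical sensitivity analysis sketch: perturb each design parameter by 1% and report the relative change in performance. The performance model (cantilever tip deflection, F L^3 / (3 E I)) is a hypothetical stand-in:

```python
def deflection(p):
    # Tip deflection of a cantilever beam under end load F.
    return p["F"] * p["L"] ** 3 / (3 * p["E"] * p["I"])

base = {"F": 1000.0, "L": 2.0, "E": 200e9, "I": 8e-6}
y0 = deflection(base)

for name in base:
    p = dict(base)
    p[name] *= 1.01                      # +1% perturbation of one parameter
    dy = (deflection(p) - y0) / y0
    print(f"{name}: {100 * dy:+.2f}% change in deflection per +1% change")
```

Here the beam length L dominates (about +3% deflection per +1% change), so it would be the critical parameter to control.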
12.9 DESIGN OPTIMIZATION
It has been a natural development to combine computer-aided-engineering (CAE) analysis
and simulation tools with computer-based optimization algorithms.
