
ADOR
1. Dual.
2. Solve the dual by any method.
3. Show that strong duality exists!
4. Branch and bound (LP, x1 = 1, x2 = 1): write down the constraints, including the original ones.
5. What is the GAP between the lower bound and the upper bound of the subproblem?
6. If you keep on solving subproblems, what will happen?
7. DP (Midterm 2, Q1): what are the stages and states?
8. Write the recursive relationship (a generic longest-path sketch follows this list).
9. Find the longest path from node 1 to node 7.
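For items 7-9, one common way to set it up is: the state is the node you are currently at, and the stage is how far along the network you are. A hedged sketch of the backward recursion for a longest path ending at node 7 (the actual arcs A and lengths d(i,j) are not in these notes):

  f(7) = 0, \qquad f(i) = \max_{(i,j)\in A}\bigl\{\, d(i,j) + f(j) \,\bigr\}, \qquad \text{longest 1-to-7 path length} = f(1).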

1. If you get a question asking you to find the shortest path with Dijkstra's algorithm, but you have to FORMULATE it as an LP, how do you do that?
   --> Technically it's not an LP but a binary LP. Each arc is binary (1 if we take it, 0 if we don't). So you minimize sum d(i)*x(i), and the constraints say that every arc the route has to take is actually taken, and so on... (a sketch of this formulation is given below, after the 545 list).

545
1. 10% width of the half-width: find the additional runs needed.
2. If you are operating 24 hours (non-terminating), what is the best approach? Batch means, regenerative method, etc. (a batch-means sketch is below).
3. Confidence interval for a single system.
4. Chi-square distribution: reject or not? Write H0.
5. Will you be able to solve this problem by a DES method? Yes or no, and why?
6. Multiplicative LCG.
7. Event calendar after the 4th event from Tnow = 100 (answer: Tnow = 127).
8. What will happen if the data are not independent in a non-terminating system? Double the batch size.
9. Random variate generator for the exponential distribution (a small sketch combining this with the LCG from item 6 is below).
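A hedged sketch of the binary shortest-path formulation described in the first question above, assuming a node set N, arc set A, arc lengths d(i,j), source s and destination t (these symbols are chosen here for illustration; they are not in the notes):

  \min \sum_{(i,j)\in A} d_{ij}\, x_{ij}
  \text{s.t.} \sum_{j:(i,j)\in A} x_{ij} \;-\; \sum_{j:(j,i)\in A} x_{ji} \;=\;
    \begin{cases} 1 & i = s \\ -1 & i = t \\ 0 & \text{otherwise} \end{cases}
  x_{ij} \in \{0,1\} \quad \forall (i,j)\in A.

The flow-balance constraints are the "each arc the route has to take is actually taken" idea: one unit of flow leaves s, arrives at t, and passes through every intermediate node it enters.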
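A hedged batch-means sketch for the non-terminating case (items 2, 3 and 8 of the 545 list); the data values, batch count, and use of scipy are illustrative assumptions, not from the notes:

import math
import random
from scipy import stats

def batch_means_ci(observations, n_batches=10, alpha=0.05):
    """Confidence interval for the steady-state mean via batch means."""
    m = len(observations) // n_batches                 # batch size
    batch_means = [
        sum(observations[k * m:(k + 1) * m]) / m
        for k in range(n_batches)
    ]
    grand_mean = sum(batch_means) / n_batches
    # sample variance of the batch means
    s2 = sum((b - grand_mean) ** 2 for b in batch_means) / (n_batches - 1)
    half_width = stats.t.ppf(1 - alpha / 2, n_batches - 1) * math.sqrt(s2 / n_batches)
    return grand_mean, half_width

# toy illustration with made-up data (after deleting any warm-up period)
random.seed(1)
data = [random.expovariate(0.5) for _ in range(1000)]
mean, hw = batch_means_ci(data)
print(f"{mean:.3f} +/- {hw:.3f}")

If the batch means still look correlated (the "data not independent" case in item 8), the notes' fix is to double the batch size, i.e. use fewer, larger batches.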
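A hedged sketch tying items 6 and 9 together: a multiplicative LCG producing U(0,1) numbers, fed into the inverse-transform generator X = -ln(1 - U)/lambda for the exponential distribution. The modulus and multiplier shown (2^31 - 1 and 16807) are the classic "minimal standard" values, not necessarily the course's:

import math

M = 2**31 - 1     # modulus
A = 16807         # multiplier (multiplicative LCG: no additive constant)

def lcg_uniforms(seed, n):
    """Generate n U(0,1) numbers with a multiplicative LCG."""
    x = seed
    for _ in range(n):
        x = (A * x) % M
        yield x / M

def exponential_variate(u, lam):
    """Inverse-transform method: X = -ln(1 - U) / lambda ~ Exp(lambda)."""
    return -math.log(1.0 - u) / lam

# toy illustration
us = list(lcg_uniforms(seed=12345, n=5))
xs = [exponential_variate(u, lam=0.5) for u in us]
print(us)
print(xs)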

DOE
1. If you have 8 factors at 2 levels, which method do you use to check which factors are important? (I think you can use a 1/16 fraction with 16 runs, i.e., a 2^(8-4) design.) ????
   --> I'd simply say a fractional factorial design, making sure your design is at least resolution III (IV or V is even better, but not necessary). You tell me why it has to be at least resolution III. (A small sketch of building such a design is below.)
2. What assumptions do you make when you are doing an experiment? (Do you write the ANOVA assumptions? Normality, independence, constant variance.)
   --> The most important assumption, always, is independence (even in simulation we looked for correlation; we want to avoid correlation or dependence). If the variance is not constant (it sometimes happens, when we see the funnel-shaped residual plot), we apply a transformation such as Box-Cox. If the data are not normal, that's usually fine: they follow another distribution, and by the central limit theorem, with lots of data (going to infinity), the averages become approximately normally distributed.
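A minimal sketch of building the 16-run 2^(8-4) design by hand, assuming one common resolution-IV choice of generators (E = BCD, F = ACD, G = ABD, H = ABC); the course may use different generators or a design package instead:

from itertools import product

runs = []
for a, b, c, d in product((-1, 1), repeat=4):   # full 2^4 factorial in A, B, C, D
    e = b * c * d      # E = BCD
    f = a * c * d      # F = ACD
    g = a * b * d      # G = ABD
    h = a * b * c      # H = ABC
    runs.append((a, b, c, d, e, f, g, h))

print("A  B  C  D  E  F  G  H")
for run in runs:
    print(" ".join(f"{x:+d}" for x in run))
print(f"{len(runs)} runs")   # 16 runs for 8 two-level factors

For reference: resolution III keeps main effects un-aliased with other main effects; resolution IV additionally keeps them clear of two-factor interactions.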
