
**Branch and Bound Method**

The design technique known as branch and bound is similar to backtracking in that it searches a tree model of the solution space and is applicable to a wide variety of discrete combinatorial problems. Backtracking algorithms try to find one or all configurations, modeled as n-tuples, that satisfy certain properties. Branch and bound is more oriented towards optimization.

In branch and bound, all the children of the E-node are generated before any other live node can become the E-node. Two search orders over the state space tree are used: BFS (FIFO branch and bound) and D-search (LIFO branch and bound).
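The difference between FIFO and LIFO branch and bound comes down to the discipline of the live-node list. A minimal sketch, where the `children` and `is_answer` callbacks are illustrative names supplied by the caller, not part of the notes:

```python
from collections import deque

def branch_and_bound(root, children, is_answer, use_fifo=True):
    """Generic FIFO/LIFO branch and bound skeleton.

    children(node) yields the children of an E-node;
    is_answer(node) tests whether a node is an answer node.
    With use_fifo=True the live nodes form a queue (BFS);
    otherwise they form a stack (D-search / LIFO).
    """
    live = deque([root])                 # the list of live nodes
    while live:
        # select the next E-node
        e_node = live.popleft() if use_fifo else live.pop()
        if is_answer(e_node):
            return e_node
        # generate ALL children of the E-node before moving on
        for child in children(e_node):
            live.append(child)
    return None
```

For example, for the sum of subsets problem the nodes can be tuples of chosen item indices, with children that extend the tuple while the running sum stays within the target.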


(Figure: variable-size tuple state space tree, nodes numbered in the order generated; BFS search for the sum of subsets problem.)


(Figure: variable-size tuple state space tree, nodes numbered in the order generated; DFS search for the sum of subsets problem.)


(Figure: variable-size tuple state space tree, nodes numbered in the order generated; D-search for the sum of subsets problem.)


(Figure: fixed-size tuple state space tree; nodes are generated in D-search manner for the sum of subsets problem.)


**Traveling Salesman Problem**

The traveling salesman problem is to find a least-cost tour of n cities in the salesman's region, visiting each city exactly once. The salesman has a cost matrix C, where the element cij equals the cost (usually in terms of time, money, or distance) of direct travel from city i to city j. Assume cii = ∞ for all i; also cij = ∞ if it is not possible to move directly from city i to city j.

Branch and bound algorithms for the traveling salesman problem can be formulated in a variety of ways. Without loss of generality we can assume that every tour starts and ends at city 1, so the solution space is S = { (1, π, 1) | π is a permutation of (2, 3, …, n) }, and |S| = (n−1)!.
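This makes a brute-force baseline easy to state: with every tour written as (1, π, 1), simply score all (n−1)! permutations. A sketch in Python, using the 5-city cost matrix from the worked example later in these notes (`INF` stands for the ∞ entries):

```python
from itertools import permutations

INF = float("inf")

# 5-city cost matrix from the lecture example; C[i][j] is the
# cost of travelling directly from city i+1 to city j+1.
C = [
    [INF,  20,  30,  10,  11],
    [ 15, INF,  16,   4,   2],
    [  3,   5, INF,   2,   4],
    [ 19,   6,  18, INF,   3],
    [ 16,   4,   7,  16, INF],
]

def tour_cost(tour):
    """Cost of the cyclic tour given as a sequence of 1-based cities."""
    return sum(C[tour[k] - 1][tour[(k + 1) % len(tour)] - 1]
               for k in range(len(tour)))

def brute_force_tsp(n=5):
    """Try all (n-1)! tours that start and end at city 1."""
    best = min(permutations(range(2, n + 1)),
               key=lambda p: tour_cost((1,) + p))
    return (1,) + best, tour_cost((1,) + best)
```

Branch and bound exists precisely to avoid this (n−1)! enumeration while still guaranteeing the optimum.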


(Figure: a state space tree for the traveling salesman problem with n = 4; each leaf represents a complete tour, e.g. one leaf is the tour 1-2-3-4-1 and the next is 1-2-4-3-1.)


(Figure: a branch and bound state space tree for the traveling salesman problem; the root represents the set of all tours, and each branching partitions a node's set of tours by including or excluding an edge such as (3,5) or (2,1).)


What is meant by bounding? With each vertex in the tree we associate a lower bound on the cost of any tour in the set of tours represented by that vertex. The computation of these lower bounds is the major labor-saving device in any branch and bound algorithm; therefore much thought should be given to obtaining tight bounds.


Assume that we have constructed a specific complete tour with cost m. If the lower bound associated with the set of tours represented by a vertex v is M, and M ≥ m,

**then there is no need to search the descendants of v for the optimum tour.**


**Basic steps for the computation of lower bounds**

The basic step in the computation of a lower bound is known as reduction. It is based on the following observations:

1. In the cost matrix C, every full tour contains exactly one element from each row and each column. Note: the converse need not be true; e.g. {(1,5), (5,1), (2,3), (3,4), (4,2)} picks one element from each row and column but is not a single tour.

(Figure: the edge set {(1,5), (5,1), (2,3), (3,4), (4,2)} forms two disjoint cycles, 1-5-1 and 2-3-4-2, rather than a single tour.)

**Row Reduction**

2. If a constant h is subtracted from every entry in any row (or column) of C, the cost of any tour under the new matrix C′ is exactly h less than the cost of the same tour under C. This subtraction is called a row (column) reduction.


3. By a reduction of the entire cost matrix C we mean the following: sequentially go down the rows of C and subtract the value of each row's smallest element hi from every element in the row; then do the same for each column. Let h be the sum of the hi over all rows and columns. The resulting cost matrix is called the reduction of C, and h is a lower bound on the cost of any tour.
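The reduction procedure can be sketched directly from this description (a sketch, with `INF` standing for the ∞ entries; rows or columns consisting only of ∞ are skipped):

```python
INF = float("inf")

def reduce_matrix(C):
    """Row- then column-reduce a cost matrix in place.

    Subtracts each row's smallest element from that row, then each
    column's smallest element from that column, and returns the total
    amount h subtracted -- a lower bound on the cost of any tour.
    """
    n, h = len(C), 0
    for i in range(n):                        # row reduction
        m = min(C[i])
        if 0 < m < INF:
            h += m
            C[i] = [x - m if x < INF else INF for x in C[i]]
    for j in range(n):                        # column reduction
        m = min(C[i][j] for i in range(n))
        if 0 < m < INF:
            h += m
            for i in range(n):
                if C[i][j] < INF:
                    C[i][j] -= m
    return h
```

Applied to the 5-city cost matrix of the worked example in these notes, this yields h = 25, matching the lower bound computed there.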


Let A be the reduced cost matrix for a node R, and let S be a child of R such that the tree edge (R,S) corresponds to including edge (i,j) in the tour.

1. Change all entries in row i and column j of A to ∞, so that no further edge leaving city i or entering city j may be included in the tour.
2. Set A(j,1) = ∞. This prevents the edge (j,1) from being used prematurely, since city 1 must be the last city of the tour.


3. Reduce all rows and columns in the resulting matrix, except for rows and columns containing only ∞. Let the resulting matrix be B, and let r be the total amount subtracted. Then the lower bound for S is: lower bound(S) = lower bound(R) + A(i,j) + r.
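These steps can be sketched as a function that, given the reduced matrix and bound of node R, produces the bound and matrix of the child S obtained by including edge (i, j). A sketch (cities are 1-based as in the notes; the reduction helper is repeated so the example is self-contained):

```python
import copy

INF = float("inf")

def reduce_matrix(C):
    """Row- then column-reduce C in place; return the total subtracted."""
    n, h = len(C), 0
    for i in range(n):
        m = min(C[i])
        if 0 < m < INF:
            h += m
            C[i] = [x - m if x < INF else INF for x in C[i]]
    for j in range(n):
        m = min(C[i][j] for i in range(n))
        if 0 < m < INF:
            h += m
            for i in range(n):
                if C[i][j] < INF:
                    C[i][j] -= m
    return h

def child_bound(A, bound_R, i, j):
    """Bound and reduced matrix for the child that includes edge (i, j).

    A is the reduced matrix of node R (1-based cities) and bound_R its
    lower bound. Returns (bound_S, B).
    """
    n = len(A)
    B = copy.deepcopy(A)
    cost_ij = A[i - 1][j - 1]
    B[i - 1] = [INF] * n                  # step 1: forbid row i ...
    for row in B:
        row[j - 1] = INF                  # ... and column j
    B[j - 1][0] = INF                     # step 2: forbid edge (j, 1)
    r = reduce_matrix(B)                  # step 3: reduce the rest
    return bound_R + cost_ij + r, B
```

On the reduced 5-city example matrix (bound 25), including edge (1,2) gives bound 35, (1,3) gives 53, (1,4) gives 25, and (1,5) gives 31, matching the state space tree in these notes.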


**Example**

Cost matrix (∞ on the diagonal):

∞ 20 30 10 11
15 ∞ 16 4 2
3 5 ∞ 2 4
19 6 18 ∞ 3
16 4 7 16 ∞

Reduced cost matrix:

∞ 10 17 0 1
12 ∞ 11 2 0
0 3 ∞ 0 2
15 3 12 ∞ 0
11 0 0 12 ∞

Lower bound = 25: the amounts 10, 2, 2, 3 and 4 are subtracted from rows 1 through 5, then 1 and 3 from columns 1 and 3 (10+2+2+3+4+1+3 = 25). So all tours in the given graph have length at least 25.


(Figure: the reduction in progress on the same cost matrix; the row minima h1…h5 are subtracted row by row, then the column minima C1…C5 are subtracted.)


(Figure: branch and bound state space tree for the 5-city example, with the lower bound shown beside each node.)

Expanding the root (node 1, bound 25) by fixing the first edge of the tour:

- Path (1,2), node 2: bound = 25 + 10 + 0 = 35
- Path (1,3), node 3: bound = 25 + 17 + 11 = 53
- Path (1,4), node 4: bound = 25 + 0 + 0 = 25
- Path (1,5), node 5: bound = 25 + 1 + 5 = 31

Node 4 has the least bound, so it becomes the next E-node:

- Path 1-4-2, node 6: bound 28
- Path 1-4-3, node 7: bound 50
- Path 1-4-5, node 8: bound 36

Node 6 becomes the E-node:

- Path 1-4-2-3, node 9: bound 52
- Path 1-4-2-5, node 10: bound 28

Node 10 becomes the E-node, yielding node 11, the complete tour 1-4-2-5-3-1 with bound (and cost) 28. Every remaining live node has a bound of at least 28, so this tour is optimal.

**Least Cost (LC) Search**

In both LIFO and FIFO branch and bound, the selection rule for the next E-node is rather rigid and, in a sense, blind: it gives no preference to a node that has a very good chance of getting the search to an answer node quickly. The search for an answer node can often be speeded up by using an "intelligent" ranking function ĉ(·) for live nodes; the next E-node is selected on the basis of this ranking function.


Let ĝ(x) be an estimate of the additional effort needed to reach an answer node from x, let h(x) be the cost of reaching x from the root, and let f(·) be any nondecreasing function. Node x is assigned the rank ĉ(x) = f(h(x)) + ĝ(x). Using f(·) = 0 usually biases the search algorithm to make deep probes into the search tree. Note: BFS and D-search are special cases of LC search. If ĝ(x) = 0 and f(h(x)) = level of node x, then LC search generates nodes level by level, which is BFS.
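An LC search is the FIFO/LIFO skeleton with the live-node list replaced by a priority queue keyed on ĉ. A minimal sketch, where the ranking function `c_hat` and the `children`/`is_answer` callbacks are supplied by the caller (illustrative names, not from the notes):

```python
import heapq
from itertools import count

def lc_search(root, children, is_answer, c_hat):
    """Least-cost branch and bound: always expand the live node
    with the smallest rank c_hat(x) = f(h(x)) + g_hat(x)."""
    tie = count()                            # tie-breaker for equal ranks
    live = [(c_hat(root), next(tie), root)]
    while live:
        _, _, e_node = heapq.heappop(live)   # least-cost node becomes E-node
        if is_answer(e_node):
            return e_node
        for child in children(e_node):
            heapq.heappush(live, (c_hat(child), next(tie), child))
    return None
```

Plugging in `c_hat(x) = level of x` with ĝ = 0 recovers BFS, exactly as the note above states.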


**Assignment Problem**


There are n people who need to be assigned to execute n jobs, one person per job (i.e. each person is assigned exactly one job, and each job is assigned to exactly one person). C(i, j) is the cost of assigning the i-th person to the j-th job, for each pair i, j = 1, 2, …, n. The problem is to find an assignment with the smallest total cost.


The Hungarian method is much more efficient for this problem.


**Lower bound**

There are many ways to find a lower bound. We can relax the condition on persons, so that one person may be assigned more than one job; or we can relax the condition on jobs, so that more than one person may be assigned to the same job.


**Problem**

| Person | Job 1 | Job 2 | Job 3 | Job 4 |
|--------|-------|-------|-------|-------|
| A      | 9     | 2     | 7     | 8     |
| B      | 6     | 4     | 3     | 7     |
| C      | 5     | 8     | 1     | 8     |
| D      | 7     | 6     | 9     | 4     |


Assign person A to job 1: cross out the first row and the first column, then assign each remaining person its cheapest remaining job. Lower bound = 9 + 3 + 1 + 4 = 17.

Branch and bound tree:

- Start: Lb = 10
- A→1: Lb = 17; A→2: Lb = 10; A→3: Lb = 20; A→4: Lb = 18
- Expanding A→2: B→1: Lb = 13; B→3: Lb = 14; B→4: Lb = 17
- Expanding A→2, B→1: C→3, D→4: Lb = 13 (complete assignment, cost 13); C→4, D→3: Lb = 25

**For A→2: delete the 2nd column and the 1st row, find the minimum of each remaining row, and total them.**


**Lower bound when A→2 and B→1: 2 + 6 + 1 + 4 = 13.**

Starting lower bound (each person assigned its cheapest job, ignoring conflicts): 2 + 3 + 1 + 4 = 10.
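The whole assignment search can be sketched in a few lines: the bound of a partial assignment is its cost so far plus, for each unassigned person, that person's cheapest still-free job, exactly the relaxation above. The function names are illustrative:

```python
import heapq
from itertools import count

# COST[i][j] = cost of assigning person i (A..D) to job j (1..4)
COST = [
    [9, 2, 7, 8],   # A
    [6, 4, 3, 7],   # B
    [5, 8, 1, 8],   # C
    [7, 6, 9, 4],   # D
]

def lower_bound(assigned):
    """Cost so far + cheapest free job for every remaining person."""
    used = set(assigned)
    lb = sum(COST[i][j] for i, j in enumerate(assigned))
    for i in range(len(assigned), len(COST)):
        lb += min(COST[i][j] for j in range(len(COST)) if j not in used)
    return lb

def assignment_bb():
    """Best-first branch and bound over partial assignments."""
    tie = count()
    live = [(lower_bound(()), next(tie), ())]
    while live:
        lb, _, node = heapq.heappop(live)
        if len(node) == len(COST):
            return lb, node              # complete: lb equals the true cost
        for job in range(len(COST)):
            if job not in node:
                child = node + (job,)
                heapq.heappush(live, (lower_bound(child), next(tie), child))
    return None
```

For the 4×4 example table this returns cost 13 with A→job 2, B→job 1, C→job 3, D→job 4, matching the tree above.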

**Knapsack Problem**

| Item | Weight | Value | Value/Weight |
|------|--------|-------|--------------|
| 1    | 4      | 40    | 10           |
| 2    | 7      | 42    | 6            |
| 3    | 5      | 25    | 5            |
| 4    | 3      | 12    | 4            |

The knapsack's capacity is W = 10.


A simple way to compute an upper bound is ub = v + (W − w) · (v(i+1)/w(i+1)), where v is the total value of the items already added to the bag, w is the total weight of the items already selected, and item i+1 has the best value-per-unit-weight among the remaining items.
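A sketch of this bound together with a with/without depth-first search that prunes on it (items are assumed already sorted by value/weight, as in the table above):

```python
WEIGHTS = [4, 7, 5, 3]       # items sorted by value/weight
VALUES = [40, 42, 25, 12]
CAPACITY = 10

def upper_bound(i, w, v):
    """ub = v + (W - w) * (value/weight of the best remaining item)."""
    if i >= len(WEIGHTS):
        return v
    return v + (CAPACITY - w) * (VALUES[i] / WEIGHTS[i])

def best_value(i=0, w=0, v=0, best=0):
    """DFS over the with/without tree, pruning by the upper bound."""
    if i == len(WEIGHTS):
        return max(best, v)
    if upper_bound(i, w, v) <= best:
        return best                       # bound cannot beat the best so far
    if w + WEIGHTS[i] <= CAPACITY:        # branch: take item i
        best = best_value(i + 1, w + WEIGHTS[i], v + VALUES[i], best)
    best = best_value(i + 1, w, v, best)  # branch: skip item i
    return best
```

At the root this gives ub = 0 + 10 · 10 = 100, and the search settles on items 1 and 3 with total value 65, matching the state space tree below.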


State space tree (fixed-size tuple; each level decides with/without one item):

- Root: w = 0, v = 0, ub = 100
- With item 1: w = 4, v = 40, ub = 76; without item 1: w = 0, v = 0, ub = 60
- With item 2: w = 11, not feasible; without item 2: w = 4, v = 40, ub = 70
- With item 3: w = 9, v = 65, ub = 69; without item 3: w = 4, v = 40, ub = 64
- With item 4: w = 12, not feasible; without item 4: w = 9, v = 65, ub = 65

The best solution takes items 1 and 3: weight 9, value 65.

**Traveling salesman problem (second example)**

Complete graph on cities a, b, c, d, e with edge weights: a-b = 3, a-c = 1, a-d = 5, a-e = 8, b-c = 6, b-d = 7, b-e = 9, c-d = 4, c-e = 2, d-e = 3.


Lower bound: for each city i, find the sum si of the distances from city i to its two nearest cities; compute the sum s of these n numbers; divide the result by two, and round up if all distances are integers.

Lb = ⌈[(1+3) + (3+6) + (1+2) + (3+4) + (2+3)]/2⌉ = ⌈28/2⌉ = 14
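This bound is easy to compute from the edge weights. A sketch, using the weights of the example graph above (some of the longer edges are partly reconstructed from the figure, so treat the exact dictionary entries as assumptions; the two-nearest distances match the formula above):

```python
import math

# edge weights of the example graph on cities a..e (partly assumed)
DIST = {
    ("a", "b"): 3, ("a", "c"): 1, ("a", "d"): 5, ("a", "e"): 8,
    ("b", "c"): 6, ("b", "d"): 7, ("b", "e"): 9,
    ("c", "d"): 4, ("c", "e"): 2, ("d", "e"): 3,
}

def d(u, v):
    """Symmetric lookup of the edge weight between u and v."""
    return DIST[(u, v)] if (u, v) in DIST else DIST[(v, u)]

def two_nearest_bound(cities):
    """Lb = ceil( sum over cities of its two smallest incident edges / 2 )."""
    s = 0
    for u in cities:
        incident = sorted(d(u, v) for v in cities if v != u)
        s += incident[0] + incident[1]    # distances to the two nearest cities
    return math.ceil(s / 2)
```

Each tour uses exactly two edges at every city, and those two edges cannot be shorter than the two nearest ones, which is why halving the total gives a valid lower bound.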

47

a Lb=14

a, b Lb=14

a, c X

a, d Lb=16 X

a, e Lb=19 X

a, b, c Lb=16

a, b, d Lb=16

a, b, e Lb=19 X

a, b, c, d (e,a) Lb= 24 First tour

a, b, c ,e (d, a) Lb=19 Better tour

a, b, d, c (e, a) Lb=24 Inferior tour

a, b, d,e (c, a) Lb= 16 Optimal tour
