Chap 5 Tree.ppt

  1. 1. Trees and Graphs. Dr. M. M. Agarwal. All material is integrated from the textbook "Fundamentals of Data Structures in C".
  2. 2. Outline  Introduction (4.1)  Binary Trees (4.2)  Binary Tree Traversals (4.3)  Additional Binary Tree Operations (4.4)  Threaded Binary Trees (4.5)  Heaps (4.6) & (Chapter 9)  Binary Search Trees (4.7)
  3. 3. Outline (2)  Selection Trees (4.8)  Forests (4.9)  Set Representation (4.10)  Counting Binary Trees (4.11)  References & Exercises
  4. 4. 4.1 Introduction  What is a “Tree”?  For Example :  Figure 4.1 (a)  An ancestor binary tree  Figure 4.1 (b)  The ancestry of modern European languages
  5. 5. The Definition of Tree (1)  A tree is a finite set of one or more nodes such that :  (1) There is a specially designated node called the root.  (2) The remaining nodes are partitioned into n ≥ 0 disjoint sets T1, …, Tn, where each of these sets is a tree. We call T1, …, Tn, the sub-trees of the root. root … T1 T2 Tn
  6. 6. The Definition of Tree (2)  The root of this tree is node A. (Fig. 4.2)  Definitions:  Parent (A)  Children (E, F)  Siblings (C, D)  Root (A)  Leaf / Leaves  K, L, F, G, M, I, J…
  7. 7. The Definition of Tree (3)  The degree of a node is the number of sub-trees of the node.  The level of a node:  Initially letting the root be at level one  For all other nodes, the level is the level of the node’s parent plus one.  The height or depth of a tree is the maximum level of any node in the tree.
  8. 8. Representation of Trees (1)  List Representation  The root comes first, followed by a list of sub-trees  Example: (A(B(E(K,L),F),C(G),D(H(M),I, J))) data link 1 link 2 ... link n A node must have a varying number of link fields depending on the number of branches
  9. 9. Representation of Trees (2)  Left Child-Right Sibling Representation  Fig.4.5  A Degree Two Tree  Rotate clockwise by 45°  A Binary Tree data left child right sibling
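A possible C declaration for the left child-right sibling node pictured on this slide; the field names (left_child, right_sibling) are illustrative rather than taken from the textbook:

      typedef struct tree_node {
          char data;
          struct tree_node *left_child;     /* leftmost child of this node */
          struct tree_node *right_sibling;  /* next sibling to the right   */
      } tree_node;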
  10. 10. 4.2 Binary Trees  A binary tree is a finite set of nodes that is either empty or consists of a root and two disjoint binary trees called the left sub-tree and the right sub-tree. root The left sub-tree The right sub-tree  Any tree can be transformed into a binary tree.  By using left child-right sibling representation  The left and right subtrees are distinguished
  11. 11. Abstract Data Type Binary_Tree (structure 4.1) Structure Binary_Tree (abbreviated BinTree) is: Objects: a finite set of nodes either empty or consisting of a root node, left Binary_Tree, and right Binary_Tree. Functions: For all bt, bt1, bt2 ∈ BinTree, item ∈ element Bintree Create()::= creates an empty binary tree Boolean IsEmpty(bt)::= if (bt==empty binary tree) return TRUE else return FALSE
  12. 12. BinTree MakeBT(bt1, item, bt2)::= return a binary tree whose left subtree is bt1, whose right subtree is bt2, and whose root node contains the data item Bintree Lchild(bt)::= if (IsEmpty(bt)) return error else return the left subtree of bt element Data(bt)::= if (IsEmpty(bt)) return error else return the data in the root node of bt Bintree Rchild(bt)::= if (IsEmpty(bt)) return error else return the right subtree of bt
  13. 13. Special Binary Trees  Skewed Binary Trees  Fig.4.9 (a)  Complete Binary Trees  Fig.4.9 (b)  This will be defined shortly
  14. 14. Properties of Binary Trees (1)  Lemma 4.1 [Maximum number of nodes] :  (1) The maximum number of nodes on level i of a binary tree is 2^(i-1), i ≥ 1.  (2) The maximum number of nodes in a binary tree of depth k is 2^k - 1, k ≥ 1.  The proof is by induction on i.  Lemma 4.2 :  For any nonempty binary tree, T, if n0 is the number of leaf nodes and n2 the number of nodes of degree 2, then n0 = n2 + 1.
  15. 15. Properties of Binary Trees (2)  A full binary tree of depth k is a binary tree of depth k having 2^k - 1 nodes, k ≥ 0.  A binary tree with n nodes and depth k is complete iff its nodes correspond to the nodes numbered from 1 to n in the full binary tree of depth k.
  16. 16. Binary Tree Representation  Array Representation (Fig. 4.11)  Linked Representation (Fig. 4.13)
  17. 17. Array Representation  Lemma 4.3 : If a complete binary tree with n nodes (depth = ⌊log2 n⌋ + 1) is represented sequentially, then for any node with index i, 1 ≤ i ≤ n, we have:  (1) parent(i) is at ⌊i / 2⌋, i ≠ 1.  (2) left-child(i) is at 2i, if 2i ≤ n.  (3) right-child(i) is at 2i+1, if 2i+1 ≤ n.  For complete binary trees, this representation is ideal since it wastes no space. However, for a skewed tree, less than half of the array is utilized.
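A minimal sketch of Lemma 4.3 in code, assuming the complete binary tree is stored in a 1-indexed array (slot 0 unused); the helper names are illustrative, not from the slides:

      /* 1-indexed array representation of a complete binary tree with n nodes */
      int parent_index(int i)      { return i / 2; }      /* valid for i != 1        */
      int left_child_index(int i)  { return 2 * i; }      /* valid only if 2i   <= n */
      int right_child_index(int i) { return 2 * i + 1; }  /* valid only if 2i+1 <= n */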
  18. 18. Linked Representation
      typedef struct node *tree_pointer;
      struct node {
          int data;
          tree_pointer left_child, right_child;
      };
  19. 19. 4.3 Binary Tree Traversals  Traversing order : L, V, R  L : moving left  V : visiting the node  R : moving right  Inorder Traversal : LVR  Preorder Traversal : VLR  Postorder Traversal : LRV
  20. 20. For Example  Inorder Traversal : A / B * C * D + E  Preorder Traversal : + * * / A B C D E  Postorder Traversal : A B / C * D * E +
  21. 21. Inorder Traversal (1)  A recursive function starting from the root  Move left Visit node Move right
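The recursive routine itself is not reproduced on the slide; a minimal sketch over the tree_pointer nodes of the linked representation, with printing as the visit step (printf assumes <stdio.h>), might look like this:

      void inorder(tree_pointer ptr)
      {
          if (ptr) {
              inorder(ptr->left_child);    /* L: move left      */
              printf("%d ", ptr->data);    /* V: visit the node */
              inorder(ptr->right_child);   /* R: move right     */
          }
      }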
  22. 22. Inorder Traversal (2) In-order Traversal : A / B * C * D + E
  23. 23. Preorder Traversal  A recursive function starting from the root  Visit node Move left  Move right
  24. 24. Postorder Traversal  A recursive function starting from the root  Move left Move right Visit node
  25. 25. Other Traversals  Iterative Inorder Traversal  Using a stack to simulate recursion  Time Complexity: O(n), where n is the number of nodes.  Level Order Traversal  Visiting each new level from the left-most node to the right-most  Using Data Structure : Queue
  26. 26. Iterative In-order Traversal (1)
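The slide's program is not reproduced here; a sketch of the stack-based traversal, assuming a fixed-size array stack of node pointers and omitting overflow checks:

      #define MAX_STACK_SIZE 100

      void iter_inorder(tree_pointer node)
      {
          tree_pointer stack[MAX_STACK_SIZE];
          int top = -1;                            /* stack is empty          */
          for (;;) {
              for (; node; node = node->left_child)
                  stack[++top] = node;             /* push while moving left  */
              if (top == -1) break;                /* nothing left to visit   */
              node = stack[top--];                 /* pop                     */
              printf("%d ", node->data);           /* visit                   */
              node = node->right_child;            /* then go right           */
          }
      }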
  27. 27. Iterative In-order Traversal (2) Add “+” in stack Add “*” Add “*” Add “/” Add “A” Delete “A” & Print Delete “/” & Print Add “B” Delete “B” & Print Delete “*” & Print Add “C” In-order Traversal : A / B * C * D + E Delete “C” & Print Delete “*” & Print Add “D” Delete “D” & Print Delete “+” & Print Add “E” Delete “E” & Print
  28. 28. Level Order Traversal (1)
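Similarly, a sketch of level-order traversal using a simple circular-array queue of node pointers (the addq/deleteq steps are inlined; overflow checks omitted):

      #define MAX_QUEUE_SIZE 100

      void level_order(tree_pointer ptr)
      {
          tree_pointer queue[MAX_QUEUE_SIZE];
          int front = 0, rear = 0;                                 /* circular queue */
          if (!ptr) return;
          queue[rear] = ptr; rear = (rear + 1) % MAX_QUEUE_SIZE;   /* addq the root  */
          while (front != rear) {
              ptr = queue[front]; front = (front + 1) % MAX_QUEUE_SIZE;   /* deleteq */
              printf("%d ", ptr->data);                                   /* visit   */
              if (ptr->left_child) {
                  queue[rear] = ptr->left_child;  rear = (rear + 1) % MAX_QUEUE_SIZE;
              }
              if (ptr->right_child) {
                  queue[rear] = ptr->right_child; rear = (rear + 1) % MAX_QUEUE_SIZE;
              }
          }
      }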
  29. 29. Level Order Traversal (2) Add “+” in Queue Deleteq “+” Addq “*” Addq “E” Deleteq “*” Addq “*” Addq “D” Deleteq “E” Deleteq “*” Level-order Traversal : + * E * D / C A B Addq “/” Addq “C” Deleteq “D” Deleteq “/” Addq “A” Addq “B” Deleteq “C” Deleteq “A” Deleteq “B”
  30. 30. 4.4 Additional Binary Tree Operations  Copying Binary Trees  Program 4.6  Testing for Equality of Binary Trees  Program 4.7  The Satisfiability Problem (SAT)
  31. 31. Copying Binary Trees  Modified from postorder traversal program
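Program 4.6 itself is not shown here; a sketch of a copy routine that follows the postorder pattern the slide mentions (malloc needs <stdlib.h>; error handling for a failed allocation is omitted):

      tree_pointer copy(tree_pointer original)
      {
          tree_pointer temp = NULL;
          if (original) {
              temp = (tree_pointer) malloc(sizeof(*temp));
              temp->left_child  = copy(original->left_child);   /* L */
              temp->right_child = copy(original->right_child);  /* R */
              temp->data        = original->data;               /* V */
          }
          return temp;
      }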
  32. 32. Testing for Equality of Binary Trees  Equality: 2 binary trees having identical topology and data are said to be equivalent.
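Program 4.7 is likewise not reproduced; a sketch of the equality test (two trees are equivalent when both are empty, or when both are non-empty with equal root data and equivalent subtrees):

      int equal(tree_pointer first, tree_pointer second)
      {
          return (!first && !second) ||
                 (first && second &&
                  first->data == second->data &&
                  equal(first->left_child,  second->left_child) &&
                  equal(first->right_child, second->right_child));
      }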
  33. 33. SAT Problem (1)  Formulas  Variables : X1, X2, …, Xn  Two possible values: True or False  Operators : And (∧), Or (∨), Not (¬)  A variable is an expression.  If x and y are expressions, then ¬x, x ∧ y, x ∨ y are expressions.  Parentheses can be used to alter the normal order of evaluation, which is ¬ before ∧ before ∨.
  34. 34. SAT Problem (2)
  35. 35. SAT Problem (3)  The SAT problem  Is there an assignment of values to the variables that causes the value of the expression to be true?  For n variables, there are 2^n possible combinations of true and false.  The algorithm takes O(g·2^n) time  g is the time required to substitute the true and false values for variables and to evaluate the expression.
  36. 36. SAT Problem (4)  Node Data Structure for SAT in C
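The node declaration itself is not reproduced on this slide; a sketch of the kind of node the evaluation routine two slides below assumes (an operator or constant in the data field, plus a value field filled in bottom-up). The exact names and layout are an assumption, and this redefines the node type used for plain binary trees earlier:

      /* true/false mirror the switch labels of post_order_eval below; they
         clash with <stdbool.h> and with C23 keywords, so this is pre-C23 style. */
      typedef enum { not, and, or, true, false } logical;

      typedef struct node *tree_pointer;
      struct node {
          tree_pointer left_child;
          logical      data;    /* operator, or the constant true/false at a leaf */
          short int    value;   /* truth value computed during evaluation         */
          tree_pointer right_child;
      };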
  37. 37. SAT Problem (5)  An Enumeration Algorithm  Time Complexity : O(2^n)
  38. 38. SAT Problem (6)
      void post_order_eval(tree_pointer node) {
          if (node) {
              post_order_eval(node->left_child);
              post_order_eval(node->right_child);
              switch (node->data) {
                  case not: node->value = !node->right_child->value; break;
                  case and: node->value = node->right_child->value && node->left_child->value; break;
                  case or:  node->value = node->right_child->value || node->left_child->value; break;
                  case true:  node->value = TRUE;  break;
                  case false: node->value = FALSE; break;
              }
          }
      }
  39. 39. 4.5 Threaded Binary Trees (1)  Linked Representation of Binary Tree  more null links than actual pointers (waste!)  Threaded Binary Tree  Make use of these null links  Threads  Replace the null links by pointers (called threads)  If ptr -> left_thread = TRUE  Then ptr -> left_child is a thread (to the node before ptr)  Else ptr -> left_child is a pointer to left child  If ptr -> right_thread = TRUE  Then ptr -> right_child is a thread (to the node after ptr)  Else ptr -> right_child is a pointer to right child
  40. 40. 4.5 Threaded Binary Trees (2)
      typedef struct threaded_tree *threaded_pointer;
      struct threaded_tree {
          short int left_thread;
          threaded_pointer left_child;
          char data;
          short int right_thread;
          threaded_pointer right_child;
      };
  41. 41. 4.5 Threaded Binary Trees (3) Head node of the tree Actual tree
  42. 42. Inorder Traversal of a Threaded Binary Tree (1)  Threads simplify the inorder traversal algorithm  An easy O(n) algorithm (Program 4.11.)  For any node, ptr, in a threaded binary tree  If ptr -> right_thread = TRUE  The inorder successor of ptr = ptr -> right_child  Else (Otherwise, ptr -> right_thread = FALSE)  Follow a path of left_child links from the right_child of ptr until finding a node with left_thread = TRUE  Function insucc (Program 4.10.)  Finds the inorder successor of any node (without using a stack)
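Program 4.10 is not reproduced on the slide; a sketch of insucc that follows the rule just stated (if the right link is a thread it already points at the successor; otherwise go right once and then follow left links until a left thread is found):

      threaded_pointer insucc(threaded_pointer tree)
      {
          threaded_pointer temp = tree->right_child;
          if (!tree->right_thread)              /* real right child          */
              while (!temp->left_thread)        /* slide down its left spine */
                  temp = temp->left_child;
          return temp;                          /* thread case: right_child already points at the successor */
      }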
  43. 43. Inorder Traversal of a Threaded Binary Tree (2)
  44. 44. Inorder Traversal of a Threaded Binary Tree (2)
  45. 45. Inserting a Node into a Threaded Binary Tree  Insert a new node as a child of a parent node  Insert as a left child (left as an exercise)  Insert as a right child (see examples 1 and 2)  Is the original child node an empty subtree?  Empty child node (parent -> child_thread = TRUE)  See example 1  Non-empty child node (parent -> child_thread = FALSE)  See example 2
  46. 46. Inserting a node as the right child of the parent node (empty case)  parent(B) -> right_thread = FALSE  child(D) -> left_thread & right_thread = TRUE  child -> left_child = parent  child -> right_child = parent -> right_child  parent -> right_child = child (1) (2) (3)
  47. 47. Inserting a node as the right child of the parent node (non-empty case) (1) (2) (3) (4)
  48. 48. Right insertion in a threaded binary tree
      void insert_right(threaded_pointer parent, threaded_pointer child) {
          threaded_pointer temp;
          child->right_child  = parent->right_child;
          child->right_thread = parent->right_thread;
          child->left_child   = parent;
          child->left_thread  = TRUE;
          parent->right_child  = child;
          parent->right_thread = FALSE;
          if (!child->right_thread) {  /* non-empty child */
              temp = insucc(child);
              temp->left_child = child;
          }
      }
  49. 49. 4.6 Heaps  An application of complete binary tree  Definition  A max (or min) tree  a tree in which the key value in each node is no smaller (or greater) than the key values in its children (if any).  A max (or min) heap  a max (or min) complete binary tree A max heap
  50. 50. Heap Operations  Creation of an empty heap  PS. To build a Heap  O( n log n )  Insertion of a new element into the heap  O (log2n)  Deletion of the largest element from the (max) heap  O (log2n)  Application of Heap  Priority Queues
  51. 51. Insertion into a Max Heap (1) (Figure 4.28)
  52. 52. Insertion into a Max Heap (2)  the height of an n-node heap = ⌈log2(n+1)⌉  Time complexity = O(height) = O(log2 n)
      void insert_max_heap(element item, int *n)
      {
          int i;
          if (HEAP_FULL(*n)) {
              fprintf(stderr, "The heap is full.\n");
              exit(1);
          }
          i = ++(*n);
          while ((i != 1) && (item.key > heap[i/2].key)) {
              heap[i] = heap[i/2];
              i /= 2;
          }
          heap[i] = item;
      }
  53. 53. Deletion from a Max Heap  Delete the max (root) from a max heap  Step 1 : Remove the root  Step 2 : Move the last element to the root  Step 3 : Heapify (re-establish the heap)
  54. 54. Delete_max_heap (1)
      element delete_max_heap(int *n)
      {
          int parent, child;
          element item, temp;
          if (HEAP_EMPTY(*n)) {
              fprintf(stderr, "The heap is empty\n");
              exit(1);
          }
          /* save value of the element with the highest key */
          item = heap[1];
          /* use last element in heap to adjust heap */
          temp = heap[(*n)--];
  55. 55. Delete_max_heap (2)
          parent = 1;
          child = 2;
          while (child <= *n) {
              /* find the larger child of the current parent */
              if ((child < *n) && (heap[child].key < heap[child+1].key))
                  child++;
              if (temp.key >= heap[child].key) break;
              /* move to the next lower level */
              heap[parent] = heap[child];
              parent = child;
              child *= 2;
          }
          heap[parent] = temp;
          return item;
      }
  56. 56. 4.7 Binary Search Trees  Heap : search / delete arbitrary element  O(n) time  Binary Search Trees (BST)  Searching  O(h), h is the height of BST  Insertion  O(h)  Deletion  O(h)  Can be done quickly by both key value and rank
  57. 57. Definition  A binary search tree is a binary tree that is either empty or satisfies the following properties :  (1) every element has a unique key.  (2&3) The keys in a nonempty left (right) sub-tree must be smaller (larger) than the key in the root of the sub-tree.  (4) The left and right sub-trees are also binary search trees.
  58. 58. Searching a BST (1)
  59. 59. Searching a BST (2)  Time Complexity  search  O(h), h is the height of BST.  search2  O(h)
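Neither routine is reproduced on the slides; sketches of a recursive search and an iterative search2 over the tree_pointer nodes (keys compared against the data field, NULL returned when the key is absent) might look like this:

      tree_pointer search(tree_pointer root, int key)
      {
          if (!root) return NULL;
          if (key == root->data) return root;                    /* found             */
          if (key <  root->data) return search(root->left_child, key);
          return search(root->right_child, key);                 /* key > root->data  */
      }

      tree_pointer search2(tree_pointer tree, int key)
      {
          while (tree) {
              if (key == tree->data) return tree;
              tree = (key < tree->data) ? tree->left_child : tree->right_child;
          }
          return NULL;
      }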
  60. 60. Inserting into a BST (1)  Step 1 : Check if the inserting key is different from those of existing elements  Run search function  O(h)  Step 2 : Run insert_node function  Program 4.17  O(h)
  61. 61. Inserting into a BST (2)
      void insert_node(tree_pointer *node, int num)
      {
          tree_pointer ptr, temp = modified_search(*node, num);
          if (temp || !(*node)) {
              ptr = (tree_pointer) malloc(sizeof(*ptr));
              if (IS_FULL(ptr)) {
                  fprintf(stderr, "The memory is full\n");
                  exit(1);
              }
              ptr->data = num;
              ptr->left_child = ptr->right_child = NULL;
              if (*node) {
                  if (num < temp->data) temp->left_child = ptr;
                  else temp->right_child = ptr;
              } else {
                  *node = ptr;
              }
          }
      }
  62. 62. Deletion from a BST  To delete a non-leaf node with two children  Replace it with the largest element in its left sub-tree  Or replace it with the smallest element in its right sub-tree  Apply the replacement recursively until a leaf is removed  O(h)
  63. 63. Height of a BST  The Height of the binary search tree is O(log2n), on the average.  Worst case (skewed)  O(h) = O(n)  Balanced Search Trees  With a worst case height of O(log2n)  AVL Trees, 2-3 Trees, Red-Black Trees  Chapter 10
  64. 64. 4.8 Selection Trees  Application Problem  Merge k ordered sequences into a single ordered sequence  Definition: A run is an ordered sequence  Build a k-run Selection tree
  65. 65. Time Complexity  Selection Tree’s Level  ⌈log2 k⌉ + 1  Each time to restructure the tree  O(log2 k)  Total time to merge n records  O(n log2 k)
  66. 66. For Example
  67. 67. Tree of losers  The previous selection tree is called a winner tree  Each node records the winner of the two children  Loser Tree  Leaf nodes represent the first record in each run  Each non-leaf node retains a pointer to the loser  Overall winner is stored in the additional node, node 0  Each newly inserted record is now compared with its parent (not its sibling)  loser stays, winner goes up without storing.  Slightly faster than winner trees
  68. 68. Loser tree example  Figure 4.36: Tree of losers corresponding to Figure 4.34 (p.235)
  69. 69. 4.9 Forests  A forest is a set of n ≥ 0 disjoint trees.  T1, …, Tn is a forest of trees  Transforming a forest into a Binary Tree B(T1, …, Tn)  (1) if n = 0, then B is empty  (2) otherwise, B has root equal to root(T1);  its left sub-tree is B(T11, T12, …, T1m), where T11, T12, …, T1m are the sub-trees of root(T1);  its right sub-tree is B(T2, …, Tn)
  70. 70. Transforming a forest into a Binary Tree Root(T1) T11,T12, T13 B(T2, T3)
  71. 71. Forest Traversals Pre-order : In-order : Post-order :
  72. 72. 4.10 Set Representation  Elements : 0, 1, …, n -1.  Sets : S1, S2, …, Sm  pairwise disjoint  If Si and Sj are two sets and i ≠ j, then there is no element that is in both Si and Sj.  Operations  Disjoint Set Union  Ex: S1 ∪ S2  Find (i )
  73. 73. Union Operation  Disjoint Set Union  S1 ∪ S2 = {0, 6, 7, 8, 1, 4, 9}
  74. 74. Implementation of the Data Structure
  75. 75. Union & Find Operation  Union(i, j)  parent[i] = j  j becomes the new root (the parent of i)  Find(i)  While (parent[i] ≥ 0)  i = parent[i]  follow parent links to the root of the set  Return i;  return the root of the set
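A minimal sketch of the parent-array implementation described above, assuming elements 0 … n-1 and a global array; roots are marked with a negative entry, which is also what the weighted union2 on a later slide relies on:

      #define MAX_ELEMENTS 200
      int parent[MAX_ELEMENTS];          /* parent[i] < 0 marks i as a root */

      void initialize(int n)
      {
          for (int i = 0; i < n; i++)
              parent[i] = -1;            /* every element starts in its own set */
      }

      int find1(int i)                   /* simple find: climb to the root */
      {
          while (parent[i] >= 0)
              i = parent[i];
          return i;
      }

      void union1(int i, int j)          /* i and j must be roots; j becomes the root of the union */
      {
          parent[i] = j;
      }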
  76. 76. Performance  Run a sequence of union-find operations  Total of n-1 unions  O(n) time for all unions  Time for the finds in the worst case  Σ_{i=2}^{n} i = O(n^2)
  77. 77. Weighting rule for union(i, j)  If # of nodes in i < # of nodes in j  Then j becomes the parent of i  Else i becomes the parent of j
  78. 78. New Union Function  Prevent the tree from growing too high  To avoid the creation of degenerate trees  No node in T has level greater than ⌊log2 n⌋ + 1
      void union2(int i, int j) {
          /* parent[] holds -(set size) at a root */
          int temp = parent[i] + parent[j];
          if (parent[i] > parent[j]) {   /* i has fewer nodes: j becomes the root */
              parent[i] = j;
              parent[j] = temp;
          } else {                       /* otherwise i becomes the root          */
              parent[j] = i;
              parent[i] = temp;
          }
      }
  79. 79. Figure 4.45 Trees achieving worst case bound (p.245)
  80. 80. Collapsing Rule (for new find function)  Definition: If j is a node on the path from i to its root then make j a child of the root  The new find function (see next slide):  Roughly doubles the time for an individual find  Reduces the worse case time over a sequence of finds.
  81. 81. New Find Function  Collapse all nodes from i to the root  To lower the height of the tree
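The function itself is not reproduced; a sketch of a collapsing find consistent with the parent-array representation above (a first pass finds the root, a second pass makes every node on the path a direct child of the root):

      int find2(int i)
      {
          int root, trail, lead;
          for (root = i; parent[root] >= 0; root = parent[root])
              ;                               /* first pass: locate the root    */
          for (trail = i; trail != root; trail = lead) {
              lead = parent[trail];
              parent[trail] = root;           /* second pass: collapse the path */
          }
          return root;
      }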
  82. 82. Performance of New Algorithm  Let T(m, n) be the maximum time required to process an intermixed sequence of m finds (m≧n) and n -1 unions, we have :  k1mα(m, n) ≦ T(m, n) ≦ k2mα(m, n)  k1, k2 : some positive constants  α(m, n) is a very slowly growing function and is a functional inverse of Ackermann’s function A(p, q).  Function A(p, q) is a very rapidly growing function.
  83. 83. Equivalence Classes  Using the union-find algorithms to process the equivalence pairs of Section 4.6 (p.167)  Time: at most O(mα(2m, n))  Using less space
  84. 84. 4.11 Counting Binary Trees  Three disparate problems :  Having the same solution  Determine the number of distinct binary trees having n nodes (problem 1)  Determine the number of distinct permutations of the numbers from 1 to n obtainable by a stack (problem 2)  Determine the number of distinct ways of multiplying n + 1 matrices (problem 3)
  85. 85. Distinct binary trees  N = 1  only one binary tree  N = 2  2 distinct binary trees  N = 3  5 distinct binary trees  N = …
  86. 86. Stack Permutations (1)  Given a binary tree’s traversals  Pre-order : A B C D E F G H I  In-order : B C A E D G H F I  Is this binary tree unique?  Constructing this binary tree
  87. 87. Stack Permutations (2)  For a given preorder permutation 1, 2, 3, what are the possible inorder permutations?  Possible inorder permutation by a stack   (1, 2, 3), (1, 3, 2), (2, 1, 3), (2, 3, 1), (3, 2, 1)  (3, 1, 2) is impossible  Each inorder permutation represents a distinct binary tree
  88. 88. Matrix Multiplication (1)  The product of n matrices  M1 * M2 * … * Mn  Matrix multiplication is associative  Can be performed in any order  N = 3, 2 ways to perform  (M1 * M2) * M3  M1 * (M2 * M3)  N = 4, 5 possibilities
  89. 89. Matrix Multiplication (2)  Let bn be the number of different ways to compute the product of n matrices.  We have : b1 = 1 and bn = Σ_{i=1}^{n-1} bi · b(n-i) for n > 1 (the outermost multiplication splits the product into the first i matrices and the remaining n - i matrices).
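A small sketch that evaluates this recurrence directly; the table size and the use of long long are arbitrary choices, and since the values grow quickly this is only meant for small n:

      #include <stdio.h>

      int main(void)
      {
          long long b[16];
          b[1] = 1;                              /* one matrix: nothing to choose */
          for (int n = 2; n <= 15; n++) {
              b[n] = 0;
              for (int i = 1; i < n; i++)
                  b[n] += b[i] * b[n - i];       /* split after the i-th matrix   */
          }
          for (int n = 1; n <= 15; n++)
              printf("b[%d] = %lld\n", n, b[n]); /* 1, 1, 2, 5, 14, 42, ...       */
          return 0;
      }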
  90. 90. Number of distinct binary trees  Let bn be the number of distinct binary trees with n nodes, so bn = Σ_{i=0}^{n-1} bi · b(n-1-i) with b0 = 1  Solving the recurrence with the generating function B(x) = Σ bn x^n, which satisfies x·B(x)^2 = B(x) - 1  Solution : B(x) = (1 - √(1 - 4x)) / (2x)  Simplification : bn = (1/(n+1)) · C(2n, n)  Approximation : bn ≈ 4^n / (n^(3/2)·√π) = O(4^n / n^(3/2))
  91. 91. Heapsort—An optimal sorting algorithm  A heap : parent ≥ son (the key of every parent is no smaller than the keys of its sons)
  92. 92. Heapsort has two phases: construction of the heap, then repeatedly output the maximum and restore the heap.
  93. 93. Phase 1: construction  input data: 4, 37, 26, 15, 48  restore the subtree rooted at A(2):  restore the tree rooted at A(1):
  94. 94. Phase 2: output
  95. 95. Implementation  using a linear array not a binary tree.  The sons of A(h) are A(2h) and A(2h+1).  time complexity: O(n log n)
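A compact sketch of the whole algorithm on a 1-indexed int array a[1..n]; the restore routine is a sift-down that re-establishes the parent ≥ son rule. This is an illustration written for this summary, not the slides' own code, and it uses plain ints rather than the element/heap[] structures of the earlier slides:

      /* Restore the max-heap property for the subtree rooted at i, within a[1..n]. */
      void restore(int a[], int i, int n)
      {
          int temp = a[i];
          int child = 2 * i;
          while (child <= n) {
              if (child < n && a[child + 1] > a[child])
                  child++;                          /* pick the larger son         */
              if (temp >= a[child]) break;          /* heap property already holds */
              a[child / 2] = a[child];              /* promote the son one level   */
              child *= 2;
          }
          a[child / 2] = temp;
      }

      void heapsort(int a[], int n)
      {
          for (int i = n / 2; i >= 1; i--)          /* phase 1: construction       */
              restore(a, i, n);
          for (int i = n; i >= 2; i--) {            /* phase 2: output the maximum */
              int t = a[1]; a[1] = a[i]; a[i] = t;
              restore(a, 1, i - 1);                 /* restore the remaining heap  */
          }
      }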
  96. 96. Time complexity Phase 1: construction  d = ⌊log n⌋ : depth of the heap  The number of comparisons is at most: Σ_{L=0}^{d-1} 2(d-L)·2^L = 2d·Σ_{L=0}^{d-1} 2^L - 4·Σ_{L=0}^{d-1} L·2^(L-1)  (using Σ_{k=0}^{L} k·2^(k-1) = 2^L(L-1) + 1)  = 2d(2^d - 1) - 4(2^(d-1)(d - 2) + 1)  = cn - 2⌊log n⌋ - 4, where 2 ≤ c ≤ 4
  97. 97. Time complexity Phase 2: output  Restoring a heap of i nodes takes at most 2⌊log i⌋ comparisons, so the total is: Σ_{i=1}^{n-1} 2⌊log i⌋ = 2n⌊log n⌋ - cn + 4, where cn = 2^(⌊log n⌋+2) and 2 ≤ c ≤ 4  = O(n log n)
  98. 98. Given the pairwise distances among 4 cities (1, 2, 3, 4), with edge weights 12, 1, 8, 10, 3, 2
  99. 99. The minimum spanning tree problem  Find the most economical way to connect the four cities (same graph: cities 1, 2, 3, 4 with edge weights 12, 1, 8, 10, 3, 2)
  100. 100. Traveling Salesman Problem (TSP)  Find the shortest tour that starts from city (1) and returns to city (1) (same graph as the previous slides)
  101. 101. TSP is a well-known hard problem (NP-Complete)  Meaning: we currently cannot find an efficient algorithm that works for all inputs  So avoid wasting time searching for a better (efficient exact) algorithm  Ref: Horowitz & Sahni, Fundamentals of Computer Algorithms, p. 528.
  102. 102.  2^n grows frighteningly fast:
      n:      10           30           50
      n       0.00001 s    0.00003 s    0.00005 s
      n^2     0.0001 s     0.0009 s     0.0025 s
      2^n     0.001 s      17.9 min     34.7 years
      Like the satisfiability problem, only exponential algorithms are known so far; no one has found a polynomial algorithm (you might as well give up looking!)  Problems of this kind are NP-Complete  Garey & Johnson, “Computers & Intractability”
  103. 103. Enumeration (think: which problems cannot be solved by enumeration?)  Traveling salesman problem: 3! tours here, (n-1)! in general  Minimum spanning tree problem: 16 trees here, n^(n-2) in general (Cayley’s Theorem)  Ref: Even, Graph Algorithms, pp. 26-28
  104. 104.  Labeled tree ↔ Number sequence  One-to-One Mapping  A labeled tree with N nodes can be represented by a number sequence of length N-2.  Encoding: Data Compression.
  105. 105. Labeled tree → Number sequence  In each iteration, remove the leaf with the smallest label among all current leaves, together with its edge, and record the node it was attached to (the cut point); repeat until only one edge remains.  Example: prune-sequence 7, 4, 4, 7, 5 (the cut points), for an example tree on nodes 1 to 7.  The node with the largest label is always on the last remaining edge.  Each node’s original degree = (the number of times the node appears in the prune-sequence) + 1.
  106. 106. Number sequence → Labeled tree  Prune-sequence: 7, 4, 4, 7, 5
      k            1  2  3  4  5  6  7
      deg(k)       1  1  1  3  2  1  3
      Iteration 1  0  1  1  3  2  1  2
      Iteration 2  0  0  1  2  2  1  2
      Iteration 3  0  0  0  1  2  1  2
      Iteration 4  0  0  0  0  2  1  1
      Iteration 5  0  0  0  0  1  0  1
      Iteration 6  0  0  0  0  0  0  0
      In each iteration, pick the node with degree 1 and the smallest label, connect it to the corresponding node of the prune-sequence, and then decrease the degree of both nodes by 1. (The figure shows the partial tree built after each iteration.)
  107. 107. Minimal spanning tree  Kruskal’s Algorithm  (example graph: vertices A, B, C, D, E with edge weights 70, 65, 300, 90, 50, 80, 75, 200)
      Begin
        T <- empty
        While T contains fewer than n-1 edges:
          choose an edge (v, w) from E of smallest weight
            【using a priority queue (heap): O(log n)】
          delete (v, w) from E;
          if adding (v, w) to T does not create a cycle in T
            【using union, find: O(log m)】
          then add (v, w) to T
          else discard (v, w)
      End
      O(m log m), m = # of edges
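A sketch of the algorithm in C, written for this summary rather than taken from the slides: it sorts the edge list with qsort (from <stdlib.h>) instead of using a heap, and it reuses the parent[] array, find2, and union2 from the set-representation slides to detect cycles:

      typedef struct { int v, w, cost; } edge;

      int cmp_edge(const void *a, const void *b)
      {
          return ((const edge *)a)->cost - ((const edge *)b)->cost;
      }

      /* Fills t[] with the chosen edges and returns how many were chosen.
         Vertices are 0 .. n-1; e[] holds the m edges of the graph. */
      int kruskal(edge e[], int m, int n, edge t[])
      {
          int count = 0;
          qsort(e, m, sizeof(edge), cmp_edge);         /* cheapest edges first       */
          for (int i = 0; i < n; i++) parent[i] = -1;  /* each vertex is its own set */
          for (int i = 0; i < m && count < n - 1; i++) {
              int a = find2(e[i].v), b = find2(e[i].w);
              if (a != b) {                            /* no cycle: accept the edge  */
                  t[count++] = e[i];
                  union2(a, b);
              }                                        /* otherwise discard it       */
          }
          return count;
      }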
  108. 108. Priority queue (heap) : each operation O(log n), initial construction O(n)  Tarjan: Union & Find run in almost linear (amortized) time  Correctness  An edge accepted into the tree keeps the tree minimal  An edge that would create a cycle can be discarded: deleting the most expensive edge of a cycle never destroys a minimal-cost tree
  109. 109. During the algorithm, T is a spanning forest; adding an edge links two of its trees  Example links: 1. edge(2,3)  2. edge(1,4)  with vertex sets S1 = {1, 2, 3} and S2 = {4, 5}  Each edge of the edge set is tested against these sets with Find, and the sets are merged with Union, O(log n) each
