ADS Exam 2. Project One. Algorithm: what does O(log n) mean exactly? Greedy algorithms. A greedy algorithm can determine the minimum number of coins to give while making change. These are the steps a human would take to emulate a greedy algorithm to represent 36 cents using only coins with values {1, 5, 10, 20}: at each step, the coin of the highest value that does not exceed the remaining change owed is the local optimum. (Note that in general the change-making problem requires dynamic programming or integer programming to find an optimal solution; however, most currency systems, including the Euro and the US Dollar, are special cases where the greedy strategy does find an optimal solution.)
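
A minimal Python sketch of that greedy change-making strategy, using the coin values and the 36-cent target from the example (the function name is my own):

```python
def greedy_change(amount, coins=(20, 10, 5, 1)):
    """Repeatedly take the largest coin that does not exceed what is still owed."""
    result = []
    for coin in sorted(coins, reverse=True):
        while amount >= coin:
            amount -= coin
            result.append(coin)
    return result

print(greedy_change(36))  # [20, 10, 5, 1] -- four coins for 36 cents
```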

For example, a greedy strategy for the traveling salesman problem (which has high computational complexity) is the following heuristic: "At each stage, visit the nearest unvisited city." This heuristic does not necessarily find an optimal solution, but it terminates in a reasonable number of steps; finding an optimal solution typically requires unreasonably many steps.
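
A compact sketch of that nearest-neighbour heuristic in Python (the distance matrix and names below are illustrative, not from the source):

```python
def nearest_neighbour_tour(dist, start=0):
    """Greedy TSP heuristic: always move to the closest unvisited city.

    dist[i][j] is the distance from city i to city j.
    Returns the visiting order; the tour is not guaranteed to be optimal.
    """
    n = len(dist)
    unvisited = set(range(n)) - {start}
    tour = [start]
    while unvisited:
        current = tour[-1]
        nxt = min(unvisited, key=lambda city: dist[current][city])
        unvisited.remove(nxt)
        tour.append(nxt)
    return tour

# Example: four cities with symmetric pairwise distances
dist = [
    [0, 2, 9, 10],
    [2, 0, 6, 4],
    [9, 6, 0, 3],
    [10, 4, 3, 0],
]
print(nearest_neighbour_tour(dist))  # [0, 1, 3, 2]
```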

Lecture 16: Greedy Algorithms. This is the eleventh post in an article series about MIT's lecture course "Introduction to Algorithms." In this post I review lecture sixteen, which introduces the concept of greedy algorithms, reviews graphs, and applies the greedy Prim's algorithm to the Minimum Spanning Tree (MST) problem. The previous lecture introduced dynamic programming, which was used to find solutions to optimization problems. In such problems there can be many possible solutions; each solution has a value, and we wish to find a solution with the optimal (minimum or maximum) value. Lecture sixteen starts with a review of graphs and continues with graph representations. Here is the adjacency matrix for the example digraph G = ({1, 2, 3, 4}, {(1, 2), (1, 3), (2, 3), (4, 3)}):

    A = | 0 1 1 0 |
        | 0 0 1 0 |
        | 0 0 0 0 |
        | 0 0 1 0 |

The element a_ij of matrix A is 1 if there is an edge from i to j, and 0 if there is no edge from i to j (i is the row, j is the column).
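
To make the representation concrete, here is a small Python sketch that builds that adjacency matrix from the edge list of the example digraph (the helper name is my own):

```python
def adjacency_matrix(num_vertices, edges):
    """Build a 0/1 adjacency matrix for a digraph with vertices 1..num_vertices."""
    A = [[0] * num_vertices for _ in range(num_vertices)]
    for i, j in edges:
        A[i - 1][j - 1] = 1  # row i, column j (vertices are 1-based)
    return A

G_edges = [(1, 2), (1, 3), (2, 3), (4, 3)]
for row in adjacency_matrix(4, G_edges):
    print(row)
# [0, 1, 1, 0]
# [0, 0, 1, 0]
# [0, 0, 0, 0]
# [0, 0, 1, 0]
```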

Storage required by the adjacency-matrix representation is O(V^2). An adjacency-list representation, which stores for each vertex the list of its neighbours, requires only O(V+E) storage.
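
Since the lecture applies Prim's algorithm to the MST problem, here is a short sketch of Prim's algorithm on a weighted undirected graph given as an adjacency list (the example graph and names are illustrative):

```python
import heapq

def prim_mst(adj, start=0):
    """Prim's algorithm: greedily grow the tree by the lightest edge
    connecting a tree vertex to a vertex not yet in the tree.

    adj[u] is a list of (v, weight) pairs; the graph is undirected.
    Returns the MST edges and their total weight.
    """
    visited = {start}
    heap = [(w, start, v) for v, w in adj[start]]
    heapq.heapify(heap)
    mst_edges, total = [], 0
    while heap and len(visited) < len(adj):
        w, u, v = heapq.heappop(heap)
        if v in visited:
            continue
        visited.add(v)
        mst_edges.append((u, v, w))
        total += w
        for x, wx in adj[v]:
            if x not in visited:
                heapq.heappush(heap, (wx, v, x))
    return mst_edges, total

# Small example: 4 vertices, weighted undirected edges
adj = {
    0: [(1, 1), (2, 4)],
    1: [(0, 1), (2, 2), (3, 6)],
    2: [(0, 4), (1, 2), (3, 3)],
    3: [(1, 6), (2, 3)],
}
print(prim_mst(adj))  # ([(0, 1, 1), (1, 2, 2), (2, 3, 3)], 6)
```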

Arrays, Linked Lists, Huffman Trees, Leftist Trees

Sorting Algorithms. CmSc 250 Big-Oh. Introduction. The complexity of an algorithm is measured by the number of operations needed to solve the corresponding problem. We are concerned with estimating the complexity of algorithms where the number of operations depends on the size of the input. Examples of such algorithms are: reading a file, where the number of read operations depends on the number of records in the file; finding a name in a list of names, where the number of operations depends on the number of names in the list; and finding the greatest element in an array, where the number of operations depends on the length of the array. If N is the number of elements to be processed by an algorithm (N is the size of the input), then the number of operations can be represented as a function of N: f(N) (sometimes we use lower-case n). We can compare the complexity of two algorithms by comparing the corresponding functions.
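
As a small illustration of counting operations as a function of the input size N (the counting instrumentation is my own, not part of the course notes):

```python
def find_greatest(arr):
    """Find the greatest element, counting comparisons as the operations."""
    comparisons = 0
    greatest = arr[0]
    for x in arr[1:]:
        comparisons += 1
        if x > greatest:
            greatest = x
    return greatest, comparisons

values = [7, 3, 9, 1, 12, 5]
print(find_greatest(values))  # (12, 5): f(N) = N - 1 comparisons for N elements
```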

Classifying functions by their asymptotic growth: each function has its own rate of growth. Recitation 21: Amortized Analysis Examples. (See: Introduction to Algorithms, Cormen, Leiserson, Rivest, and Stein, 2nd ed., Ch. 17.) Amortized analysis refers to determining the time-averaged running time for a sequence of operations.

It is different from what is commonly referred to as average case analysis, because amortized analysis does not make any assumption about the distribution of the data values, whereas average case analysis assumes the data are not "bad" (e.g., some sorting algorithms do well on "average" over all input orderings but very badly on certain input orderings).

That is, amortized analysis is a worst-case analysis, but for a sequence of operations rather than for individual operations. It uses the fact that we are analyzing a sequence to "spread out" the costs (think of insurance, where everyone pays a relatively modest amount despite occasional catastrophic costs). Three approaches are commonly used for amortized analysis: the aggregate method, the banker's method, and the physicist's method. In the physicist's method a potential function Φ is assigned to the data structure, and the amortized cost of an operation with actual cost c is c + Φ(h') − Φ(h), where h and h' are the states before and after the operation. For example, for a stack kept in a resizable array with n stored elements and array size m, the potential Φ(h) = 2n − m pays for the occasional expensive array doubling; a similar analysis applies to a queue.
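
A minimal Python sketch of the physicist's method on such a doubling-array stack, instrumented with the potential Φ = 2n − m (the class and the instrumentation are my own):

```python
class DoublingStack:
    """Stack on a resizable array, instrumented for the physicist's method.

    Potential: Phi = 2*n - m, where n is the element count and m the array size.
    Amortized cost of a push = actual cost + Phi(after) - Phi(before).
    """

    def __init__(self):
        self.size = 1          # m: capacity of the underlying array
        self.items = [None]    # simulated fixed-size array
        self.n = 0             # n: number of stored elements

    def _phi(self):
        return 2 * self.n - self.size

    def push(self, x):
        phi_before = self._phi()
        actual = 1                          # cost of writing the new element
        if self.n == self.size:             # array is full: double it and copy
            actual += self.n                # copying n existing elements
            self.size *= 2
            self.items = self.items + [None] * (self.size - len(self.items))
        self.items[self.n] = x
        self.n += 1
        amortized = actual + self._phi() - phi_before
        return actual, amortized

s = DoublingStack()
for i in range(8):
    print(i, s.push(i))  # actual cost spikes on doublings, amortized stays at 3
```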