Resizing-array implementation
Problem: How do we grow and shrink the array to avoid underflow and overflow?
Simple Approach
We could look at the first solution simply as:
- push(): increase the size of array s[] by 1.
- pop(): decrease the size of array s[] by 1.
The issue with this solution is that the operations become too expensive: on every push we would need to copy all items into a new array, so inserting the first N items takes time proportional to 1 + 2 + ... + N ~ N²/2. This is not feasible for large N, as the algorithm is quadratic.
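To make the cost concrete, here is a minimal sketch of this naive approach (the class name NaiveArrayStackOfStrings and its fields are hypothetical, chosen to mirror the later snippets): every push and pop copies the entire contents into a new array whose length is exactly the number of items.

public class NaiveArrayStackOfStrings {
    private String[] s = new String[0];  // array always sized exactly to the number of items
    private int N = 0;                   // number of items on the stack

    public void push(String item) {
        // Grow by exactly one slot: copies all N existing items on every push.
        String[] copy = new String[N + 1];
        for (int i = 0; i < N; i++) copy[i] = s[i];
        s = copy;
        s[N++] = item;
    }

    public String pop() {
        String item = s[--N];
        // Shrink by exactly one slot: again copies all remaining items.
        String[] copy = new String[N];
        for (int i = 0; i < N; i++) copy[i] = s[i];
        s = copy;
        return item;
    }
}

Pushing the i-th item copies the i − 1 items already present, so the first N pushes perform roughly N²/2 copies in total, which is where the quadratic cost comes from.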
The challenge is to resize the array infrequently, reducing the number of copy operations and increasing the efficiency of our algorithm.
Growing our Array
Let's take the growth of our array as an example.
Repeated doubling
One method we could use is to create a new array of twice the size and copy the items across.
public ResizingArrayStackOfStrings() {
    s = new String[1];
}

public void push(String item) {
    if (N == s.length) resize(2 * s.length);
    s[N++] = item;
}

private void resize(int capacity) {
    String[] copy = new String[capacity];
    for (int i = 0; i < N; i++) {
        copy[i] = s[i];
    }
    s = copy;
}
This method makes inserting the first N items take time proportional to N (not N²). The reasoning is that we only create a new array when the current array is full, i.e. each time the length doubles; by that point we have already performed enough push operations to fill the array, so the cost of copying is spread across those pushes. Essentially our algorithm now achieves amortized constant time per operation.
We represent the total cost of inserting the first N items in tilde notation, including the array accesses needed to double to size k, giving us ~3N.
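As a rough sketch of where ~3N comes from (assuming N is a power of two, one array access per push, and k array accesses to copy into a new array when doubling to size k, ignoring the cost of allocating the new array):

$$
\underbrace{N}_{\text{pushes}} \;+\; \underbrace{(2 + 4 + 8 + \cdots + N)}_{\text{copying during doublings}} \;\approx\; N + 2N \;=\; 3N .
$$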
Shrinking our Array
The first simple approach here would be to:
- push(): double the size of array s[] when the array is full.
- pop(): halve the size of array s[] when the array is one-half full.
Unfortunately this approach would be too expensive in its worst case due to a phenomenon called thrashing.
Consider a client doing push-pop-push-pop operations while the number of elements hovers around the one-half-full threshold. The array would repeatedly double and halve, creating a new array on every operation. Since each doubling or halving requires copying all existing elements (a cost proportional to N), a sequence of N such operations takes total time proportional to N², which defeats the purpose of our amortized constant-time improvement.
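As a back-of-the-envelope sketch, suppose the array holds N items and has capacity N when an alternating push/pop sequence begins; every operation then triggers a resize that copies roughly N items:

$$
\underbrace{N + N + \cdots + N}_{M \text{ alternating operations}} \;=\; M \cdot N \;\sim\; N^{2} \quad \text{when } M \sim N .
$$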
Amortized Shrink Operations
An efficient approach is to amortize our shrink operations by moving the bound that triggers a shrink: halve the array only when it becomes one-quarter full, resizing it to half its current length. After such a resize the array is exactly half full, so many pushes or pops must occur before the next resize is triggered, which removes the thrashing scenario.
public String pop() {
    String item = s[--N];
    s[N] = null;
    if (N > 0 && N == s.length / 4) {
        resize(s.length / 2);
    }
    return item;
}
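Putting the pieces together, here is a minimal sketch of the complete resizing-array stack, combining the constructor, push(), resize(), and pop() shown above; the isEmpty() helper is an assumed convenience method that does not appear in the snippets.

public class ResizingArrayStackOfStrings {
    private String[] s;
    private int N = 0;                    // number of items on the stack

    public ResizingArrayStackOfStrings() {
        s = new String[1];                // start with capacity 1
    }

    public boolean isEmpty() {            // assumed helper, not shown in the snippets above
        return N == 0;
    }

    public void push(String item) {
        if (N == s.length) resize(2 * s.length);   // array full: double the capacity
        s[N++] = item;
    }

    public String pop() {
        String item = s[--N];
        s[N] = null;                                // clear the slot so the garbage collector can reclaim it
        if (N > 0 && N == s.length / 4) resize(s.length / 2);   // one-quarter full: halve the capacity
        return item;
    }

    private void resize(int capacity) {
        String[] copy = new String[capacity];       // copy the N items into a new array of the given capacity
        for (int i = 0; i < N; i++) copy[i] = s[i];
        s = copy;
    }
}

With this combination the capacity grows 1, 2, 4, 8, ... as items are pushed, and it halves again only when the stack shrinks to a quarter of the current capacity, so no push/pop sequence can trigger back-to-back resizes.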
Amortized Analysis
Amortizing our resize operations ensures:
- The client does not need to provide a maximum amount of memory for the stack.
- The amount of memory used is guaranteed to be a constant multiple of the number of items actually on the stack.
To analyse performance we take the average running time per operation over a worst-case sequence of operations. An individual push or pop that triggers a resize takes time proportional to N, but that cost is spread across the many cheap operations between resizes; the trade-off is that the vast majority of pushes and pops are very fast.
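As a sketch of why both guarantees hold (using the doubling and quarter-full-halving policy above, and ignoring the empty stack and the smallest capacities), the array always stays between one-quarter and completely full, and the earlier ~3N bound spreads the resize cost over the cheap operations:

$$
\tfrac{1}{4}\,\texttt{s.length} \;\le\; N \;\le\; \texttt{s.length}, \qquad \text{total cost of } N \text{ pushes} \;\sim\; 3N \;\Rightarrow\; \text{amortized cost} \;\sim\; 3 \text{ array accesses per push}.
$$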