In weeks 11 and 12 I worked on:
I had a very productive discussion with Norbert. We decided to start designing the simulation model alongside formulating the metrics. To prepare for agent-based simulation with cadCAD, I went through:
1. Introduction, Optimization Problems (MIT 6.0002 Intro to Computational Thinking and Data Science)
0/1 Knapsack Problem
You have limited strength, so there is a maximum weight w you can carry.
Find a vector V of 0s and 1s that maximizes
Σ (i = 0 to n-1) V[i] * I[i].value
subject to the constraint that the total weight does not exceed w:
Σ (i = 0 to n-1) V[i] * I[i].weight ≤ w
The Brute Force Algorithm enumerates all possible combinations of the items and removes the combinations whose weight exceeds the allowed weight. It is not very efficient: there are 2^n possible subsets of n items, so the 0/1 knapsack problem is inherently exponential!
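As a concrete illustration (my own sketch, not the lecture's code; the function name and the (value, weight) pairs below are made up), the enumeration can be done with itertools.combinations:

from itertools import combinations

def bruteForceKnapsack(items, maxWeight):
    """items is a list of (value, weight) pairs.
    Tries every one of the 2**n subsets and keeps the most valuable
    one whose total weight stays within maxWeight."""
    bestValue, bestSubset = 0, ()
    for r in range(len(items) + 1):
        for subset in combinations(items, r):
            weight = sum(w for _, w in subset)
            if weight <= maxWeight:
                value = sum(v for v, _ in subset)
                if value > bestValue:
                    bestValue, bestSubset = value, subset
    return bestValue, bestSubset

print(bruteForceKnapsack([(10, 5), (40, 4), (30, 6), (50, 3)], 10))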
Greedy Algorithm puts the "best" available item in the knapsack first
What defines the "best"? It could be the highest value, the lowest cost, or the highest density (value per unit of cost); the snippet below tries all three.
# Depends on Food, buildMenu, testGreedy, and greedy,
# which are defined earlier in the lecture code.
def testGreedys(foods, maxUnits):
    print('Use greedy by value to allocate', maxUnits,
          'calories')
    testGreedy(foods, maxUnits, Food.getValue)
    print('\nUse greedy by cost to allocate', maxUnits,
          'calories')
    testGreedy(foods, maxUnits,
               lambda x: 1/Food.getCost(x))
    print('\nUse greedy by density to allocate', maxUnits,
          'calories')
    testGreedy(foods, maxUnits, Food.density)

names = ['wine', 'beer', 'pizza', 'burger', 'fries',
         'cola', 'apple', 'donut', 'cake']
values = [89, 90, 95, 100, 90, 79, 50, 10]
calories = [123, 154, 258, 354, 365, 150, 95, 195]
foods = buildMenu(names, values, calories)
testGreedys(foods, 1000)
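Those helpers are not shown in my notes, so here is a rough reconstruction. The class and function names come from the snippet above; the bodies are my best recollection of the lecture code, so treat them as a sketch rather than the exact implementation.

class Food(object):
    def __init__(self, name, value, calories):
        self.name = name
        self.value = value
        self.calories = calories
    def getValue(self):
        return self.value
    def getCost(self):
        return self.calories
    def density(self):
        return self.getValue()/self.getCost()
    def __str__(self):
        return self.name + ': <' + str(self.value) + ', ' + str(self.calories) + '>'

def buildMenu(names, values, calories):
    # Builds a list of Food objects; iterates over values, so a name
    # without a matching value (here 'cake') is simply dropped.
    menu = []
    for i in range(len(values)):
        menu.append(Food(names[i], values[i], calories[i]))
    return menu

def greedy(items, maxCost, keyFunction):
    # Consider items in decreasing order of keyFunction and take each
    # one that still fits in the remaining budget.
    itemsCopy = sorted(items, key=keyFunction, reverse=True)
    result, totalValue, totalCost = [], 0.0, 0.0
    for item in itemsCopy:
        if totalCost + item.getCost() <= maxCost:
            result.append(item)
            totalCost += item.getCost()
            totalValue += item.getValue()
    return (result, totalValue)

def testGreedy(items, constraint, keyFunction):
    taken, val = greedy(items, constraint, keyFunction)
    print('Total value of items taken =', val)
    for item in taken:
        print('   ', item)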
The problem with the greedy algorithm is that it can get stuck at a local optimum (local maximum) rather than finding the globally optimal solution (global maximum)!
The Search Tree Implementation builds the tree top down from the root as items are selected. When an item is added to the bag, a node is created as a left branch (as long as the weight constraint is met); a right branch is formed reflecting the consequence of not including that item. The process is then recursively applied to the non-leaf children, and at the end the node with the highest value is chosen.
The downside is that it takes too long to compute a solution for big data sets. The same as brute force, it is an inherently exponential problem!
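A recursive solver in the spirit of the lecture's decision-tree approach (reconstructed here as maxVal, using the Food interface above; treat the exact body as a sketch) can be written as:

def maxVal(toConsider, avail):
    """Returns a tuple (total value, items taken) for the best solution
    that fits in avail, considering only the items in toConsider."""
    if toConsider == [] or avail == 0:
        result = (0, ())                      # leaf: nothing left to decide
    elif toConsider[0].getCost() > avail:
        # the item cannot fit, so only the right branch (skip it) exists
        result = maxVal(toConsider[1:], avail)
    else:
        nextItem = toConsider[0]
        # left branch: include the item
        withVal, withToTake = maxVal(toConsider[1:], avail - nextItem.getCost())
        withVal += nextItem.getValue()
        # right branch: leave the item out
        withoutVal, withoutToTake = maxVal(toConsider[1:], avail)
        # keep whichever branch ends up with the higher value
        if withVal > withoutVal:
            result = (withVal, withToTake + (nextItem,))
        else:
            result = (withoutVal, withoutToTake)
    return result

# reusing the foods menu built above
val, taken = maxVal(foods, 1000)
print('Optimal value =', val)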
Dynamic Programming
def fastFib(n, memo = {}):
    """Assumes n is an int >= 0, memo used only by recursive calls
       Returns Fibonacci of n"""
    if n == 0 or n == 1:
        return 1
    try:
        return memo[n]
    except KeyError:
        result = fastFib(n-1, memo) + fastFib(n-2, memo)
        memo[n] = result
        return result

for i in range(121):
    print('fib(' + str(i) + ') =', fastFib(i))
When does it work?
Optimal substructure: a globally optimal solution can be found by combining optimal solutions to local subproblems.
Overlapping subproblems: finding an optimal solution involves solving the same problem multiple times.
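The knapsack search tree has both properties: every call on the same suffix of the item list with the same remaining budget is the same subproblem. Memoizing the maxVal sketch above could look like the following (the name fastMaxVal and the exact body are my assumptions, not copied from the lecture):

def fastMaxVal(toConsider, avail, memo = {}):
    """Memoized version of maxVal above. Results are cached by
    (number of items left to consider, available budget), which fully
    identifies a subproblem because toConsider is always a suffix of
    the original item list."""
    if (len(toConsider), avail) in memo:
        result = memo[(len(toConsider), avail)]
    elif toConsider == [] or avail == 0:
        result = (0, ())
    elif toConsider[0].getCost() > avail:
        result = fastMaxVal(toConsider[1:], avail, memo)
    else:
        nextItem = toConsider[0]
        withVal, withToTake = fastMaxVal(toConsider[1:],
                                         avail - nextItem.getCost(), memo)
        withVal += nextItem.getValue()
        withoutVal, withoutToTake = fastMaxVal(toConsider[1:], avail, memo)
        if withVal > withoutVal:
            result = (withVal, withToTake + (nextItem,))
        else:
            result = (withoutVal, withoutToTake)
    memo[(len(toConsider), avail)] = result
    return result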
Summary