---
tags: Optimization Algorithms,AdaBoost
---

# AdaBoost

- It is a supervised learning algorithm used to solve classification problems.
- The idea behind AdaBoost is to give more weight to misclassified observations and less weight to correctly classified ones.
- In AdaBoost each observation is classified by a stump. A stump is simply a tree with one root node and two leaf nodes.

## Following are the algorithm steps:

1. Assign an equal weight to each observation.
   $$\frac{1}{Total\ number\ of\ samples}$$
   **Note**: During the first iteration all observations get equal weight.

2. Build a stump for each feature in the dataset. Building a stump is nothing but building a one-level decision tree.

3. Calculate the total error for each stump; the stump with the lowest total error is selected for this round.
   $$Total\ error = Sum\ of\ weights\ of\ incorrectly\ classified\ observations$$

4. Calculate the amount of say. This step determines how much say each stump has in the final classification.
   $$Amount\ of\ say = \frac{1}{2}\ln\left(\frac{1 - Total\ error}{Total\ error}\right)$$
   > The sample weights of the incorrectly classified observations (via the total error) are what determine the amount of say.

5. Increase the sample weights of the incorrectly classified observations.

6. The new sample weight for an incorrectly classified observation is calculated by
   $$new\ sample\ weight = old\ sample\ weight \times e^{amount\ of\ say}$$
   Once we have the sample weights for the incorrectly classified observations, the next step is to calculate the sample weights for the correctly classified ones.

7. The sample weight for a correctly classified observation is calculated using the formula below.
   $$new\ sample\ weight = old\ sample\ weight \times e^{-amount\ of\ say}$$
   The negative exponent makes the new sample weight for a correctly classified observation *very small*.

8. Normalize the weights. The sum of all new sample weights (correct and incorrect) should equal 1. If the sum is not 1, use the formula below to normalize the weights.
   $$Normalized\ weight = \frac{newly\ calculated\ sample\ weight}{sum\ of\ all\ new\ weights}$$

9. Make a new dataset. In this step we construct a new dataset of the same size as the old one by sampling observations with probability proportional to their normalized weights.

10. Repeat from step 1 on the new dataset until the desired number of stumps has been built. A worked sketch of these steps is shown below.
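To make the steps concrete, here is a minimal sketch in Python. It assumes binary labels encoded as -1/+1 and NumPy arrays for `X` and `y`; the helper names `fit_adaboost` and `predict_adaboost` are made up for illustration (not a library API), and scikit-learn's `DecisionTreeClassifier(max_depth=1)` stands in for the stump learner.

```python
# A minimal sketch of the AdaBoost steps above (resampling variant).
# Assumptions: X is a NumPy array, y contains labels in {-1, +1}.
import numpy as np
from sklearn.tree import DecisionTreeClassifier


def fit_adaboost(X, y, n_stumps=10, seed=0):
    """Fit AdaBoost with decision stumps, following steps 1-10 above."""
    rng = np.random.default_rng(seed)
    stumps, says = [], []
    X_cur, y_cur = X, y
    n = len(y)

    for _ in range(n_stumps):
        # Step 1: start each round with equal sample weights.
        w = np.full(n, 1.0 / n)

        # Step 2: build a stump (one-level decision tree) on the current dataset.
        stump = DecisionTreeClassifier(max_depth=1).fit(X_cur, y_cur)
        pred = stump.predict(X_cur)

        # Step 3: total error = sum of weights of misclassified observations.
        miss = pred != y_cur
        total_error = np.clip(w[miss].sum(), 1e-10, 1 - 1e-10)  # avoid log(0)

        # Step 4: amount of say for this stump.
        say = 0.5 * np.log((1 - total_error) / total_error)

        # Steps 5-7: raise weights of misclassified rows, lower weights of correct ones.
        w = w * np.exp(np.where(miss, say, -say))

        # Step 8: normalize so the weights sum to 1.
        w = w / w.sum()

        # Step 9: draw a new dataset of the same size, favouring high-weight rows.
        idx = rng.choice(n, size=n, replace=True, p=w)
        X_cur, y_cur = X_cur[idx], y_cur[idx]

        stumps.append(stump)
        says.append(say)

    return stumps, np.array(says)


def predict_adaboost(stumps, says, X):
    """Final class = sign of the say-weighted vote of all stumps."""
    votes = sum(say * stump.predict(X) for stump, say in zip(stumps, says))
    return np.sign(votes)
```

Usage would look like `stumps, says = fit_adaboost(X_train, y_train, n_stumps=20)` followed by `predict_adaboost(stumps, says, X_test)`. Note that many library implementations skip the resampling in step 9 and instead pass the updated weights directly to the stump via `sample_weight`; the resampling version is shown here only because it matches the steps listed above.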