---
tags: research
---

# Auto-tuning algorithm for mixnets

[toc]

## Preliminaries

Quantifying the anonymity (obtaining the objective function value):

- x = [x_1, ..., x_n] : a point in the parameter space (one configuration).
- f(x) : the performance (anonymity provided) of a particular configuration, measured by running the full application.
- f(x*) : the best application performance.
- H_0 : the observation history of (sample, objective value) pairs.
- EI : expected improvement.

Note: in the Beta-Bernoulli statistical model, we choose k Beta distributions because the Beta is the conjugate prior of the Bernoulli likelihood, which allows efficient posterior updating. By the same reasoning, for multi-valued (categorical) parameters we choose the Dirichlet distribution, since it is the conjugate prior of the multinomial distribution.

## Configurable parameter space

## Algorithm (Bayesian Optimization)

The algorithm starts with a small initial set of parameter configurations obtained by randomly sampling the parameter space.

### Surrogate model

Assumption: for parameters with a large impact on performance (for simplicity, we only consider the anonymity provided for now), the set of values associated with good performance differs from the set associated with bad performance. We denote the two densities p_{g,x_i}(x_i) and p_{b,x_i}(x_i), for good and bad performance on the anonymity measurement respectively.

## Experiment

1. Select a small set of initial parameter samples uniformly at random from the configuration space. Obtain the true objective function value for each by running the experiment. Add all the resulting (sample, value) pairs to the observation history H_0.
2. Build the surrogate model from the samples in the observation history. The samples are divided into two groups (good or bad) based on the quantile threshold; the surrogate model then constructs the two probability densities for each parameter in the configuration domain.
3. Select one sample to add to the observations based on the surrogate model's prediction. The selection algorithm picks the candidate with the highest EI metric.
4. Evaluate the true objective function at the selected candidate by running the experiment. Add the (candidate, objective value) pair to the observation history and update the surrogate model with the new history.
5. Iterate over steps 3-4 until the termination condition is met.
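The surrogate and selection described above resemble a Tree-structured Parzen Estimator (TPE) style of Bayesian optimization, where maximizing EI reduces to maximizing the density ratio p_g(x)/p_b(x). Below is a minimal sketch of the experiment loop for a single continuous parameter in [0, 1]; the helper names (`kde`, `split_by_quantile`, `select_candidate`, `tune`), the Gaussian-kernel densities, the fixed bandwidth, and the "higher f(x) means more anonymity" convention are all assumptions of this sketch, and `f` stands in for running the real mixnet experiment:

```python
import math
import random

def kde(samples, bandwidth=0.1):
    """1-D Gaussian kernel density estimate, standing in for the per-parameter
    densities p_g and p_b (the bandwidth is an assumption of this sketch)."""
    norm = len(samples) * bandwidth * math.sqrt(2 * math.pi)
    def density(x):
        return sum(math.exp(-0.5 * ((x - s) / bandwidth) ** 2)
                   for s in samples) / norm
    return density

def split_by_quantile(history, gamma=0.25):
    """Step 2: split (x, f(x)) observations into good/bad groups at the
    quantile threshold gamma; here higher f(x) (more anonymity) is better."""
    ranked = sorted(history, key=lambda p: p[1], reverse=True)
    cut = max(1, int(gamma * len(ranked)))
    return [x for x, _ in ranked[:cut]], [x for x, _ in ranked[cut:]]

def select_candidate(history, candidates):
    """Step 3: pick the candidate with the highest EI proxy. Under a
    TPE-style model this is the density ratio p_g(x) / p_b(x)."""
    good, bad = split_by_quantile(history)
    p_g, p_b = kde(good), kde(bad)
    return max(candidates, key=lambda x: p_g(x) / (p_b(x) + 1e-12))

def tune(f, n_init=5, n_iter=20, n_candidates=50, seed=0):
    """Steps 1-5 for one parameter in [0, 1]; `f` stands in for running the
    real experiment and measuring the anonymity provided."""
    rng = random.Random(seed)
    # Steps 1-2: uniform-random initial samples and their true objective values.
    history = [(x, f(x)) for x in (rng.random() for _ in range(n_init))]
    for _ in range(n_iter):
        # Step 3: rebuild the surrogate and pick the best-EI candidate.
        candidates = [rng.random() for _ in range(n_candidates)]
        x_next = select_candidate(history, candidates)
        # Step 4: evaluate the true objective and grow the observation history.
        history.append((x_next, f(x_next)))
    # Step 5: the termination condition here is simply a fixed iteration budget.
    return max(history, key=lambda p: p[1])
```

With a toy objective peaked at x = 0.7, `tune(lambda x: 1 - (x - 0.7) ** 2)` should concentrate evaluations near the peak; in practice the quantile threshold gamma and the per-iteration candidate count trade off exploration against exploitation.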