AutoML: taking the human expert out of the loop
On this page, we provide a tool for assessing the importance of an algorithm's hyperparameters. It takes as input performance data gathered under different hyperparameter settings of the algorithm (for example, using the Bayesian optimization methods in HPOlib), fits a random forest to capture the relationship between hyperparameters and performance, and then applies functional ANOVA to assess how important each hyperparameter, and each low-order interaction of hyperparameters, is to performance.
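To illustrate the underlying idea, here is a minimal sketch of a functional-ANOVA variance decomposition on a small grid of performance observations. The numbers and the two hyperparameters A and B are purely illustrative assumptions, not data from the tool; the real implementation marginalizes over a random forest model rather than a complete grid.

```python
# Toy functional-ANOVA decomposition on a 3x3 grid of observed
# performance values (illustrative numbers only).
# Rows index settings of hyperparameter A, columns settings of B.
perf = [
    [0.70, 0.72, 0.71],
    [0.80, 0.82, 0.81],
    [0.90, 0.92, 0.91],
]

cells = [v for row in perf for v in row]
grand_mean = sum(cells) / len(cells)
total_var = sum((v - grand_mean) ** 2 for v in cells) / len(cells)

# Main effect of A: marginalize (average) over B, then center.
row_means = [sum(row) / len(row) for row in perf]
# Main effect of B: marginalize (average) over A, then center.
col_means = [sum(row[j] for row in perf) / len(perf)
             for j in range(len(perf[0]))]

var_a = sum((m - grand_mean) ** 2 for m in row_means) / len(row_means)
var_b = sum((m - grand_mean) ** 2 for m in col_means) / len(col_means)
# Variance not explained by the main effects is the A-B interaction.
var_ab = total_var - var_a - var_b

importance_a = var_a / total_var
importance_b = var_b / total_var
print(f"A: {importance_a:.3f}  B: {importance_b:.3f}  "
      f"AxB: {var_ab / total_var:.3f}")
```

In this toy grid, hyperparameter A explains nearly all of the performance variance, which is exactly the kind of conclusion the tool draws for real algorithms.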
The software for performing functional ANOVA is available in our GitHub repository. It contains the original fANOVA implementation, written in Java, as well as a Python interface.