Intuitive & Scalable Hyperparameter Tuning with Apache Spark + Fugue

Hyperparameter tuning is critical in model development, and its general form, tuning parameters against an objective function, is also widely used in industry. Apache Spark can handle massive parallelism, and Apache Spark ML is a solid machine learning solution. So why have we not seen a general and intuitive distributed parameter-tuning solution built on Apache Spark? Not every tuning problem involves Apache Spark ML models, so how can Apache Spark handle general models? Not every tuning problem is a parallelizable grid or random search; Bayesian optimization is sequential, so how can Apache Spark help in that case? Not every tuning problem is single-epoch; deep learning is not, so how do algorithms such as Hyperband and ASHA fit into Apache Spark? And not every tuning problem is a machine learning problem; simulation tuning, for example, is also common, so how do we generalize? In this talk, we are going to show how using Fugue-Tune and Apache Spark together can eliminate these pain points. Fugue-Tune, like Fugue, is
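To make the "parameter tuning with an objective function" framing concrete, here is a minimal, self-contained sketch of grid-based tuning. This is not the Fugue-Tune API; the names `objective`, `expand_grid`, and `grid_search` are all hypothetical, and a local thread pool stands in for the Spark cluster that would evaluate trials in a distributed setting.

```python
# Illustrative sketch only -- not Fugue-Tune's actual API.
# Expand a search space into trials, score each trial with an
# objective function, and return the best. In a distributed setup,
# the per-trial evaluations are what Spark would parallelize.
from concurrent.futures import ThreadPoolExecutor
from itertools import product

def objective(params):
    # Hypothetical objective to minimize: (x - 3)^2 + (y + 1)^2
    x, y = params["x"], params["y"]
    return (x - 3) ** 2 + (y + 1) ** 2

def expand_grid(space):
    # Cartesian product of the per-parameter candidate lists
    keys = list(space)
    return [dict(zip(keys, values)) for values in product(*space.values())]

def grid_search(space, fn):
    trials = expand_grid(space)
    # Evaluate all trials concurrently (a stand-in for cluster execution)
    with ThreadPoolExecutor() as pool:
        scores = list(pool.map(fn, trials))
    # Pick the trial with the lowest objective value
    return min(zip(scores, trials), key=lambda t: t[0])

if __name__ == "__main__":
    space = {"x": [1, 2, 3, 4], "y": [-2, -1, 0]}
    best_score, best_params = grid_search(space, objective)
    print(best_score, best_params)
```

Grid search parallelizes trivially because every trial is independent; the sequential and multi-epoch cases raised above (Bayesian optimization, Hyperband, ASHA) are exactly where this simple map-over-trials pattern breaks down and a more general abstraction is needed.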