DISCML @NIPS 2012: NIPS workshop on Discrete Optimization in Machine Learning
Link: http://discml.cc

Call For Papers
===================================================================
Call for Contributions

4th Workshop on Discrete Optimization in Machine Learning (DISCML):
Structure and Scalability

at the Annual Conference on Neural Information Processing Systems (NIPS 2012)

http://www.discml.cc

Submission Deadline: Sunday, September 16
===================================================================

Optimization problems with discrete solutions (e.g., combinatorial optimization) are becoming increasingly important in machine learning. The core of statistical machine learning is to infer conclusions from data, and when the variables underlying the data are discrete, both inferring the model from data and making predictions with the estimated model are discrete optimization problems. Two factors complicate matters: first, many discrete problems are computationally hard in general, and second, machine learning applications often demand that such problems be solved at very large scales.

The focus of this year's workshop is on structures that enable scalability. Which properties of the problem make it possible to efficiently obtain exact or good approximate solutions? What challenges are posed by parallel and distributed processing? Which discrete problems in machine learning are in need of more scalable algorithms? How can we make discrete algorithms scalable while retaining solution quality? Some heuristics perform well in practice but so far lack a theoretical foundation; what explains this good behavior?

We encourage high-quality submissions of short papers relevant to these workshop topics. Accepted papers will be presented as spotlight talks and posters. Of particular interest are new algorithms with theoretical guarantees, as well as applications of discrete optimization to machine learning problems, especially large-scale ones.

Areas of interest include:

Optimization:
• Combinatorial algorithms
• Submodular / supermodular optimization
• Discrete convex analysis
• Pseudo-Boolean optimization
• Parallel & distributed discrete optimization

Continuous relaxations:
• Sparse approximation & compressive sensing
• Regularization techniques
• Structured sparsity models

Learning in discrete domains:
• Online learning / bandit optimization
• Generalization in discrete learning problems
• Adaptive / stochastic optimization

Applications:
• Graphical model inference & structure learning
• Clustering
• Feature selection, active learning & experimental design
• Structured prediction
• Novel discrete optimization problems in ML, computer vision, natural language processing, speech processing, and computational biology

Submission deadline: September 22, 2012
Length & format: max. 6 pages, NIPS 2012 format
Time & location: December 7 or 8, 2012, Lake Tahoe, Nevada, USA
Submission instructions: Email to submit@discml.cc

Invited talks by:
• Satoru Fujishige
• Amir Globerson
• Alex Smola

Organizers:
Stefanie Jegelka (UC Berkeley)
Andreas Krause (ETH Zurich, Switzerland)
Jeff A. Bilmes (University of Washington, Seattle)
Pradeep Ravikumar (University of Texas, Austin)