NIPS DISCML 2017 : NIPS 2017 Workshop on Discrete Structures in Machine Learning (DISCML)
Link: http://www.discml.cc

Call For Papers
============================================================
Call for Papers

DISCML -- 7th Workshop on Discrete Structures in Machine Learning
at NIPS 2017 (Long Beach)
Dec 8, 2017
www.discml.cc
============================================================

Discrete optimization problems and combinatorial structures are ubiquitous in machine learning. They arise with discrete labels that have complex dependencies, structured estimators, learning with graphs, partitions, and permutations, or when selecting informative subsets of data or features. What are efficient algorithms for handling such problems? Can we solve them robustly in the presence of noise? What about streaming or distributed settings? Which models are computationally tractable yet rich enough for applications? What theoretical worst-case bounds can we show? What explains good performance in practice?

Such questions are the theme of the DISCML workshop. It aims to bring together theorists and practitioners to explore new applications, models, algorithms, and mathematical properties and concepts that can help learning with complex interactions and discrete structures.

We invite high-quality submissions that present recent results related to discrete and combinatorial problems in machine learning, as well as submissions that discuss open problems or controversial questions and observations, e.g., missing theory to explain why algorithms work well in certain instances but not in general, or illuminating worst-case examples. We also welcome descriptions of well-tested software and benchmarks.

Areas of interest include, but are not restricted to:
* discrete optimization in the context of deep learning
* bridging discrete and continuous optimization methods
* graph algorithms
* continuous relaxations
* learning and inference in discrete probabilistic models
* algorithms for large data (streaming, sketching, distributed)
* online learning
* new applications

Submissions: Please send submissions in NIPS 2017 format (max. 6 pages, non-anonymous) to submit@discml.cc

Submission deadline: October 30, 2017.

Organizers: Jeff A. Bilmes (University of Washington, Seattle), Stefanie Jegelka (MIT), Amin Karbasi (Yale University), Andreas Krause (ETH Zurich, Switzerland), Yaron Singer (Harvard University)