NIPS Highlights: Learn How to Code a Paper with State-of-the-Art Frameworks
Every year hundreds of papers are published at NIPS. Although the authors provide sound scientific descriptions and proofs of their ideas, there is no space to explain all the tricks and details that make an implementation of the paper actually work. The goal of this workshop is to help authors evangelize their papers to the industry and expose participants to all the Machine Learning/Artificial Intelligence know-how that cannot be found in the papers. The effect and importance of tuning parameters is likewise rarely discussed, due to lack of space.
Presenting the code of an algorithm published in a paper has historically not been easy. With the emergence of fast prototyping systems such as TensorFlow, CNTK, PyTorch, MXNet, etc., it is now much easier to present an implementation to an audience through an IPython notebook. The target group of this workshop is mainly ML researchers and practitioners from the industry who want to accelerate the transition of research into industrial applications.
The focus of our workshop is on making research more accessible to industry. For that reason we ask all presenters to prepare IPython notebooks that demonstrate practical examples of use. We plan to make these publicly available.
Call for Submissions
Authors and friends* of NIPS 2017 (and of other conferences) are encouraged to submit a poster of their paper along with a detailed IPython notebook that contains an implementation of their algorithms. The poster must present the ideas and concepts discussed in the paper at a high level and avoid tedious mathematical proofs and notation. The IPython notebook must be self-sufficient, with very detailed comments and notes, along with visuals that demonstrate how equations and pseudocode from the paper translate into code.
Authors are encouraged to use small datasets (real or synthetic) to limit the runtime. If the model is too big and training time is lengthy, it is acceptable to store pretrained models on a website and have the notebook download them. The authors are free to select the Python platform of their choice (TensorFlow, PyTorch, CNTK, MXNet, Keras, etc.).
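For example, a notebook can check for a cached copy of the pretrained weights before downloading them. A minimal sketch, assuming a hypothetical hosting URL and filename (neither refers to a real hosted model):

```python
import os
import urllib.request

# Hypothetical placeholders -- replace with the actual location of your
# pretrained model when preparing your notebook.
MODEL_URL = "https://example.com/models/pretrained_weights.bin"
MODEL_PATH = "pretrained_weights.bin"

def ensure_model(path=MODEL_PATH, url=MODEL_URL):
    """Download the pretrained model only if it is not already on disk,
    so re-running the notebook does not repeat the download."""
    if not os.path.exists(path):
        urllib.request.urlretrieve(url, path)
    return path
```

A notebook cell can then simply call `ensure_model()` before loading the weights with the chosen framework.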
Submissions will be judged on the technical clarity and ease of understanding of the poster and the code. A limited number of accepted submissions will be presented during the oral sessions; the rest will be presented during a poster session. All material will be available online.
*We will accept submissions from individuals who are not the authors of the original paper as long as they have the endorsement of the author of the original paper. The authors must be listed in the submission.
Submit your poster along with an IPython notebook by email to email@example.com
- Nikolaos Vasiloglou
- Alex Dimakis
- Guy Van den Broeck
- Kristian Kersting
- Alex Ihler
- Animashree Anandkumar
- Assaf Araki
- Daniel Gordon
- Justin Basilico
- Alexandros Karatzoglou
- Ben Lau
Submission deadline: November 14th 2017, 23:59 EST (acceptance notification November 22nd)
Extended deadline: November 21st 2017, 23:59 EST (acceptance notification November 28th)
Acceptance notification: November 22nd or November 28th 2017, depending on the submission date
Workshop: December 9th 2017
The first name in bold denotes the presenter and not necessarily the author of the poster.
- “Tensor Regression Networks with TensorLy and MXNet”, (Best Poster Award) Jean Kossaifi, Zack Lipton, Aran Khanna, Tommaso Furlanello and Anima Anandkumar
- “Long-Term Forecasting using Tensor-Train RNNs”, Rose Yu, Stephan Zheng, Anima Anandkumar, Yisong Yue
- “Multimodal Compact Bilinear Pooling for Visual Question Answering”, Yang Shi, Akira Fukui, Dong Huk Park, Daylen Yang, Anna Rohrbach, Trevor Darrell, Marcus Rohrbach
- “Bayesian GAN in Pytorch”, (Best Poster Award) Ben Athiwaratkun, Andrew Gordon Wilson
- “Combining Symbolic Expressions and Black-box Function Evaluations in Neural Programs”, Forough Arabshahi, Sameer Singh, Animashree Anandkumar
- “Learning from noisy singly-labeled data”, Ashish Khetan, Zachary C. Lipton, Animashree Anandkumar
- “Efficient Exploration through Bayesian Deep Q-Networks”, Kamyar Azizzadenesheli, Emma Brunskill, Animashree Anandkumar
- “Melody Transcription System”, Shayenne Moura, François Rigaud, Mathieu Radenen
- “ShortScience.org – Reproducing Intuition”, Joseph Paul Cohen, Henry Z. Lo
- “A Convolutional Encoder Model for Neural Machine Translation”, Dushyanta Dhyani, Pravar Mahajan, Jonas Gehring, Michael Auli, David Grangier, Yann N. Dauphin
- “Improved Graph Laplacian via Geometric Self-Consistency”, Yu-Chia Chen, Dominique Perrault-Joncas, Marina Meilă, James McQueen
- “StrassenNets: Deep Learning with a Multiplication Budget”, Michael Tschannen, Aran Khanna, Anima Anandkumar
All the notebooks can be found in the github repository https://github.com/vasiloglou/mltrain-nips-2017
We have also uploaded the notebooks on the azure-notebooks platform https://notebooks.azure.com/mltrain/libraries/MLTrain-NIPS-2017
You will need to create a Microsoft account and clone the above library. Then hit the Run button; it will start a server on the Azure cloud and run the notebook. All the notebooks have been tested on this platform. There are always glitches, so if you find a bug, email firstname.lastname@example.org to report it.
A case study: implementing ID3 decision trees to be as fast as possible (mlpack)
by Ryan Curtin
Best of Both Worlds: Transferring Knowledge from Discriminative Learning to a Generative Visual Dialog Model
by Jiasen Lu, Georgia Tech
- Ben Athiwaratkun: Bayesian GAN in Pytorch
- Dushyanta Dhyani: A Convolutional Encoder Model for Neural Machine Translation
- Forough Arabshahi: Combining Symbolic Expressions and Black-box Function Evaluations in Neural Programs
- Jean Kossaifi: Tensor Regression Networks with TensorLy and MXNet
- Joseph Paul Cohen: ShortScience.org – Reproducing Intuition
- Kamyar Azizzadenesheli: Efficient Exploration through Bayesian Deep Q-Networks
- Ashish Khetan: Learning from noisy singly-labeled data
- Rose Yu: Long-Term Forecasting using Tensor-Train RNNs
- Shayenne da Lu Moura: Melody Transcription System
- Michael Tschannen: Fast Linear Algebra in Stacked Strassen Networks
- Yang Shi: Multimodal Compact Bilinear Pooling for Visual Question Answering
- Yu-Chia Chen: Improved Graph Laplacian via Geometric Self-Consistency