NIPS Highlights: Learn How to Code a Paper with State-of-the-Art Frameworks

Dec 09 @ 08:50 AM - 06:05 PM

NIPS, Long Beach, California

NIPS Highlights

Workshop Description

Every year hundreds of papers are published at NIPS. Although the authors provide sound, scientific descriptions and proofs of their ideas, there is no space to explain all the tricks and details that make an implementation of the paper actually work. Likewise, the effect and importance of parameter tuning is rarely discussed, due to lack of space. The goal of this workshop is to help authors evangelize their papers to industry and to expose participants to all the Machine Learning/Artificial Intelligence know-how that cannot be found in the papers.

Presenting the code of an algorithm published in a paper has not been easy. With the emergence of fast prototyping frameworks such as TensorFlow, CNTK, PyTorch, and MXNet, it is now much easier to present an implementation to an audience through an IPython notebook. The target group of this workshop is mainly ML researchers and practitioners from industry who want to accelerate the transition of research to industrial applications.

The focus of our workshop is on making research more accessible to industry. For that reason we ask all presenters to prepare IPython notebooks that demonstrate practical examples of use. We plan to make these publicly available.

Call for Submissions

Authors and friends* of NIPS 2017 (and of other conferences) are encouraged to submit a poster of their paper along with a detailed IPython notebook that contains an implementation of their algorithms. The poster must demonstrate the ideas and concepts discussed in the paper at a high level and avoid tedious mathematical proofs and notation. The IPython notebook must be self-sufficient and must have very detailed comments and notes, along with visuals that demonstrate how equations and pseudocode from the paper translate into code.
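As an illustration of the comment style we have in mind, a notebook cell might quote an equation from the paper and mirror it term by term in code. The equation below is purely hypothetical (a plain gradient-descent update on a least-squares loss), not taken from any submitted paper:

```python
import numpy as np

# Hypothetical paper, Eq. (3): theta_{t+1} = theta_t - eta * grad L(theta_t)
# For least squares, L(theta) = ||X theta - y||^2 / (2n),
# so grad L(theta) = X^T (X theta - y) / n.

def gradient_step(theta, X, y, eta=0.1):
    """One update of Eq. (3); each line mirrors a term of the equation."""
    n = len(y)
    grad = X.T @ (X @ theta - y) / n  # grad L(theta_t)
    return theta - eta * grad         # theta_{t+1}

# Tiny synthetic dataset so the cell runs in seconds.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
theta_true = np.array([1.0, -2.0, 0.5])
y = X @ theta_true

theta = np.zeros(3)
for _ in range(500):
    theta = gradient_step(theta, X, y)
print(np.round(theta, 2))  # converges towards theta_true
```

Pairing each code line with the symbol it implements is exactly the kind of bridge between paper and implementation we are asking for.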

Authors are encouraged to use small datasets (real or synthetic) to limit the runtime. If the model is too big and training time is lengthy, it is fine to store pretrained models on a website and have the notebook download them. The authors are free to select the Python platform of their choice (TensorFlow, PyTorch, CNTK, MXNet, Keras, etc.).
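A minimal sketch of the download-on-first-run pattern, assuming you host the weights at some URL of your own (the URL and filename below are placeholders, not a real checkpoint):

```python
import os
import urllib.request

def fetch_checkpoint(url, path):
    """Download a pretrained checkpoint once; reuse the cached copy afterwards."""
    if not os.path.exists(path):
        print(f"Downloading {url} -> {path}")
        urllib.request.urlretrieve(url, path)
    return path

# Placeholder location -- substitute the real URL of your hosted weights.
# ckpt = fetch_checkpoint("https://example.com/my_model.ckpt", "my_model.ckpt")
```

Guarding the download behind the existence check keeps the notebook fast to re-run during a live session.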

The submissions will be judged on the technical clarity and ease of understanding of the poster and the code. A limited number of accepted submissions will be presented during the oral sessions; the rest will be presented during a poster session. All the material will be available online.

*We will accept submissions from individuals who are not the authors of the original paper, as long as they have the endorsement of an author of the original paper. The original authors must be listed in the submission.

Submit your poster along with an IPython notebook by email

Organizing committee

Important Dates

Submission deadlines
November 14th 2017, 23:59 EST (acceptance notification November 22nd)
November 21st 2017, 23:59 EST (acceptance notification November 28th)

Acceptance Notification
November 22nd or November 28th 2017, depending on the submission date

Workshop day
December 9th 2017

Subscribe to our mailing list to get updates. Send an email to, or follow us on Twitter @ismion_mltrain

Accepted Posters

The first name in bold denotes the presenter and not necessarily the author of the poster.


All the notebooks can be found in the GitHub repository

We have also uploaded the notebooks to the Azure Notebooks platform

You will need to create a Microsoft account and clone the above library. Then hit the Run button; this starts a server on the Azure cloud and runs the notebook. All the notebooks have been tested on this platform. There are always glitches, so if you find a bug, send an email to report it.





  • 09/12/2017


08:50 am - 09:00 am

Opening remarks

09:00 am - 09:45 am

Lessons learned from designing Edward

by Dustin Tran, Columbia University

09:45 am - 10:05 am

Tips and tricks of coding papers on PyTorch

by Soumith Chintala, Facebook

10:05 am - 10:25 am

Differentiable Learning of Logical Rules for Knowledge Base Reasoning

by Fan Yang, CMU

10:45 am - 11:15 am

Coding Reinforcement Learning Papers

by Shangtong Zhang, University of Alberta

11:15 am - 11:35 am

A Linear-Time Kernel Goodness-of-Fit Test (NIPS 2017 Best paper)

by Wittawat Jitkrittum, Gatsby Unit

11:35 am - 11:55 am

Imagination-Augmented Agents for Deep Reinforcement Learning

by Sébastien Racanière, DeepMind

11:55 am - 12:15 pm

Inductive Representation Learning on Large Graphs

by Will Hamilton, Stanford

12:15 pm - 12:35 pm

Probabilistic Programming with Pyro

by Noah Goodman, Uber AI

Lunch / Poster session
02:00 pm - 02:45 pm

Simple and Efficient Implementation of Neural Nets with Automatic Operation Batching

by Graham Neubig, Carnegie Mellon University

02:45 pm - 03:05 pm

Learning Texture Manifolds with the Periodic Spatial GAN

by Roland Vollgraf, Zalando

03:05 pm - 03:40 pm

A case study: implementing ID3 decision trees to be as fast as possible (MLPACK)

by Ryan Curtin,

03:40 pm - 04:00 pm

Self-Normalizing Neural Networks

by Thomas Unterthiner, Kepler University Linz

04:00 pm - 04:25 pm

Best of Both Worlds: Transferring Knowledge from Discriminative Learning to a Generative Visual Dialog Model

by Jiasen Lu, Georgia Tech

05:00 pm - 06:00 pm


Ben Athiwaratkun: Bayesian GAN in Pytorch,

Dhyani Dushyanta: A Convolutional Encoder Model for Neural Machine Translation,

Forough Arabshahi: Combining Symbolic Expressions and
Black-box Function Evaluations in Neural Programs,

Jean Kossaifi: Tensor Regression Networks with TensorLy and MXNet,

Joseph Paul Cohen: Reproducing Intuition,

Kamyar Azizzadenesheli: Efficient Exploration through Bayesian
Deep Q-Networks,

Ashish Khetan: Learning from noisy, single-labeled data,

Rose Yu: Long-Term Forecasting using Tensor-Train RNNs

Shayenne da Lu Moura: Melody Transcription System

Tschannen Michael: Fast Linear Algebra in Stacked Strassen Networks

Yang Shi: Multimodal Compact Bilinear Pooling for Visual Question Answering

Yu-Chia Chen: Improved Graph Laplacian via Geometric Consistency

06:00 pm - 06:05 pm

Closing remarks


NIPS, Long Beach Convention Center, Room 202