
This article is recommended for:

Data scientists, machine learning engineers, anyone studying artificial intelligence

Time needed to finish reading this article

10 minutes

Introduction

Hello. This is Tsuchiya from Macnica AI Research Center!

It's been really cold lately... so this time I will post a blog entry hot enough to beat the cold.

Its name is "AI Top Conference NeurIPS 2018 Summary".

Specifically, I will explain what kind of conference NeurIPS 2018, the top conference specializing in machine learning, actually is.

Without further ado, let me talk about NeurIPS 2018.

What is NeurIPS 2018?

What is NeurIPS 2018? To answer that, the table below summarizes the characteristics of the AI-related conferences introduced in the previous blog post.

AI in general

| Name | Details | Place | Season |
| --- | --- | --- | --- |
| IJCAI | The world's top conference not only for machine learning but for AI in general | Sweden | July |
| AAAI | A conference on par with IJCAI | Hawaii | January |
| JSAI | The annual national conference of the Japanese Society for Artificial Intelligence | Kagoshima | June |

Statistical Machine Learning / Deep Learning

| Name | Details | Place | Season |
| --- | --- | --- | --- |
| NeurIPS | The top machine learning conference; called NIPS until last year | Canada | December |
| ICML | A top conference on par with NeurIPS; once experiment-focused, it has shifted toward theory in recent years | Sweden | June |
| IBIS | Japan's largest machine learning workshop | Sapporo | November |

Computer Vision

| Name | Details | Place | Season |
| --- | --- | --- | --- |
| CVPR | The top conference in computer vision | America | June |
| ICCV | A conference on par with CVPR, held every other year | Italy | October |

*Researched in December 2018. All conference dates are for 2018.

NeurIPS 2018 is a top AI conference specializing in machine learning held from December 2nd to 8th, 2018.

As I wrote in the previous article, "Learning the Cutting Edge of AI from Papers," the shortest route to finding high-quality papers is to select them from conferences.
That said, there are several kinds of conferences, each with its own character, so finding the ones that suit you is also important.

This time, I would like to summarize NeurIPS 2018, a conference specializing in machine learning, which was held recently.
By the way, NeurIPS 2018 was so popular that tickets sold out in 10 minutes. The AI boom is clearly still going strong. (laughs)

The preamble has run long, so I have prepared an image that should make you feel like you understand the technology trends in machine learning in 2018.

This year's AI technology trends from NeurIPS 2018

This image visualizes the titles of the 1,011 papers accepted at NeurIPS 2018 in order to capture this year's technology trends. The colors carry no meaning at all.

* Created from https://nips.cc/Conferences/2018/Schedule?type=Poster. Frequent but meaningless words such as "via" and "using" have been removed.
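As a side note, the word count behind an image like this can be sketched in a few lines of Python. The titles below are hypothetical stand-ins (the real list would be scraped from the schedule page above), and the stop-word list is my own guess at the "meaningless words" that were removed.

```python
from collections import Counter

# Made-up example titles; in practice these would come from the
# NeurIPS 2018 poster schedule page.
titles = [
    "Scalable Bayesian Inference via Stochastic Optimization",
    "Graph Neural Networks for Reinforcement Learning",
    "Optimization Algorithms for Deep Neural Networks",
]

# Assumed stop-word list: frequent but meaningless words to drop.
STOP_WORDS = {"via", "using", "for", "of", "the", "a", "an", "on", "with", "and", "to", "in"}

words = [
    w
    for title in titles
    for w in title.lower().split()
    if w not in STOP_WORDS
]
counts = Counter(words)
top = counts.most_common(3)   # the most frequent remaining words
```

Feeding the full 1,011 titles through the same pipeline gives the frequencies that a word-cloud tool can then render.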

It's a super simple image, but do you feel like you've picked up on the trends in machine learning?
Of course, Neural Network is the most common word, but other words are also noteworthy.

Reinforcement learning, graphs, and Bayesian methods are still hot,
and there also seems to be strong research interest in more efficient optimization (Optimization) and other algorithms (Algorithms).

Trends shift a little every year, but deep learning is still a young field, so a paper accepted at a conference this year may become the de facto standard two years from now. That is why it is so important to understand the trends.

"Let's read the NeurIPS 2018 papers," I would like to say, but in fact it would be a waste to use NeurIPS 2018 only for its papers.

In the next section, we will introduce how to use NeurIPS 2018, including papers.

Learning from NeurIPS 2018 Tutorials and Accepted Papers

Until now, I have focused only on papers, but NeurIPS 2018 is not limited to papers. Its Tutorials are also very helpful, so let's make active use of them.

Then visit the NeurIPS 2018 site (http:// ).

Even if you actually look at the site, I think it's hard to find the information you want, so I prepared a capture image.
(As an aside, overseas sites such as Kaggle have a completely different UX from Japanese sites, so they are not necessarily easy for Japanese visitors to navigate at first.)

Quote: from https://nips.cc/Conferences/2018 (captured on December 21, 2018)

Tutorials are overviews of cutting-edge topics put together by researchers for a broad audience.
There are nine of them, with titles such as "Scalable Bayesian Inference".
Accepted Papers are the papers accepted at NeurIPS 2018; although only high-quality papers make the cut, there are still 1,011 of them in total.
I would like to read them all, but there is not nearly enough time, so a further selection is needed.

Some of you may have thought, "1,011 papers is not carefully selected at all!" In fact, this is a record number, with more accepted papers than in any previous year.
No wonder: this year 4,854 papers were submitted, and 1,011 of them were accepted, an acceptance rate of roughly 21%.

Also, although I did not cover them here because they did not seem immediately applicable, the NeurIPS 2018 Best Paper Awards are stunning.
There are only four of them, so if you are interested, please read them.
[Reference URL]
https://nips.cc/Conferences/2018/Awards

Let's start with Tutorials.

Three NeurIPS 2018 Tutorials

There are nine Tutorials in total at NeurIPS 2018, and all of them are interesting. Here, I will summarize three of them.

[Reference URL]
https://nips.cc/Conferences/2018/Schedule?type=Tutorial

The titles of the three tutorials I will cover are as follows:

  • Visualization for Machine Learning
  • Scalable Bayesian Inference
  • Unsupervised Deep Learning


Let's take a look at the first one.

Visualization for Machine Learning

[Presenter]
Fernanda Viégas and Martin Wattenberg of Google Brain's PAIR (People + AI Research) initiative, both data-visualization researchers

[overview]
This tutorial covers a wide range of visualization methods for machine learning.
Specifically, it explains visualization methods across three phases: analyzing training data, understanding model internals, and testing performance.
It includes many visualization methods you won't find in textbooks, so it is recommended for anyone who is unsure how to present results, or who wants an overview of visualization for cutting-edge analysis projects.

[Tutorial URL]
https://nips.cc/Conferences/2018/Schedule?showEvent=10986

[Video URL]
https://videos.videoken.com/index.php/videos/neurips-2018-turorial-session-on-visualization-for-machine-learning/

Scalable Bayesian Inference

[Presenter]
Professor David Dunson of Duke University, an authority on the development of statistical methodologies

[overview]
This tutorial explains approaches for data with very large sample sizes and very high dimensionality.

Specifically, it ranges from a review of classical methods for large sample sizes, such as the Laplace method and the Bayesian central limit theorem, to conceptual and practical approaches for scaling up the widely used Markov chain Monte Carlo method.
These methods have applications in fields such as computational advertising, genomics, and neuroscience.
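To make the Laplace method mentioned above concrete, here is a minimal sketch on made-up data: it fits a Gaussian to a Beta-Bernoulli posterior around its mode, using a grid search for the mode and a finite difference for the curvature. This is a generic illustration of the technique, not code from the tutorial.

```python
import numpy as np

k, n = 7, 10        # made-up data: 7 successes out of 10 trials
a, b = 2.0, 2.0     # Beta(2, 2) prior

def log_post(theta):
    # unnormalized log posterior of a Bernoulli rate with a Beta prior
    return (k + a - 1) * np.log(theta) + (n - k + b - 1) * np.log(1 - theta)

# 1) find the posterior mode (dense grid search, for simplicity)
grid = np.linspace(1e-4, 1 - 1e-4, 100_000)
mode = grid[np.argmax(log_post(grid))]

# 2) curvature at the mode via a central finite difference
h = 1e-5
hess = (log_post(mode + h) - 2 * log_post(mode) + log_post(mode - h)) / h**2

# Laplace approximation: posterior is roughly N(mode, -1/hess)
sigma = np.sqrt(-1.0 / hess)
```

For this conjugate example the true mode is (k + a - 1) / (n + a + b - 2), so the approximation can be checked against the exact Beta posterior; the tutorial's interest is in cases where no such closed form exists.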

[Tutorial URL]
https://nips.cc/Conferences/2018/Schedule?showEvent=10984

[Video URL]
https://videos.videoken.com/index.php/videos/scalable-bayesian-inference-neurips-2018/

Unsupervised Deep Learning

[Presenter]
DeepMind researcher Alex Graves and Marc'Aurelio Ranzato of Facebook AI Research

[overview]
Deep neural networks are a way to exploit huge amounts of unlabeled data, but the difficulty is that the objective function is hard to define. This tutorial approaches that challenge by modeling all of the data with probabilistic models.
In addition, as applied examples, it focuses on self-supervised algorithms, energy-based models, and generative models such as GANs, covering everything from open academic questions to practical approaches.

[Tutorial URL]
https://nips.cc/Conferences/2018/Schedule?showEvent=10985

[Video URL]
https://videos.videoken.com/index.php/videos/unsupervised-deep-learning-google-deepmind-facebook-artificial-intelligence-neurips-2018/

Five NeurIPS 2018 Accepted Papers

A total of 1,011 papers were accepted at NeurIPS 2018, each sharp in its own way. I have picked out and summarized five of them.
The five papers covered here are the following:

  • Pelee:A Real-Time Object Detection System on Mobile Devices
  • Deep Anomaly Detection Using Geometric Transformations
  • Evolutionary Stochastic Gradient Descent for Optimization of Deep Neural Networks
  • Representer Point Selection for Explaining Deep Neural Networks
  • Supervising Unsupervised Learning


Now let's summarize them one by one.

Pelee:A Real-Time Object Detection System on Mobile Devices

[overview]
There is a need to run convolutional neural networks even with limited computing resources (mobile devices, etc.), and networks such as MobileNet, ShuffleNet, and MobileNetV2 have been adopted for this purpose.
In this field, the evaluation metrics are speed, accuracy, and lightness (fewer parameters), and Pelee outperforms the other networks on these metrics.

[arXiv URL]
https://arxiv.org/abs/1804.06882

[GitHub URL]
https://github.com/Robert-JunWang/Pelee

Deep Anomaly Detection Using Geometric Transformations

[overview]
When it comes to anomaly detection, autoencoder-based methods are the trend, but this paper proposes applying geometric transformations instead.
Specifically, by training on geometrically transformed images, it proposes a mechanism that learns geometric features and can classify whether data is anomalous.
As a result, in the one-vs-all setting, the method significantly improved AUROC over existing approaches on all of the CIFAR-10, CIFAR-100, Fashion-MNIST, and CatsVsDogs datasets.
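The scoring mechanism can be sketched as follows. The four 90-degree rotations here stand in for the paper's much larger set of geometric transformations, and the toy nearest-rotation classifier is a hypothetical stand-in for the trained CNN; the normality score is the average softmax probability assigned to the correct transformation.

```python
import numpy as np

def transforms(img):
    # the four 90-degree rotations (a small stand-in for the full transform set)
    return [np.rot90(img, k) for k in range(4)]

def normality_score(img, classify):
    # classify(t_img) returns a softmax over the 4 transformation labels;
    # the score is the mean probability assigned to the true transformation
    probs = [classify(t) for t in transforms(img)]
    return float(np.mean([p[k] for k, p in enumerate(probs)]))

# hypothetical stand-in for a trained CNN: recognizes the rotation of one
# known reference "image" by pixel distance to its stored rotations
ref = np.arange(9, dtype=float).reshape(3, 3)
ref_rots = [np.rot90(ref, k) for k in range(4)]

def toy_classifier(img):
    logits = -np.array([np.abs(img - r).sum() for r in ref_rots])
    e = np.exp(logits - logits.max())
    return e / e.sum()

score = normality_score(ref, toy_classifier)   # near 1.0 for "normal" data
```

In the paper, images whose transformations the classifier cannot recognize get a low score and are flagged as anomalous.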

[arXiv URL]
https://arxiv.org/abs/1805.10917

[GitHub URL]
https://github.com/izikgo/AnomalyDetectionTransformations

Evolutionary Stochastic Gradient Descent for Optimization of Deep Neural Networks

[overview]
When you hear "Evolutionary," you may think "Is this some evolved form of SGD?", but that is not the case. Evolutionary Stochastic Gradient Descent combines ordinary SGD (Stochastic Gradient Descent) with a parallel, population-based evolutionary algorithm.
Experiments show that this optimization method is effective across a variety of tasks and networks, including speech recognition, image recognition, and natural language processing.
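The population-based combination can be sketched on a toy problem. Everything below (the quadratic loss, population size, learning rate, mutation scale) is a made-up illustration of the general idea, not the paper's actual algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

def loss(w):
    return float(np.sum((w - 3.0) ** 2))   # toy quadratic, minimum at w = 3

def grad(w):
    return 2.0 * (w - 3.0)

pop = [rng.normal(size=5) for _ in range(6)]   # population of 6 parameter vectors

for _ in range(20):
    # SGD phase: every candidate takes a gradient step
    pop = [w - 0.1 * grad(w) for w in pop]
    # evolution phase: keep the fittest half, refill with mutated copies
    pop.sort(key=loss)
    elite = pop[:3]
    pop = elite + [w + 0.05 * rng.normal(size=5) for w in elite]

best = min(pop, key=loss)
```

The alternation is the point: gradient steps exploit local structure, while selection and mutation let the population explore beyond a single trajectory.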

[arXiv URL]
https://arxiv.org/abs/1810.06773

Representer Point Selection for Explaining Deep Neural Networks

[overview]
In recent years, machine learning has been deployed in many places, and a growing number of studies focus not only on the accuracy and speed of predictions but also on why a prediction was made.
This paper is one result of that line of research: it proposes a way to explain a neural network's predictions by focusing on so-called representer points of the training set.
The idea is that the pre-activation prediction of a neural network can be expressed as a linear combination of representer values over the training points, which makes it possible to identify which training points matter most for a given prediction.
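As a rough illustration of "prediction as a linear combination over training points," here is the classical representer theorem in kernel ridge regression, a far simpler setting than the paper's deep networks; the data and the linear kernel are made up.

```python
import numpy as np

rng = np.random.default_rng(1)

X = rng.normal(size=(20, 3))              # made-up training inputs
y = X @ np.array([1.0, -2.0, 0.5])        # made-up targets

def kernel(A, B):
    return A @ B.T                         # linear kernel, for simplicity

lam = 0.1
K = kernel(X, X)
alpha = np.linalg.solve(K + lam * np.eye(len(X)), y)   # representer weights

x_test = rng.normal(size=(1, 3))
pred = (kernel(x_test, X) @ alpha).item()  # sum_i alpha_i * k(x_i, x_test)

# per-training-point contributions to this prediction; the largest ones
# point to the most influential training examples
contrib = alpha * kernel(x_test, X).ravel()
top = np.argsort(-np.abs(contrib))[:3]
```

The paper's contribution is to derive an analogous decomposition for deep networks, so that each training point's weight can be inspected when explaining a prediction.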

[arXiv URL]
https://arxiv.org/abs/1811.09720

Supervising Unsupervised Learning

[overview]
The problem with unsupervised learning is that there is no established way to evaluate an algorithm's performance, which makes evaluation very difficult.
This paper tries to improve on that with Meta Unsupervised Learning (MUL). Specifically, it devises a framework that uses a repository of many differently labeled datasets to transfer supervised-learning experience to new unsupervised learning problems.
As a result, experiments show that even simple algorithms perform better at unsupervised learning under this framework.

[arXiv URL]
https://arxiv.org/abs/1709.05262

Summary

As the title suggests, this time I posted a summary of NeurIPS 2018, the top AI conference.

Also, the papers I picked this time are on the tamer side; there are many more daring papers, so please check the NeurIPS 2018 accepted papers page (https://nips.cc/Conferences/2018/Schedule?type=Poster).

In the next blog post, I will implement one of these papers and verify how useful it is, so please look forward to it.