2020 - Meeting Schedule
Date (MM/DD) Time Location Presenter/Title
11/10 14:20-16:00 online (see announcement mail)
Yusuke Sekikawa
[@Denso IT]
Event-based Vision: Emerging Sensors and Processors
Ikuro Sato
[@Denso IT, Tokyo Tech]
Integer-Decomposition Method for Network Compression
10/13 14:20-16:00 online (see announcement mail)
Koki Madono
[@AIST, Waseda Univ]
Block-wise Scrambled Image Recognition Using Adaptation Network
Toshiki Masuyama
[@AIST, Waseda Univ]
Self-supervised Neural Audio-Visual Sound Source Localization via Probabilistic Spatial Modeling
08/11 15:25-17:05 online (see announcement mail)
Atsushi Nitanda
Optimization of neural networks in function spaces
Kenta Oono
[@Suzuki Lab/PFN]
Theoretical analysis of graph neural networks: expressive power, optimization, and generalization
07/14 15:25-17:05 online (see announcement mail)
Hanna Hoshino
[@Yokota Lab]
Towards reproducible and reliable experiments: Weights & Biases
Shun Iwase
[@Yokota Lab]
RePOSE: Render-and-Compare Iterative Pose Refinement Network for 6DoF Pose Estimation
06/09 15:25-17:05 online (see announcement mail)
Mohamed Wahib
Scaling Distributed Deep Learning Workloads beyond the Memory Capacity with KARMA
Yosuke Oyama
[@Matsuoka Lab]
The Case for Strong Scaling in Deep Learning: Training Large 3D CNNs with Hybrid Parallelism
05/12 15:25-17:05 online (see announcement mail)
Ruidong Jin
[M2@Murata Lab]
An Application of Bipartite Graph Convolutional Networks on Metropolitan Emergency Medical Service Demand Analysis
Ryuta Matsuno
[M2@Murata Lab]
Optimizing convolutional kernels of graphs for better performance
04/14 15:25-17:05 online (see announcement mail)
Asako Kanezaki
RotationNet for Joint Object Categorization and Unsupervised Pose Estimation from Multi-view Images
Rei Kawakami
[@Tokyo Tech/Denso IT Laboratory]
Improving Robustness in Recognition: Motion, open-set, and multi-task learning
03/10 15:05-16:35 online (see announcement mail)
Ikuro Sato
[@Denso IT Lab]
Classifier Anonymization Technique for Co-Adaptation Breaking
Hiroki Naganuma
[@Yokota Lab]
Trends in Deep Learning Theory at NeurIPS 2019
02/12 15:05-16:35 GSIC 2F meeting room
Shuhei Watanabe
Speeding up of the Nelder-Mead Method by Data-driven Speculative Execution
Yoshihiko Ozaki
Practical Deep Neural Network Performance Prediction for Hyperparameter Optimization
Yuki Tanigaki
Paper presentation of "NAS-Bench-101: Towards Reproducible Neural Architecture Search"
01/07 15:05-16:35 W8E 10F large meeting room
Masato Motomura
[@Motomura Lab]
AI Computing Architectures: Recent Works and Future Efforts