Takashi Ishida / 石田 隆
Email: ishi at k.u-tokyo dot ac dot jp
I am a Lecturer at the Department of Complexity Science and Engineering, Graduate School of Frontier Sciences, The University of Tokyo.
I am also affiliated with the Department of Computer Science, Graduate School of Information Science and Technology, and the Department of Information Science, Faculty of Science.
I received my PhD from the University of Tokyo in 2021, advised by Prof. Masashi Sugiyama.
Prior to that, I received an MSc from the University of Tokyo in September 2017 and a Bachelor of Economics from Keio University in March 2013.
T. Ishida, I. Yamane, N. Charoenphakdee, G. Niu, M. Sugiyama.
Is the Performance of My Deep Network Too Good to Be True? A Direct Approach to Estimating the Bayes Error in Binary Classification.
In Proceedings of the Eleventh International Conference on Learning Representations (ICLR2023).
Note: This paper was selected for oral (notable-top-5%) presentation.
[arXiv]
[OpenReview]
[code]
[Fashion-MNIST-H (Papers with Code)]
[Video]
I. Yamane, Y. Chevaleyre, T. Ishida, F. Yger.
Mediated Uncoupled Learning and Validation with Bregman Divergences: Loss Family with Maximal Generality.
In Proceedings of the 26th International Conference on Artificial Intelligence and Statistics (AISTATS2023).
[paper]
[code]
[video]
T. Ishida, I. Yamane, T. Sakai, G. Niu, M. Sugiyama.
Do We Need Zero Training Loss After Achieving Zero Training Error?
In Proceedings of the Thirty-seventh International Conference on Machine Learning (ICML2020).
[paper]
[code]
[video]
T. Ishida, G. Niu, A. K. Menon, and M. Sugiyama.
Complementary-label learning for arbitrary losses and models.
In Proceedings of the Thirty-sixth International Conference on Machine Learning (ICML2019).
[paper]
[poster]
[slides]
[video]
[code]
T. Ishida, G. Niu, and M. Sugiyama.
Binary classification from positive-confidence data.
In Advances in Neural Information Processing Systems 31 (NeurIPS2018).
Note: This paper was selected for spotlight presentation.
[paper]
[poster]
[slides]
[video]
[code]
[Press Release]
[ScienceDaily]
[PHYS.ORG]
[ASIAN SCIENTISTS]
[ISE Magazine]
[RIKEN RESEARCH]
[日刊工業新聞]
[ITmedia]
T. Ishida, G. Niu, W. Hu, and M. Sugiyama.
Learning from complementary labels.
In Advances in Neural Information Processing Systems 30 (NeurIPS2017).
[paper]
[日刊工業新聞]
Z. Lu, C. Xu, B. Du, T. Ishida, L. Zhang, & M. Sugiyama.
LocalDrop: A hybrid regularization for deep neural networks.
IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol.44, No.7, pp.3590-3601, 2022.
[paper]
H. Ishiguro, T. Ishida, & M. Sugiyama.
Learning from Noisy Complementary Labels with Robust Loss Functions.
IEICE Transactions on Information and Systems, Vol.E105-D, No.2, Feb. 2022.
[paper]
T. Ishida.
Forecasting Nikkei 225 Returns By Using Internet Search Frequency Data.
Securities Analysts Journal, Vol.52, No.6, pp.83-93 (selected as Research Notes), 2014.
M. Sugiyama, H. Bao, T. Ishida, N. Lu, T. Sakai, & G. Niu.
Machine Learning from Weak Supervision: An Empirical Risk Minimization Approach.
Adaptive Computation and Machine Learning series, The MIT Press, 2022.
[link]
RIKEN Center for Advanced Intelligence Project, Tokyo, Japan.
From May 2021: Visiting Scientist, Imperfect Information Learning Team
The University of Tokyo, Chiba & Tokyo, Japan.
From April 2021: Lecturer, Department of Complexity Science and Engineering, Graduate School of Frontier Sciences
From April 2021: Affiliated Lecturer, Department of Computer Science, Graduate School of Information Science and Technology
From April 2021: Affiliated Lecturer, Department of Information Science, Faculty of Science
Amazon, Seattle, US.
July 2020 – October 2020: Applied Scientist Intern, Performance Advertising Tech Group
Sumitomo Mitsui DS Asset Management, Tokyo, Japan.
July 2017 – July 2019: Assistant Manager
April 2013 – June 2017: Associate
JST ACT-X Researcher, from 2020
Toyota/Dwango AI Scholarship, 2020 – 2021
JSPS Research Fellowship for Young Scientists (DC2), 2020 – 2021
Funai Information Technology Award for Young Researchers, 2022 (received in 2023)
IEICE TC-IBISML Research Award Finalist, 2020 (received in 2021)
Dean's Award for Outstanding Achievement, Graduate School of Frontier Sciences, 2021
Award Finalist, IBIS2020
Financial Assistance Award, NeurIPS 2020
Travel Award, ICML 2019
Travel Award, NeurIPS 2018
Travel Award, NeurIPS 2017
FY2022 – FY2023: Member, IEICE, Information-Based Induction Sciences and Machine Learning (IBISML) Technical Group
2023: ICLR
2022: ICLR, AISTATS, ICML, NeurIPS
2021: NeurIPS, ACML, ICLR, UAI, ICML
2020: NeurIPS (top 10% reviewer), ICML, ICLR, AAAI, AISTATS, UAI, ACML
2019: NeurIPS (top 50% reviewer), ICML, AAAI, AISTATS, UAI, ACML
IEEE Transactions on Pattern Analysis and Machine Intelligence
IEEE Transactions on Image Processing
Journal of Information Processing
Machine Learning
Artificial Intelligence (AIJ)
Transactions on Machine Learning Research (TMLR)
3rd edition of Reproducibility Challenge @ NeurIPS 2019
IJCAI 2021 Workshop on Weakly Supervised Representation Learning
Advanced Data Analysis (Graduate) with Prof. Masashi Sugiyama (in Japanese, UTokyo): 2021 S1S2
Statistical Machine Learning (Undergraduate) with Prof. Issei Sato and Prof. Masashi Sugiyama (in Japanese, UTokyo): 2021 S1S2, 2022 S1S2
Statistics and Optimization (Undergraduate) with Prof. Issei Sato and Prof. Masashi Sugiyama (in Japanese, UTokyo): 2021 A1A2, 2022 A1A2
Intelligent Systems (Undergraduate) with Prof. Issei Sato, Prof. Masashi Sugiyama, and Prof. Yusuke Miyao (in Japanese, UTokyo): 2021 A1A2, 2022 A1A2