I am a Research Scientist with the Imperfect Information Learning Team at RIKEN AIP.
Additionally, I am a Lecturer at The University of Tokyo, co-running the Machine Learning and Statistical Data Analysis Lab (Sugiyama-Yokoya-Ishida Lab).
My interests lie in machine learning. For example, I have worked on weakly supervised learning (such as complementary-label learning), methods to effectively train neural networks (such as flooding), and estimating the upper bound of prediction performance (such as Bayes error estimation).
I earned my PhD from the University of Tokyo in 2021, advised by Prof. Masashi Sugiyama.
During my PhD, I completed an Applied Scientist internship at Amazon.com and was fortunate to become a PhD Fellow at Google and a Research Fellow at JSPS (DC2).
Prior to that, I spent several years in the finance industry as an Assistant Manager at Sumitomo Mitsui DS Asset Management.
I received an MSc from the University of Tokyo in 2017 and a Bachelor of Economics from Keio University in 2013.
Email: ishi at k.u-tokyo dot ac dot jp
Links: Github, X (@tksii), Google Scholar, Japanese page (researchmap)
I am no longer accepting master's and PhD students at the Department of Complexity Science and Engineering or the Department of Computer Science. (I continue to jointly accept master's students with Prof. Sugiyama at the Department of Computer Science.)
T. Ishida, I. Yamane, N. Charoenphakdee, G. Niu, M. Sugiyama.
Is the Performance of My Deep Network Too Good to Be True? A Direct Approach to Estimating the Bayes Error in Binary Classification.
In Proceedings of the Eleventh International Conference on Learning Representations (ICLR2023).
[arXiv]
[OpenReview]
[code]
[Fashion-MNIST-H (Papers with Code)]
[Video]
Selected for oral (notable-top-5%) presentation!
I. Yamane, Y. Chevaleyre, T. Ishida, F. Yger.
Mediated Uncoupled Learning and Validation with Bregman Divergences: Loss Family with Maximal Generality.
In Proceedings of the 26th International Conference on Artificial Intelligence and Statistics (AISTATS2023).
[paper]
[code]
[video]
Z. Lu, C. Xu, B. Du, T. Ishida, L. Zhang, & M. Sugiyama.
LocalDrop: A hybrid regularization for deep neural networks.
IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol.44, No.7, pp.3590-3601, 2022.
[paper]
H. Ishiguro, T. Ishida, & M. Sugiyama.
Learning from Noisy Complementary Labels with Robust Loss Functions.
IEICE Transactions on Information and Systems, Vol.E105-D, No.2, pp.-, Feb. 2022.
[paper]
T. Ishida, I. Yamane, T. Sakai, G. Niu, M. Sugiyama.
Do We Need Zero Training Loss After Achieving Zero Training Error?
In Proceedings of the Thirty-seventh International Conference on Machine Learning (ICML2020).
[paper]
[code]
[video]
T. Ishida, G. Niu, A. K. Menon, and M. Sugiyama.
Complementary-label learning for arbitrary losses and models.
In Proceedings of the Thirty-sixth International Conference on Machine Learning (ICML2019).
[paper]
[poster]
[slides]
[video]
[code]
T. Ishida, G. Niu, and M. Sugiyama.
Binary classification from positive-confidence data.
In Advances in Neural Information Processing Systems 31 (NeurIPS2018).
[paper]
[poster]
[slides]
[video]
[code]
[Press Release]
[ScienceDaily]
[PHYS.ORG]
[ASIAN SCIENTISTS]
[ISE Magazine]
[RIKEN RESEARCH]
[日刊工業新聞]
[ITmedia]
Selected for spotlight presentation!
T. Ishida, G. Niu, W. Hu, and M. Sugiyama.
Learning from complementary labels.
In Advances in Neural Information Processing Systems 30 (NeurIPS2017).
[paper]
[日刊工業新聞]
T. Ishida.
Forecasting Nikkei 225 Returns by Using Internet Search Frequency Data.
Securities Analysts Journal, Vol.52, No.6, pp.83-93, 2014.
Selected as Research Notes.
M. Sugiyama, H. Bao, T. Ishida, N. Lu, T. Sakai, & G. Niu.
Machine Learning from Weak Supervision: An Empirical Risk Minimization Approach.
Adaptive Computation and Machine Learning series, The MIT Press, 2022.
[link]
Co-I: Grant-in-Aid for Scientific Research (B), JSPS, 2022--2025
PI: Grant-in-Aid for Early-Career Scientists, JSPS, 2022--2026
PI: Frontier of mathematics and information science, ACT-X, JST, 2020--2023
PI: Grant-in-Aid for JSPS Fellows, JSPS, 2020--2021
Funai Information Technology Award for Young Researchers, 2022 (received in 2023)
IEICE TC-IBISML Research Award Finalist, 2020 (received in 2021)
Dean's Award for Outstanding Achievement, Graduate School of Frontier Sciences, 2021
Toyota/Dwango AI Scholarship, 2020--2021
Award Finalist, IBIS2020
Top 10% reviewer, NeurIPS 2020
JSPS Research Fellowship for Young Scientists (DC2), 2020--2021
Top 50% reviewer, NeurIPS 2019
Committee member: IEICE, Information-Based Induction Sciences and Machine Learning (IBISML) Technical Group, FY2022--FY2023
Workshop organizer: PC member, IBIS2023; Executive Group, TrustML Young Scientist Seminars; Organizer, NeurIPS Meetup Japan 2021.
Conference PC/reviewer: [2024] ICLR, AISTATS, [2023] ICLR, [2022] ICLR, AISTATS, ICML, NeurIPS, [2021] NeurIPS, ACML, ICLR, UAI, ICML, [2020] NeurIPS (top 10% reviewer), ICML, ICLR, AAAI, AISTATS, UAI, ACML, [2019] NeurIPS (top 50% reviewer), ICML, AAAI, AISTATS, UAI, ACML
Journal reviewer: IEEE Transactions on Pattern Analysis and Machine Intelligence, IEEE Transactions on Image Processing, Journal of Information Processing, Machine Learning, Artificial Intelligence (AIJ), Transactions on Machine Learning Research (TMLR)
Workshop reviewer: 3rd edition of Reproducibility Challenge @ NeurIPS 2019, IJCAI 2021 Workshop on Weakly Supervised Representation Learning
Advanced Data Analysis (Graduate) with Prof. Masashi Sugiyama: 2021 S1S2 (Japanese), 2023 S1S2 (English)
Statistical Machine Learning (Undergraduate) with Prof. Issei Sato and Prof. Masashi Sugiyama (in Japanese): 2021 S1S2, 2022 S1S2, 2023 S1S2
Statistics and Optimization (Undergraduate) with Prof. Issei Sato and Prof. Masashi Sugiyama (in Japanese): 2021 A1A2, 2022 A1A2, 2023 A1A2
Intelligent Systems (Undergraduate) with Prof. Issei Sato, Prof. Masashi Sugiyama, and Prof. Yusuke Miyao (in Japanese): 2021 A1A2, 2022 A1A2, 2023 A1A2