Takashi Ishida / 石田 隆

  • Lecturer, Department of Complexity Science and Engineering, Graduate School of Frontier Sciences, The University of Tokyo.
  • Affiliated Lecturer, Department of Computer Science, Graduate School of Information Science and Technology, The University of Tokyo.
  • Affiliated Lecturer, Department of Information Science, Faculty of Science, The University of Tokyo.
  • Visiting Scientist, Imperfect Information Learning Team, RIKEN AIP.

Email: ishi at k.u-tokyo dot ac dot jp
URL: https://takashiishida.github.io
Japanese page: researchmap
Other links: Sugiyama-Yokoya-Ishida Lab, Google Scholar

Biography

I am a Lecturer at the Department of Complexity Science and Engineering, Graduate School of Frontier Sciences, The University of Tokyo. I am also affiliated with the Department of Computer Science, Graduate School of Information Science and Technology, and the Department of Information Science, Faculty of Science. I received my PhD from the University of Tokyo in 2021, advised by Prof. Masashi Sugiyama. Prior to that, I received my MSc from the University of Tokyo in September 2017 and my Bachelor of Economics from Keio University in March 2013.

Publications

Conference Papers (Full Review)

  1. T. Ishida, I. Yamane, N. Charoenphakdee, G. Niu, M. Sugiyama.
    Is the Performance of My Deep Network Too Good to Be True? A Direct Approach to Estimating the Bayes Error in Binary Classification.
    In Proceedings of the Eleventh International Conference on Learning Representations (ICLR2023).
    Note: This paper was selected for oral (notable-top-5%) presentation.
    [arXiv] [OpenReview] [code] [Fashion-MNIST-H (Papers with Code)] [Video]

  2. I. Yamane, Y. Chevaleyre, T. Ishida, F. Yger.
    Mediated Uncoupled Learning and Validation with Bregman Divergences: Loss Family with Maximal Generality.
    In Proceedings of the 26th International Conference on Artificial Intelligence and Statistics (AISTATS2023).
    [paper] [code] [video]

  3. T. Ishida, I. Yamane, T. Sakai, G. Niu, M. Sugiyama.
    Do We Need Zero Training Loss After Achieving Zero Training Error?
    In Proceedings of the Thirty-seventh International Conference on Machine Learning (ICML2020).
    [paper] [code] [video]

  4. T. Ishida, G. Niu, A. K. Menon, and M. Sugiyama.
    Complementary-Label Learning for Arbitrary Losses and Models.
    In Proceedings of the Thirty-sixth International Conference on Machine Learning (ICML2019).
    [paper] [poster] [slides] [video] [code]

  5. T. Ishida, G. Niu, and M. Sugiyama.
    Binary Classification from Positive-Confidence Data.
    In Advances in Neural Information Processing Systems 31 (NeurIPS2018).
    Note: This paper was selected for spotlight presentation.
    [paper] [poster] [slides] [video] [code] [Press Release] [ScienceDaily] [PHYS.ORG] [ASIAN SCIENTISTS] [ISE Magazine] [RIKEN RESEARCH] [日刊工業新聞] [ITmedia]

  6. T. Ishida, G. Niu, W. Hu, and M. Sugiyama.
    Learning from Complementary Labels.
    In Advances in Neural Information Processing Systems 30 (NeurIPS2017).
    [paper] [日刊工業新聞]

Journal Papers (Full Review)

  1. Z. Lu, C. Xu, B. Du, T. Ishida, L. Zhang, & M. Sugiyama.
    LocalDrop: A Hybrid Regularization for Deep Neural Networks.
    IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol.44, No.7, pp.3590-3601, 2022.
    [paper]

  2. H. Ishiguro, T. Ishida, & M. Sugiyama.
    Learning from Noisy Complementary Labels with Robust Loss Functions.
    IEICE Transactions on Information and Systems, Vol.E105-D, No.2, pp.-, Feb. 2022.
    [paper]

  3. T. Ishida.
    Forecasting Nikkei 225 Returns by Using Internet Search Frequency Data.
    Securities Analysts Journal, Vol.52, No.6, pp.83-93, 2014 (selected as Research Notes).

Books

  1. M. Sugiyama, H. Bao, T. Ishida, N. Lu, T. Sakai, & G. Niu.
    Machine Learning from Weak Supervision: An Empirical Risk Minimization Approach.
    Adaptive Computation and Machine Learning series, The MIT Press, 2022.
    [link]

Experience

Grants and Fellowships

Awards

Professional Service

Committee

Workshop Organizer

Program Committee/Reviewer

Journal Reviewer

Workshop Reviewer

Courses