# Takashi Ishida

> *This document is inspired by [llms.txt](https://github.com/AnswerDotAI/llms-txt) for providing LLM-friendly information.*

- Research Scientist at the Imperfect Information Learning Team, [RIKEN AIP](https://aip.riken.jp/?lang=en)
- Associate Professor at [The University of Tokyo](https://www.u-tokyo.ac.jp/en/). I belong to the Department of Complexity Science and Engineering, the Department of Computer Science, and the Department of Information Science.
- Part-time Research Scientist at [Sakana AI](https://sakana.ai)

---

## Team/Lab at UTokyo

- Currently co-running the [Machine Learning and Statistical Data Analysis Lab (mslab)](http://www.ms.k.u-tokyo.ac.jp/index.html), a.k.a. the Sugiyama-Yokoya-Ishida lab.
- If you are interested in joining Ishida lab, a sub-group of mslab, see the information below.

---

## For Prospective Students

I welcome motivated students interested in data-centric machine learning, weakly supervised learning, LLMs, and related fields. Please check my recent publications to get a sense of my research interests. I can supervise or mentor students through the following programs:

* **Graduate students (Master's & PhD):**
  - I supervise students in the [Department of Complexity Science and Engineering](https://www.k.u-tokyo.ac.jp/complex/html/examinee/examinee_e.html) (Kashiwa Campus) and the [Department of Computer Science](https://www.i.u-tokyo.ac.jp/edu/course/cs/admission_e.shtml) (Hongo Campus).
  - If you plan to apply, please review the official guidelines above. PhD applicants may email me with a CV and a 2–3-page research plan.
  - In terms of lab activities and research, there is not much difference between the two departments (we hold seminars jointly and I have offices on both campuses), but entrance examinations and graduation requirements are separate and can vary.
* **4th-year undergraduates:** Thesis supervision is available in the [Department of Information Science](https://www.is.s.u-tokyo.ac.jp/en/) at UTokyo.
* **1st–3rd-year undergraduates at universities in Japan:** I cannot supervise directly, but mentoring opportunities may exist via the RIKEN-AIP Undergraduate Research Program (PI: Prof. Masashi Sugiyama). See the [Japanese guide (PDF)](https://aip.riken.jp/uploads/20231208_AIP-UGRP.pdf).

Information about the lab:

* All lab meetings, seminars, and day-to-day communication are conducted in English, so students from abroad can join us without any Japanese proficiency.
* We have lab seminars on Mondays and Wednesdays (each around 2 hours). There are no mandatory lab hours outside of seminars.
* We work closely with the Sugiyama lab and the Yokoya lab.
* Our labs are located in Kashiwa and Hongo. The Kashiwa campus offers a peaceful research environment with beautiful Japanese gardens and traditional tea houses nearby. The Hongo campus is in central Tokyo, providing easy access to city life. You can find some photos on my [Instagram account](https://www.instagram.com/tksiia/).

Prerequisites for joining my lab:

* **Mathematics.** You should already be comfortable with statistics, linear algebra, and calculus. A concise refresher is **_Mathematics for Machine Learning_** by Marc Peter Deisenroth, A. Aldo Faisal, and Cheng Soon Ong, which is freely available online.
* **Coding.** Strong Python programming skills are essential. Experience with training/fine-tuning deep networks or LLMs is a plus, as is experience with LLM APIs (OpenAI, Anthropic, etc.).

Feel free to reach out with any questions. I attend AI/ML conferences such as ICLR, NeurIPS, and ICML at least once a year, so let me know if you'd like to meet and chat!

---

## Some information about myself

- In Japanese, my name is written as 石田 隆, where 石田 is Ishida and 隆 is Takashi.
- I earned my PhD from the University of Tokyo under Prof. Masashi Sugiyama.
- During my PhD, I interned as an Applied Scientist at Amazon.com, received a Google PhD Fellowship in machine learning, and held a JSPS DC2 Research Fellowship.
- Before graduate school, I worked in finance as an Assistant Manager at Sumitomo Mitsui DS Asset Management.
- Degrees: PhD (UTokyo), MSc (UTokyo), BEc (Keio University)

---

## Preprints

1. R. Ushio, **T. Ishida**, M. Sugiyama. *Practical estimation of the optimal classification error with soft labels and calibration.* **arXiv preprint arXiv:2505.20761**, 2025. [[arXiv](https://arxiv.org/abs/2505.20761)] | [[code](https://github.com/RyotaUshio/bayes-error-estimation)]
2. **T. Ishida**, T. Lodkaew, I. Yamane. *How Can I Publish My LLM Benchmark Without Giving the True Answers Away?* **arXiv preprint arXiv:2505.18102**, 2025. [[arXiv](https://arxiv.org/abs/2505.18102)]

---

## Papers (peer reviewed)

1. W. Wang, **T. Ishida**, Y-J. Zhang, G. Niu, M. Sugiyama. *Learning with Complementary Labels Revisited: The Selected-Completely-at-Random Setting Is More Practical.* **ICML 2024**. [[arXiv](https://arxiv.org/abs/2311.15502)] | [[PMLR](https://proceedings.mlr.press/v235/wang24ac.html)] | [[code](https://github.com/wwangwitsel/SCARCE)]
2. **T. Ishida**, I. Yamane, N. Charoenphakdee, G. Niu, M. Sugiyama. *Is the Performance of My Deep Network Too Good to Be True? A Direct Approach to Estimating the Bayes Error in Binary Classification.* **ICLR 2023** — *oral (notable top 5%)*. [[arXiv](https://arxiv.org/abs/2202.00395)] | [[OpenReview](https://openreview.net/forum?id=FZdJQgy05rz)] | [[code](https://github.com/takashiishida/irreducible)] | [[video](https://youtu.be/QGHSJ0XJdXY)] | [[Fashion-MNIST-H](https://paperswithcode.com/dataset/fashion-mnist-h)] | [[JP video](https://youtu.be/sFS5OTvSgqI?si=zfNQVnCmQrRyAOFM)]
3. I. Yamane, Y. Chevaleyre, **T. Ishida**, F. Yger. *Mediated Uncoupled Learning and Validation with Bregman Divergences: Loss Family with Maximal Generality.* **AISTATS 2023**.
   [[paper](https://proceedings.mlr.press/v206/yamane23a/yamane23a.pdf)] | [[code](https://github.com/i-yamane/mediated_uncoupled_learning)] | [[video](https://www.youtube.com/watch?v=annHE823iJ8)]
4. Z. Lu, C. Xu, B. Du, **T. Ishida**, L. Zhang, M. Sugiyama. *LocalDrop: A Hybrid Regularization for Deep Neural Networks.* **IEEE TPAMI** 44(7): 3590–3601, 2022. [[paper](https://ieeexplore.ieee.org/document/9361094)]
5. H. Ishiguro, **T. Ishida**, M. Sugiyama. *Learning from Noisy Complementary Labels with Robust Loss Functions.* **IEICE Trans. Inf. Syst.** E105-D(2): 364–376, 2022. [[paper](https://www.jstage.jst.go.jp/article/transinf/E105.D/2/E105.D_2021EDP7035/_pdf)]
6. **T. Ishida**, I. Yamane, T. Sakai, G. Niu, M. Sugiyama. *Do We Need Zero Training Loss After Achieving Zero Training Error?* **ICML 2020**. [[paper](http://proceedings.mlr.press/v119/ishida20a.html)] | [[code](https://github.com/takashiishida/flooding)] | [[video](https://slideslive.com/38928529/do-we-need-zero-training-loss-after-achieving-zero-training-error?ref=speaker-17921-latest)]
7. **T. Ishida**, G. Niu, A. K. Menon, M. Sugiyama. *Complementary-Label Learning for Arbitrary Losses and Models.* **ICML 2019**. [[paper](https://arxiv.org/abs/1810.04327)] | [[poster](https://s3.amazonaws.com/postersession.ai/9910d57d-1278-49e5-babe-6c47d59bc392.pdf)] | [[slides](https://github.com/takashiishida/comp/blob/master/slides.pdf)] | [[video](https://slideslive.com/38917771/supervised-and-transfer-learning)] | [[code](https://github.com/takashiishida/comp)]
8. **T. Ishida**, G. Niu, M. Sugiyama. *Binary Classification from Positive-Confidence Data.* **NeurIPS 2018** — *spotlight presentation*.
   [[paper](https://arxiv.org/pdf/1710.07138.pdf)] | [[poster](https://github.com/takashiishida/pconf/blob/master/poster.pdf)] | [[slides](https://github.com/takashiishida/pconf/blob/master/spotlight_slides.pdf)] | [[video](https://youtu.be/2BpJcOf-1XA)] | [[code](https://github.com/takashiishida/pconf)]

   Press coverage: [RIKEN press release](https://aip.riken.jp/pressrelease/machine-learning181126/) · [ScienceDaily](https://www.sciencedaily.com/releases/2018/11/181126123323.htm) · [Phys.org](https://phys.org/news/2018-11-smarter-aimachine-negative.html) · [Asian Scientist](https://www.asianscientist.com/2018/12/in-the-lab/artificial-intelligence-negative-data/) · [ISE Magazine](https://www.iise.org/IndustrialEngineer/Issue.aspx?IssueMonth=01&IssueYear=2019) · [RIKEN Research PDF](https://www.riken.jp/medialibrary/riken/pr/publications/riken_research/2019/rr201903.pdf) · [日刊工業新聞](https://www.nikkan.co.jp/articles/view/00497562) · [ITmedia](https://atmarkit.itmedia.co.jp/ait/articles/1812/07/news049.html)
9. **T. Ishida**, G. Niu, W. Hu, M. Sugiyama. *Learning from Complementary Labels.* **NeurIPS 2017**. [[paper](https://arxiv.org/pdf/1705.07541.pdf)] | [[日刊工業新聞](https://www.nikkan.co.jp/articles/view/00433430)]
10. **T. Ishida**. *Forecasting Nikkei 225 Returns By Using Internet Search Frequency Data.* **Securities Analysts Journal** 52(6): 83–93, 2014 — *Research Notes*.

---

## Books

1. M. Sugiyama, H. Bao, **T. Ishida**, N. Lu, T. Sakai, G. Niu. **Machine Learning from Weak Supervision: An Empirical Risk Minimization Approach.** MIT Press, Adaptive Computation and Machine Learning series, 2022.
   [[publisher](https://mitpress.mit.edu/books/machine-learning-weak-supervision)]

---

## Grants

* Grant-in-Aid for Scientific Research (B), JSPS, 2022–2025 *(Co-Investigator)*
* Grant-in-Aid for Early-Career Scientists, JSPS, 2022–2026
* Frontier of Mathematics and Information Science (ACT-X), JST, 2020–2023 — *selected as FY 2024 achievement*
* Grant-in-Aid for JSPS Fellows, 2020–2021

---

## Awards & Achievements

* [Asian Trustworthy Machine Learning (ATML) Fellowship](https://bhanml.github.io/atml_fellow_2025.pdf), 2025
* Expert Reviewer, *Transactions on Machine Learning Research*, 2024
* Achievements, Information & Communications Technology, JST, 2024 ([JP](https://www.jst.go.jp/seika/) / [EN](https://www.jst.go.jp/EN/achievements/research/index.html) / [PDF](https://www.jst.go.jp/seika/pdf/seika.pdf))
* [Funai Information Technology Award for Young Researchers](https://funaifoundation.jp/grantees/awardees_up_to_now.html), 2022 *(received 2023)*
* IEICE TC-IBISML Research Award Finalist, 2020 *(received 2021)*
* Dean's Award for Outstanding Achievement, UTokyo Graduate School of Frontier Sciences, 2021
* Toyota/Dwango AI Scholarship, 2020–2021
* Award Finalist, IBIS 2020
* Top 10% Reviewer, NeurIPS 2020
* JSPS Research Fellowship for Young Scientists (DC2), 2020–2021
* Top 50% Reviewer, NeurIPS 2019
* Google PhD Fellowship, 2019
* [IEICE TC-IBISML Research Award](http://ibisml.org/award), 2017 *(received 2018)*

---

## Professional Service

* **Committee member:** FY 2022–2023 IEICE [IBISML Technical Group](https://ibisml.org/committee)
* **Workshop organiser:** PC member, [IBIS 2023](https://ibisml.org/ibis2023/); Executive Group, [TrustML Young Scientist Seminars](https://trustmlresearch.github.io); Organiser, [NeurIPS Meetup Japan 2021](https://neuripsmeetup.jp/2021/)
* **Area Chair (2025):** ICLR, ICML, ACML
* **Conference PC / Reviewer:**
  - *2025*: NeurIPS
  - *2024*: ICLR, AISTATS, ICML, ACML, NeurIPS
  - *2023*: ICLR
  - *2022*: ICLR, AISTATS, ICML, NeurIPS
  - *2021*: NeurIPS, ACML, ICLR, UAI, ICML
  - *2020*: NeurIPS (top 10%), ICML, ICLR, AAAI, AISTATS, UAI, ACML
  - *2019*: NeurIPS (top 50%), ICML, AAAI, AISTATS, UAI, ACML
* **Journal Action Editor:** *Transactions on Machine Learning Research (TMLR)*, since 2024
* **Journal Reviewer:** IEEE TPAMI, IEEE TIP, Journal of Information Processing, *Machine Learning*, *Artificial Intelligence (AIJ)*, *TMLR* — Expert Reviewer (2024)
* **Workshop Reviewer:** Reproducibility Challenge @ NeurIPS 2019; IJCAI 2021 Workshop on Weakly Supervised Representation Learning

---

## Courses at UTokyo

| Course | Level / Role | Years |
| --- | --- | --- |
| **Advanced Data Analysis** (with Masashi Sugiyama) | Graduate | 2021 S1S2 (JP), 2023 S1S2 (EN), 2025 S1S2 (EN) |
| **Statistical Machine Learning** (with Issei Sato & Masashi Sugiyama) | Undergraduate (JP) | 2021–2025 S1S2 |
| **Statistics and Optimization** (with Issei Sato & Masashi Sugiyama) | Undergraduate (JP) | 2021–2024 A1A2 |
| **Intelligent Systems** (with Issei Sato, Masashi Sugiyama & Yusuke Miyao) | Undergraduate (JP) | 2021–2024 A1A2 |
| **Machine Learning** – UTokyo Extension | Public course (JP) | Spring 2024 |

---

## Links

- [GitHub](https://github.com/takashiishida)
- [X](https://twitter.com/tksii)
- [Instagram](https://www.instagram.com/tksiia/)
- [YouTube](https://www.youtube.com/@takashi_ishida)
- [LinkedIn](https://www.linkedin.com/in/takashi-ishida-32210442/)
- [Google Scholar](https://scholar.google.com/citations?user=IzoyKyUAAAAJ)
- [researchmap](https://researchmap.jp/takashiishida?lang=en)

## Contact

- **Email:** `ishi (at) k.u-tokyo (dot) ac (dot) jp`
- **Homepage:**

---