Hi, I’m Daiki Chijiwa

Me

About

I have been a researcher at NTT (Computer and Data Science Laboratories, Japan) since 2019. Previously, I majored in mathematics and studied complex algebraic geometry and Hodge theory during my master's program.

Current research interests: neural networks, meta-learning, statistical learning theory

Brief CV

Publications

Preprints

  1. M. Yamada, T. Yamashita, S. Yamaguchi, D. Chijiwa, Revisiting Permutation Symmetry for Merging Models between Different Datasets, arXiv:2306.05641
  2. D. Chijiwa, Transferring Learning Trajectories of Neural Networks [animation], arXiv:2305.14122
  3. S. Yamaguchi, S. Kanai, A. Kumagai, D. Chijiwa, H. Kashima, Transfer Learning with Pre-trained Conditional Generative Models, arXiv:2204.12833

Journals / International Conferences

  1. S. Yamaguchi, D. Chijiwa, S. Kanai, A. Kumagai, H. Kashima, Regularizing Neural Networks with Meta-Learning Generative Models, Advances in Neural Information Processing Systems (NeurIPS), 2023
  2. D. Chijiwa, S. Yamaguchi, A. Kumagai, Y. Ida, Meta-ticket: Finding optimal subnetworks for few-shot learning within randomly initialized neural networks, Advances in Neural Information Processing Systems (NeurIPS), 2022
  3. D. Chijiwa, S. Yamaguchi, Y. Ida, K. Umakoshi, T. Inoue, Pruning randomly initialized neural networks with iterative randomization, Advances in Neural Information Processing Systems (NeurIPS, Spotlight), 2021

Non-archival Workshops

  1. S. Yamaguchi, D. Chijiwa, S. Kanai, A. Kumagai, H. Kashima, Regularizing Neural Networks with Meta-Learning Generative Models, Data-centric Machine Learning Research (DMLR) Workshop (ICML 2023, Honolulu, Hawaii)
  2. D. Chijiwa, On the Problem of Transferring Learning Trajectories Between Neural Networks, Workshop on High-dimensional Learning Dynamics (ICML 2023, Honolulu, Hawaii)

Master’s Thesis

  1. D. Chijiwa, On certain algebraic cycles on abelian varieties of Fermat type, 2019