News

2023.04.10 Recruitment

A01 Postdoctoral Position

Two positions are currently open in the Tsuchiya Group at ATR (https://www.atr.jp/index_e.html). To apply, email a CV and cover letter to naotsugu.tsuchiya@monash.edu. The positions cover the following research topics:

  1. a large-scale online or face-to-face similarity psychophysics experiment

  2. a mathematical approach towards establishing qualia structure using category theory (category algebras and states on categories)

  3. a mathematical approach towards establishing qualia structure using sheaf theory

All positions will require some travel to Melbourne, Australia (Tsuchiya), as well as to Nagahama Bio University (PI Saigo) and AIST (PI Phillips) in Japan.


Please check the relevant papers below before sending your application.

(We will not respond to inquiries that do not appear to be targeted at our project.)


A PhD in computer science, data science, psychology, neuroscience, or a related field is required.

Relevant skills:

  • experience with Python and/or MATLAB
  • experience with category theory, ideally applied category theory using functional programming
  • interest in and ability to work on team-based projects
  • enthusiasm for and experience in consciousness research, especially the issue of qualia



Online psychophysics postdoc position:

Directly relevant papers:

  1. *Kawakita, G., *Zeleznikow-Johnston, A., **Tsuchiya, N., & **Oizumi, M. (2023, January 7). Is my "red" your "red"?: Unsupervised alignment of qualia structures via optimal transport. https://doi.org/10.31234/osf.io/h3pqm (* and ** denote equal contributions.)

  2. Zeleznikow-Johnston, A., Aizawa, Y., Yamada, M., & Tsuchiya, N. (2023). Are colour experiences the same across the visual field? (Stage 2 accepted preregistered research report). Journal of Cognitive Neuroscience. https://doi.org/10.1162/jocn_a_01962; OSF: https://osf.io/yuq2v


Related projects: 

  1. Chuyin, Z., Koh, Z., Gallagher, R., Nishimoto, S., & Tsuchiya, N. (2022). What can we experience and report on a rapidly presented image? Intersubjective measures of specificity of freely reported contents of consciousness [version 2; peer review: 2 approved]. F1000Research, 11:69. https://doi.org/10.12688/f1000research.75364.2 (Behavioural data, analysis code, stimuli, and experimental code are all available.)

  2. Qianchen, L., Gallagher, R., & Tsuchiya, N. (2022). How much can we differentiate at a brief glance: Revealing the truer limit in conscious contents through the Massive Report Paradigm (MRP) (Stage 2 manuscript). Royal Society Open Science. https://doi.org/10.1098/rsos.210394

Category theory postdoc position:

Directly relevant papers:

  1. Tsuchiya, N., & Saigo, H. (2021). A relational approach to consciousness: Categories of level and contents of consciousness. Neuroscience of Consciousness, 2021(2), niab034. https://doi.org/10.1093/nc/niab034

  2. Phillips, S. (2019). Sheaving—a universal construction for semantic compositionality. Philosophical Transactions of the Royal Society B. https://royalsocietypublishing.org/doi/10.1098/rstb.2019.0303

  3. Saigo, H. (2021). Category algebras and states on categories. Symmetry, 13(7), 1172. https://www.mdpi.com/2073-8994/13/7/1172

  4. Tsuchiya, N., Saigo, H., & Phillips, S. (2022). An adjunction hypothesis between qualia and reports. Frontiers in Psychology, 13. https://doi.org/10.3389/fpsyg.2022.1053977 (Preprint: psyarxiv.com/q8ndj)

  5. Tsuchiya, N., Phillips, S., & Saigo, H. (2022). Enriched category as a model of qualia structure based on similarity judgements. Consciousness and Cognition, 101, 103319. https://doi.org/10.1016/j.concog.2022.103319

For an overview of category theory:

  1. Phillips, S. (2022). What is category theory to cognitive science? Compositional representation and comparison. Frontiers in Psychology, 13, 1048975. https://www.frontiersin.org/articles/10.3389/fpsyg.2022.1048975/full

  2. Moriguchi, Tsuchiya, & Saigo (2023, accepted February 6). Structure in cognitive developmental research: An approach from category theory [in Japanese]. The Japanese Journal of Psychology (Shinrigaku Kenkyu). https://psyarxiv.com/9bh3d/