Cognitive Studies: Bulletin of the Japanese Cognitive Science Society
Online ISSN : 1881-5995
Print ISSN : 1341-7924
ISSN-L : 1341-7924
Displaying 1-18 of 18 articles from this issue
Foreword
Feature: A new dimension of the humanities: Exploring through the lens of quantum theory
  • Shogo Tanaka, Atsushi Iriki
    Article type: Other
    2026 Volume 33, Issue 1, Pages 3-6
    Published: March 01, 2026
    Released on J-STAGE: March 15, 2026
    JOURNAL FREE ACCESS
  • Keiichi Omura
    Article type: Invited Paper
    2026 Volume 33, Issue 1, Pages 7-20
    Published: March 01, 2026
    Released on J-STAGE: March 15, 2026
    JOURNAL FREE ACCESS

    This article examines how fieldwork and ethnography in anthropology can be revitalized through the application of “diffractive methodology,” as proposed by Karen Barad within the framework of “agential realism.” First, I outline the trajectory of modern anthropology from the late 19th century to the present, demonstrating that the prevailing methodological foundations of fieldwork and ethnography, which have been instrumental in establishing anthropology as a social science, have historically rested on “essentialist” assumptions. These assumptions, in turn, have inadvertently supported and accelerated the processes of colonialism. In response to this problem, I propose a reconfiguration of fieldwork and ethnographic methodologies informed by “diffractive methodology.” This approach, grounded in the relational and performative ontology of agential realism, seeks to move beyond essentialism by recognizing the entangled agencies of researchers, participants, and the material-discursive environments in which knowledge is produced. Finally, I explore the potential contributions of this reconfigured fieldwork methodology to revitalizing anthropology as a decolonial science. By embedding diffractive practices into ethnographic research, anthropology may more effectively resist colonial legacies and foster epistemologies that are situated, relational, and ethically responsive to diverse ways of living.

  • Shogo Tanaka
    Article type: Invited Paper
    2026 Volume 33, Issue 1, Pages 21-35
    Published: March 01, 2026
    Released on J-STAGE: March 15, 2026
    JOURNAL FREE ACCESS

    This paper explores a structural affinity between quantum physics and phenomenological philosophy by rethinking the measurement problem in quantum mechanics from an ontological rather than epistemological perspective. Tracing the historical parallel between early 20th-century developments in physics and philosophy, the study argues that quantum measurement, like perception in phenomenology, entails an irreducible interaction of observer and observed. Revisiting the Copenhagen interpretation and the thought experiments of Schrödinger and Heisenberg, the paper critiques the limits of classical realism and representational epistemology. Drawing on the phenomenological developments from Husserl to Heidegger and Merleau-Ponty, it proposes that the quantum apparatus, like the lived body, constitutes an expressive field that discloses reality through interaction. This leads to a reinterpretation of ontology in terms of sensibility and event. In the final section, Merleau-Ponty’s ontology of the flesh is connected to Karen Barad’s agential realism, emphasizing the intra-active constitution of phenomena and the generativity of touching. The paper concludes by proposing a post-Cartesian ontology in which reality is not composed of pre-existing objects, but arises from the intersection of sensibilities. This framework not only provides new philosophical grounds for understanding quantum phenomena as neither fully representable nor entirely indeterminate, but also establishes a foundational ontology that bridges physics and the humanities—offering fresh perspectives on being as relational, generative, and inseparable from embodied experience.

  • Miho Fuyama, Hideyuki Hoshi, Tomohiro Ishizu, Andrei Khrennikov
    Article type: Invited Paper
    2026 Volume 33, Issue 1, Pages 36-45
    Published: March 01, 2026
    Released on J-STAGE: March 15, 2026
    JOURNAL FREE ACCESS

    We begin with a brief history of applying quantum theory to cognition, which includes two approaches: quantum modeling, linking cognitive processes to physical quantum mechanisms in the brain, and quantum-like modeling (QLM), which uses quantum formalism and methodology without assuming reduction to quantum physical processes. QLM has proven effective as a phenomenological approach in fields such as psychology, economics, and the social sciences, especially in modeling decision-making in the form of option selection. Recently, QLM has been extended to art and literature (AL), where aesthetic experiences present challenges not found in decision-making. Traditional projective measurements are insufficient here; instead, positive operator-valued measures (POVMs) offer a better fit for modeling such experiences. Though still in early stages, studies have shown that book reading can produce question order effects and violations of the QQ equality, both indicating that projective measurements may not apply. Overall, QLM in AL requires more sophisticated tools from quantum measurement theory, reflecting the complexity of aesthetic cognition.
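    The QQ equality mentioned in this abstract has a simple numerical illustration: for projective (von Neumann) measurements on a qubit it holds identically, whatever the angles involved, so an empirical violation signals that a richer description such as a POVM is needed. A minimal sketch; the belief-state and question angles below are illustrative assumptions, not values from the paper:

```python
import math

def seq_prob(psi, first, second, ans1, ans2):
    """Joint probability of the answer pair (ans1, ans2) when two yes/no
    questions are asked in sequence by projective measurement on a real
    qubit: the belief state sits at angle psi, each question's 'yes'
    subspace at the given angle, and 'no' is the orthogonal subspace."""
    d1 = first if ans1 else first + math.pi / 2
    d2 = second if ans2 else second + math.pi / 2
    amp1 = math.cos(psi - d1)   # overlap of the state with the first answer
    amp2 = math.cos(d1 - d2)    # overlap of the collapsed state with the second
    return (amp1 * amp2) ** 2

psi, A, B = 0.0, 0.4, 1.1       # illustrative angles

# QQ equality: p(Ay,Bn) + p(An,By) = p(By,An) + p(Bn,Ay),
# an order-independence of the "disagreeing answer" probabilities that
# holds exactly for projective measurements.
qq_lhs = seq_prob(psi, A, B, True, False) + seq_prob(psi, A, B, False, True)
qq_rhs = seq_prob(psi, B, A, True, False) + seq_prob(psi, B, A, False, True)
```

    Changing the angles changes both sides, but never their equality; that robustness is what makes observed violations diagnostic.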

  • Makiko Yamada
    Article type: Invited Paper
    2026 Volume 33, Issue 1, Pages 46-54
    Published: March 01, 2026
    Released on J-STAGE: March 15, 2026
    JOURNAL FREE ACCESS

    This article offers a theoretical reconceptualization of negative capability (NC), understood as the capacity to remain with indeterminacy and suspend premature closure, within a cognitive scientific framework. While NC has traditionally been discussed in poetic and clinical contexts, the argument situates it in relation to indeterminacy and introduces the Quantum Zeno effect (QZE) as a structural model for describing the interplay between observation and state transitions. NC is characterized here as a metacognitive capacity that modulates transitions between cognitive states through the frequency and timing of observation or evaluation. By articulating the theoretical intersections among NC, indeterminacy, and QZE, the article recasts NC as a form of cognitive tolerance of, and regulatory control over, undecided states under indeterminacy, shaped by temporal and attentional dynamics. This framework illuminates how the mind can sustain ambiguity, delay commitment, and reorganize meaning over time, and it points toward future directions for modeling, operationalization, and empirical investigation of NC within cognitive science.
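    The Quantum Zeno effect invoked here has a compact quantitative core: frequent projective observation suppresses transitions between states. A toy sketch; the drive angle and measurement counts are illustrative assumptions:

```python
import math

def survival_prob(theta, n):
    """Probability that a two-state system driven through total rotation
    angle theta is still found in its initial state after n equally
    spaced projective measurements (each collapse restarts the drift)."""
    return math.cos(theta / n) ** (2 * n)

theta = math.pi / 2            # unobserved, the system would fully transition

p_unwatched = survival_prob(theta, 1)    # one final measurement: transition done
p_watched   = survival_prob(theta, 100)  # frequent observation pins the state
```

    As n grows the survival probability approaches 1: observing often enough effectively freezes the state, which is the structural role the article assigns to the frequency and timing of evaluation.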

  • Atsushi Iriki
    Article type: Invited Paper
    2026 Volume 33, Issue 1, Pages 55-72
    Published: March 01, 2026
    Released on J-STAGE: March 15, 2026
    JOURNAL FREE ACCESS

    This paper critically reexamines the foundational assumption of “linear causality” in modern science and proposes a novel theoretical framework: path integral causality. Scientific thought has gradually lost the multifaceted understanding of causation exemplified by Aristotle’s theory of four causes. Since the rise of modern science, this has culminated in a reductionist, linear model of causality that privileges a single, straightforward cause above all. While this model has fueled technological progress, it has also fostered systemic fatigue, such as institutional rigidity, the elimination of alternatives, and a widening divide between the sciences and the humanities. In contrast, this paper applies the path integral formulation of quantum mechanics to biological evolution and the human and social sciences, proposing a nonlinear causal model that incorporates the superposition and interference of latent possibilities. Under this framework, non-reproducible phenomena — such as evolution, history, institutions, and language — are not reducible to a single causal chain but are better understood as interactions among multiple, latent potentialities. Here, unrealized possibilities are re-evaluated, and future options are preserved and unfolded in novel forms. Recent advances in quantum computing enable the concrete visualization of such nonlinear and multilayered causal structures, offering a foundation for the reintegration of the sciences and the humanities. The new frontiers facing contemporary society — space, the deep sea, and AI — can thus be redefined as synchronic spaces of coexisting possibilities under this expanded causal model. In conclusion, a science grounded in this new form of synchronic causality can transcend reductive models, providing a theoretical basis for envisioning a future rooted in diversity and ethics.
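    The contrast between linear and path-integral causality can be made concrete in two lines of arithmetic: classically, the probabilities of alternative paths simply add, whereas in the path-integral picture complex amplitudes add first and can interfere. A minimal sketch with two illustrative amplitudes:

```python
import cmath

# Two indistinguishable paths from cause to effect, each carrying a
# complex amplitude (magnitude and phase); the values are illustrative.
amp1 = cmath.rect(0.6, 0.0)        # path 1
amp2 = cmath.rect(0.6, cmath.pi)   # path 2, opposite phase

# Classical ("linear causality") picture: path probabilities add.
p_classical = abs(amp1) ** 2 + abs(amp2) ** 2

# Path-integral picture: amplitudes superpose first, then square;
# here the two latent possibilities cancel by destructive interference.
p_quantum = abs(amp1 + amp2) ** 2
```

    With other phase choices the same two paths can instead reinforce each other, which is the sense in which unrealized possibilities still shape the realized outcome.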

  • Kazuhiro Sakurada, Leoto Honma, Yuki Nakanishi, Yui Miyauchi, Kohsuke ...
    Article type: Invited Paper
    2026 Volume 33, Issue 1, Pages 73-90
    Published: March 01, 2026
    Released on J-STAGE: March 15, 2026
    JOURNAL FREE ACCESS

    Humans make decisions and inferences that violate Bayesian (classical) probability, such as conjunction/disjunction fallacies in probabilistic judgment and order effects on judgments. This has led to the proposal of cognitive models using quantum probability theory. Quantum probability theory refers to rules for assigning probabilities to events based on quantum mechanics, without incorporating any elements of physics. However, no answer has been found to the question of why human cognition follows the laws of quantum probability. Living organisms are nonlinear, nonequilibrium, open many-body systems exhibiting duality of particle and wave properties. Biological entanglement, the principle of indeterminacy, context dependency, and path integrals are observed in life phenomena. We developed an organism mechanics theory to explain the quantum-like properties found in empirical observations through natural principles. This paper first discusses the quantum extension of this organism mechanics theory, then presents a cognitive model based on this quantum extension. It also demonstrates that this new cognitive model can model five functions currently lacking in AI: purposeful, spontaneous engagement with reality; sensory experience of the real world; a biological body; autonomous learning; and development that gradually matures through accumulated experience.
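    The conjunction fallacy cited here illustrates how quantum probability departs from classical rules: a sequential projective judgment can be more probable than a direct one. A minimal real-qubit sketch in the spirit of the well-known Linda example; the angles and property labels are illustrative assumptions, not from the paper:

```python
import math

# Belief state and two properties, each a direction (1-D subspace) in the plane.
psi      = 0.0                  # current belief state
feminist = math.radians(40)     # property A: judged plausible given the story
teller   = math.radians(80)     # property B: judged implausible on its own

# Direct judgment of B: a single projection of the state onto B.
p_b = math.cos(psi - teller) ** 2

# Conjunctive judgment "A and B", evaluated sequentially: project onto A,
# then project the collapsed state onto B.
p_a_then_b = (math.cos(psi - feminist) * math.cos(feminist - teller)) ** 2

# Classically P(A and B) <= P(B); with noncommuting projectors the
# conjunction can exceed the conjunct whenever A lies "between" psi and B.
```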

  • Kazunori Kondo
    Article type: Invited Paper
    2026 Volume 33, Issue 1, Pages 91-105
    Published: March 01, 2026
    Released on J-STAGE: March 15, 2026
    JOURNAL FREE ACCESS

    This paper addresses the epistemological barrier of “causality” faced by modern science, highlighting the transition from classical science—grounded in objective, deterministic frameworks—to quantum theory, where observation and system are inseparable and non-commutativity emerges. By situating the causality problem as a meta-level conflict, as in the Einstein-Bohr debate, it emphasizes the need to rethink causality through the lens of contextuality and partial logical structures. Building on the framework of Generalized Probabilistic Theories (GPT) and recent advances in orthomodular lattice theory (OML), the paper proposes a novel model in which reality is understood as a network of context-dependent, branching causal relations, rather than a linear deterministic chain. This approach bridges operational realism (phenomena as state-effect interactions) and structural realism (reality in formal relationships), while also integrating the Aristotelian “Four Causes” as layered constraints in contemporary scientific models. The paper further advances the idea that causality must be conceived as a network of dynamically interacting contextual constraints and tendencies (“agency”), reflecting both quantum-like phenomena and broader cognitive or social systems. This holistic perspective redefines causality not as a primitive, but as an emergent property of phenomena arising from overlapping contextual logics and their dynamic interactions.
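    The orthomodular lattice structure behind OML approaches can be computed directly: propositions are closed subspaces, meet is intersection, join is closed span, and the distributive law of Boolean logic fails. A sketch with numpy for 1-D subspaces of the plane; the particular lines are illustrative:

```python
import numpy as np

def orth(W):
    """Orthonormal basis for the column space of W (possibly 0-dimensional)."""
    if W.size == 0 or np.linalg.norm(W) < 1e-10:
        return np.zeros((W.shape[0], 0))
    q, r = np.linalg.qr(W)
    return q[:, np.abs(np.diag(r)) > 1e-10]

def join(U, V):
    """Lattice join: closed span of the two subspaces."""
    return orth(np.hstack([U, V]))

def meet(U, V):
    """Lattice meet: intersection, found via the null space of [U | -V]."""
    M = np.hstack([U, -V])
    _, s, vt = np.linalg.svd(M)
    null = vt[np.sum(s > 1e-10):].T
    return orth(U @ null[: U.shape[1]])

# Three distinct lines (1-D subspaces) in the plane:
A = orth(np.array([[1.0], [0.0]]))
B = orth(np.array([[1.0], [1.0]]))
C = orth(np.array([[1.0], [-1.0]]))

# Distributivity fails in the subspace lattice:
#   B v C is the whole plane, so A ^ (B v C) = A        (dimension 1)
#   A ^ B = A ^ C = {0},  so (A ^ B) v (A ^ C) = {0}    (dimension 0)
lhs = meet(A, join(B, C))
rhs = join(meet(A, B), meet(A, C))
```

    This failure of distributivity is exactly the "partial logical structure" in which context-dependent propositions can coexist without a single Boolean frame.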

  • Masanao Ozawa
    Article type: Invited Paper
    2026 Volume 33, Issue 1, Pages 106-117
    Published: March 01, 2026
    Released on J-STAGE: March 15, 2026
    JOURNAL FREE ACCESS

    The question order effect refers to the phenomenon where changing the order of questions alters responses. Since its discovery in social psychology in the 1940s, the question order effect has challenged the validity and reliability of social surveys, including public opinion polls, and prompted various theoretical explanations. However, it has been considered difficult to quantitatively explain this effect using conventional statistical methods. In recent years, research progressing in the field of quantum-like cognition has developed models applying quantum instrument theory, a general theory of quantum measurement, to explain this effect. This paper explains the methodological characteristics of the quantum-like cognitive models and the advances in research on the question order effect achieved through those models.
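    The instrument-based modeling described here can be sketched in a few lines: a quantum instrument assigns to each answer both a probability and a state update, so sequential questions compose and their joint statistics generally depend on order. A minimal projective-instrument sketch on a qubit; the initial state and question angles are illustrative assumptions:

```python
import numpy as np

def projector(theta):
    """Rank-1 projector onto the 'yes' direction at angle theta."""
    v = np.array([np.cos(theta), np.sin(theta)])
    return np.outer(v, v)

def apply_instrument(rho, P):
    """Projective instrument for a yes/no question: for each answer,
    return its probability and the unnormalized post-answer state."""
    I = np.eye(2)
    return {ans: (np.trace(E @ rho @ E).real, E @ rho @ E)
            for ans, E in (("yes", P), ("no", I - P))}

def joint_yes_yes(rho, P_first, P_second):
    """P(yes to the first question, then yes to the second)."""
    _, rho_after = apply_instrument(rho, P_first)["yes"]
    p, _ = apply_instrument(rho_after, P_second)["yes"]
    return p

rho = projector(0.0)                   # respondent's initial belief state
A, B = projector(0.4), projector(1.1)  # two survey questions

p_ab = joint_yes_yes(rho, A, B)        # ask A first, then B
p_ba = joint_yes_yes(rho, B, A)        # ask B first, then A
```

    Because the first instrument updates the state that the second one sees, p_ab and p_ba differ: the order effect falls out of the formalism rather than being added by hand.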

  • Hayato Saigo
    Article type: Invited Paper
    2026 Volume 33, Issue 1, Pages 118-125
    Published: March 01, 2026
    Released on J-STAGE: March 15, 2026
    JOURNAL FREE ACCESS

    This paper discusses why the mathematics of quantum theory can also be applied to cognition. Here, “mathematics of quantum theory” refers to noncommutative probability theory, which is defined as the study of noncommutative probability spaces — pairs consisting of a noncommutative algebra and a state. Interestingly, noncommutative probability spaces can be generated from categories. If categorical structures are significant in cognition, it follows that noncommutative probability theory would also be a “mathematics of cognitive science”.
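    A noncommutative probability space in the sense above can be made concrete with matrices: take as algebra the 2x2 complex matrices and as state the functional a -> Tr(rho @ a) for a density matrix rho. A minimal sketch; the particular rho and observables are illustrative:

```python
import numpy as np

# State: a positive, normalized linear functional on the algebra,
# given here by a density matrix (illustrative values).
rho = np.array([[0.7, 0.0],
                [0.0, 0.3]])

def state(a):
    """Expectation of algebra element a in the state rho."""
    return np.trace(rho @ a)

# Two self-adjoint elements ("observables") that do not commute:
x = np.array([[0.0, 1.0],
              [1.0, 0.0]])          # Pauli X
z = np.array([[1.0, 0.0],
              [0.0, -1.0]])         # Pauli Z

commutator = x @ z - z @ x          # nonzero: the algebra is noncommutative
mean_x, mean_z = state(x), state(z) # moments still look classical one at a time
total = state(np.eye(2))            # normalization: the state of 1 is 1
```

    When the algebra happens to be commutative, this definition collapses back to ordinary (Kolmogorovian) probability, which is why noncommutative probability is a strict generalization.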

  • Haruki Emori
    Article type: Invited Paper
    2026 Volume 33, Issue 1, Pages 126-138
    Published: March 01, 2026
    Released on J-STAGE: March 15, 2026
    JOURNAL FREE ACCESS

    Conventional models based on classical probability theory face limitations in providing a unified explanation for complex psychological phenomena in human cognition. This paper presents quantum-like models, grounded in non-commutative probability theory, a mathematical structure used in the foundations of quantum theory, to describe these phenomena and introduce a new framework to the field of humanities. We introduce instrument models and path integral models as representative examples of this approach. We then present a methodology for empirically validating these quantum-like models by using a practical quantum computer as a quantum simulator. Specifically, we detail two quantum algorithms: one that reproduces the Clinton–Gore statistics from a public opinion poll, and another that implements an arbitrary instrument model on a quantum computer. This approach extends the concept of quantum falsifiability, the idea that predictions of quantum theory can only be verified or falsified by devices that follow quantum principles, from physical to human psychological phenomena. This paper goes beyond mere technological application, presenting a novel scientific methodology that uses a quantum computer as a device to reproduce phenomena themselves. We anticipate this intellectual shift will dissolve the boundaries between the humanities and natural sciences, fostering a reintegration of knowledge at a critical turning point.
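    The quantum-simulator methodology can be previewed classically: a statevector simulator applies the same rotations and measurements a physical device would, although only genuine hardware realizes the interference physically. A minimal single-qubit sketch that estimates order-dependent answer statistics by sampling; the angles and survey framing are illustrative assumptions, not the paper's Clinton–Gore circuit:

```python
import math, random

def rotate(theta, v):
    """Rotate a real 2-D statevector by angle theta (a 'gate')."""
    c, s = math.cos(theta), math.sin(theta)
    return (c * v[0] - s * v[1], s * v[0] + c * v[1])

def ask(state, angle, rng):
    """Yes/no question whose 'yes' axis sits at `angle`: rotate into the
    question's frame, sample a projective measurement, collapse, rotate back."""
    x, y = rotate(-angle, state)
    if rng.random() < x * x / (x * x + y * y):
        return "yes", rotate(angle, (1.0, 0.0))
    return "no", rotate(angle, (0.0, 1.0))

def survey(rng, first, second):
    """One respondent answering two questions in the given order."""
    state = (1.0, 0.0)                 # initial belief state
    a1, state = ask(state, first, rng)
    a2, state = ask(state, second, rng)
    return a1, a2

rng = random.Random(0)                 # fixed seed for reproducibility
n = 20000
A, B = 0.4, 1.1                        # illustrative question angles

p_ab = sum(survey(rng, A, B) == ("yes", "yes") for _ in range(n)) / n
p_ba = sum(survey(rng, B, A) == ("yes", "yes") for _ in range(n)) / n
```

    The sampled frequencies depend on question order, and on a real quantum device the same statistics would be produced by the hardware's own interference rather than by simulated arithmetic, which is the paper's notion of reproducing the phenomenon itself.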

  • Atsushi Iriki, Shogo Tanaka
    Article type: Invited Paper
    2026 Volume 33, Issue 1, Pages 139-144
    Published: March 01, 2026
    Released on J-STAGE: March 15, 2026
    JOURNAL FREE ACCESS

    This concluding article synthesizes the ten contributions of this special issue and articulates the intellectual significance of what we call the “quantum turn” in the humanities. Each paper, ranging from anthropology, phenomenology, and literary studies to neuroscience, mathematical physics, and quantum information science, shares a set of underlying principles: noncommutativity, observation as interaction, path-integral causality, and the open-system view of indeterminacy. Together they reveal a new logical architecture of knowledge in which theory and interpretation are dynamically entangled. Philosophically, this shift reconfigures epistemology, ontology, and axiology by extending the notion of measurement from physics to human perception, cognition, and meaning-making. Technologically, the emergence of quantum computers as quantum simulators allows the internal logic of these models to be tested through genuine quantum interference, far beyond classical emulation. This capability makes the experimental humanities (where concepts can be simulated and observed within the apparatus itself) a tangible reality. The “quantum turn” thus represents not merely a metaphor but a concrete transformation in how we conceive, verify, and co-create knowledge in the twenty-first century.

Outstanding Presenter Award-winning Papers
Book Reviews