Search results:
Found 3
This book deals with applications of quantum mechanical techniques to areas outside of quantum mechanics, so-called quantum-like modeling. Research in this area has grown over the last 15 years. Yet already more than 50 years ago, in the 1950s, the physics Nobel laureate Wolfgang Pauli and the psychologist Carl Jung sought analogous uses of the complementarity principle from quantum mechanics in psychology, an interaction worth noting. This book does NOT claim that society is quantum mechanical! The macroscopic world is manifestly not quantum mechanical. But this does not rule out using concepts and the mathematical apparatus of quantum physics in a macroscopic environment. A mainstay ingredient of quantum mechanics is 'quantum probability', and this tool has proven useful in the mathematical modelling of decision making. In the most basic experiment of quantum physics, the double-slit experiment, it is known (from the works of A. Khrennikov) that the law of total probability is violated. It is now well documented that several decision-making paradoxes in psychology and economics (such as the Ellsberg paradox) exhibit this same violation of the law of total probability. When data is collected in experiments that test 'non-rational' decision-making behaviour, such data often exhibits a complex non-commutative structure, which may be even more complex than the structure allied to the basic double-slit experiment. The community exploring quantum-like models has tried to address how quantum probability can help to better explain these paradoxes. Research resolving some of the paradoxes with the mathematics of quantum physics has now been published in very high-standing journals. The aim of this book is to collect contributions from the world's leading experts in quantum-like modeling in decision making, psychology, cognition, economics, and finance.
Quantum-like models; mathematical formalism of quantum theory; quantum probability; decision making; psychology; cognition; emotions
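The violation of the law of total probability described in the abstract can be made concrete with a small numerical sketch. The slit probabilities and the phase angle below are illustrative assumptions, not values from the book; the quantum-like formula simply adds an interference term, weighted by a phase, to the classical sum.

```python
import math

# Hypothetical double-slit numbers (illustrative assumptions): the
# probability of a particle reaching detector region B, conditioned on
# passing through slit A1 or slit A2.
p_a1, p_a2 = 0.5, 0.5                    # each slit equally likely
p_b_given_a1, p_b_given_a2 = 0.4, 0.4    # detection rate per slit

# Classical law of total probability: P(B) = sum_i P(Ai) * P(B|Ai)
p_b_classical = p_a1 * p_b_given_a1 + p_a2 * p_b_given_a2

def p_b_quantum(theta):
    """Quantum-like version: the classical sum plus an interference
    term 2*sqrt(P(A1)P(B|A1)P(A2)P(B|A2))*cos(theta)."""
    interference = 2 * math.sqrt(
        p_a1 * p_b_given_a1 * p_a2 * p_b_given_a2
    ) * math.cos(theta)
    return p_b_classical + interference

print(p_b_classical)             # 0.4
print(p_b_quantum(0.0))          # 0.8 -- law of total probability violated
print(p_b_quantum(math.pi / 2))  # 0.4 -- phase kills the interference term
```

Whenever the phase theta is not an odd multiple of pi/2, the quantum-like probability differs from the classical total, which is exactly the kind of deviation the decision-making data discussed above exhibit.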
We confess that the first part of our title is somewhat of a misnomer. Bayesian reasoning is a normative approach to probabilistic belief revision and, as such, is in need of no improvement. Rather, it is the typical individual, whose reasoning and judgments often fall short of the Bayesian ideal, who is the focus of improvement. What have we learnt from over a half-century of research and theory on this topic that could explain why people are often non-Bayesian? Can Bayesian reasoning be facilitated, and if so why? These are the questions that motivate this Frontiers in Psychology Research Topic. Bayes' theorem, named after the English statistician, philosopher, and Presbyterian minister Thomas Bayes, offers a method for updating one's prior probability of a hypothesis H on the basis of new data D such that P(H|D) = P(D|H)P(H)/P(D). The first wave of psychological research, pioneered by Ward Edwards, revealed that people were overly conservative in updating their posterior probabilities (i.e., P(H|D)). A second wave, spearheaded by Daniel Kahneman and Amos Tversky, showed that people often ignored prior probabilities or base rates, where the priors had a frequentist interpretation, and hence were not Bayesians at all. In the 1990s, a third wave of research spurred by Leda Cosmides and John Tooby and by Gerd Gigerenzer and Ulrich Hoffrage showed that people can reason more like a Bayesian if only the information provided takes the form of (non-relativized) natural frequencies. Although Kahneman and Tversky had already noted the advantages of frequency representations, it was the third-wave scholars who pushed the prescriptive agenda, arguing that there are feasible and effective methods for improving belief revision. Most scholars now agree that natural frequency representations do facilitate Bayesian reasoning. However, they do not agree on why this is so.
The original third-wave scholars favor an evolutionary account that posits human brain adaptation to natural frequency processing. But almost as soon as this view was proposed, other scholars challenged it, arguing that such evolutionary assumptions were not needed. The dominant opposing view has been that the benefit of natural frequencies is mainly due to the fact that such representations make the nested set relations perfectly transparent. Thus, people can more easily see what information they need to focus on and how to simply combine it. This Research Topic aims to take stock of where we are at present. Are we in a proto-fourth wave? If so, does it offer a synthesis of recent theoretical disagreements? The second part of the title orients the reader to the two main subtopics: what works and why? In terms of the first subtopic, we seek contributions that advance understanding of how to improve people's abilities to revise their beliefs and to integrate probabilistic information effectively. The second subtopic centers on explaining why methods that improve non-Bayesian reasoning work as well as they do. In addressing that issue, we welcome critical analyses of existing theories as well as fresh perspectives. For both subtopics, we welcome the full range of manuscript types.
Bayesian reasoning; belief revision; risk communication; subjective probability; human judgment; psychological methods; individual differences; Bayesianism; probabilistic judgment
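The contrast between the standard probability format and the natural-frequency format can be sketched numerically. The diagnostic numbers below are the classic illustrative values often used in this literature (1% base rate, 80% hit rate, 9.6% false-positive rate), not figures from any contribution in this Topic; both routes give the same posterior, but the frequency route makes the nested sets easier to see.

```python
# Illustrative assumptions: base rate, hit rate, false-positive rate
# for a generic diagnostic-test problem.
p_h = 0.01              # prior P(H): 1% have the condition
p_d_given_h = 0.8       # likelihood P(D|H): 80% of them test positive
p_d_given_not_h = 0.096 # false-positive rate P(D|not-H)

# Probability format -- Bayes' theorem: P(H|D) = P(D|H)P(H) / P(D)
p_d = p_d_given_h * p_h + p_d_given_not_h * (1 - p_h)
posterior = p_d_given_h * p_h / p_d

# Natural-frequency format: out of 1000 people, 10 have H; 8 of those
# test positive, and about 95 of the 990 without H also test positive.
hits, false_alarms = 8, 95
posterior_freq = hits / (hits + false_alarms)

print(round(posterior, 3))       # 0.078
print(round(posterior_freq, 3))  # 0.078 -- same answer, simpler to combine
```

The frequency version reduces the update to counting one subset inside another, which is the nested-set transparency that the opposing view in the abstract appeals to.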
From ABO typing during the first half of the 20th century, to the use of enzymes and proteins contained in blood serum, and finally direct DNA typing, biology has served forensic purposes for many decades. Statistics, in turn, has constantly underpinned discussions of the probative value of the results of biological analyses, in particular when defendants could not be excluded as potential sources on the basis of different genetic traits. The marriage between genetics and statistics has never been an easy one, though, as illustrated by the fierce arguments that peaked in the so-called "DNA wars" in some American courtrooms in the mid-1990s. This controversy has contributed to a lively production of research and publications on various interpretative topics, such as the collection of relevant data, foundations in population genetics, and theoretical and practical considerations in probability and statistics. Both DNA profiling as a technique and the associated statistical considerations are now widely accepted as robust, but this does not yet guarantee or imply a neat transition to their application in court. Indeed, statistical principles applied to the results of forensic DNA profiling analyses are a necessary, yet not sufficient, preliminary requirement for the contextually meaningful use of DNA in the law. Ultimately, the appropriate use of DNA in the forensic context relies on inference, i.e. reasoning reasonably in the face of uncertainty. This is all the more challenging in that such thought processes need to be adopted by stakeholders from various backgrounds holding diverse interests. Although several topics of the DNA controversy have been settled over time, some are still debated (such as the question of how to deal with the probability of error), while yet others, purportedly settled topics, have seen recent revivals (e.g., the question of how to deal with database searches).
In addition, new challenging topics have emerged over the last decade, such as the analysis and interpretation of traces containing only low quantities of DNA, where artefacts of varying nature may affect results. Both technical and interpretative research involving statistics thus represent areas where ongoing research is necessary, and where scholars from the natural sciences and the law should collaborate. The articles in this Research Topic thus aim to investigate, from an interdisciplinary perspective, the current understanding of the strengths and limitations of DNA profiling results in legal applications. This Research Topic accepts contributions in all Frontiers article-type categories and places an emphasis on topics with a multidisciplinary perspective that explore (while not being limited to) statistical genetics for forensic scientists, case studies and reports, evaluation and interpretation of forensic findings, communication of expert findings to laypersons, and quantitative legal reasoning and fact-finding using probability.
Forensic DNA profiling; interpretation; statistics and the law; probability theory; commercialization; DNA transfer; low-template DNA analysis; forensic molecular biology; bacterial DNA
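As a rough sketch of the kind of probabilistic reasoning at issue, the likelihood-ratio framework commonly used to weigh forensic DNA evidence can be written out in a few lines. All numbers below are hypothetical assumptions chosen for illustration; real casework values depend on the profile, the population genetics, and the case context.

```python
# Hypothetical values for illustration only.
# E = observed profile match between trace and suspect.
p_e_given_source = 1.0          # P(E | suspect IS the source), idealised
random_match_probability = 1e-9 # P(E | suspect is NOT the source)

# Likelihood ratio: how much more probable the evidence is under the
# prosecution proposition than under the defence proposition.
lr = p_e_given_source / random_match_probability

# Bayes' theorem in odds form: posterior odds = LR * prior odds.
# A hypothetical prior of one in a million reflects non-DNA case information.
prior_odds = 1 / 1_000_000
posterior_odds = lr * prior_odds
posterior_prob = posterior_odds / (1 + posterior_odds)

print(lr)                        # ~1e9
print(round(posterior_prob, 4))  # 0.999
```

The split matters for court: the expert reports only the likelihood ratio, while the prior odds, and hence the posterior, remain the province of the fact-finder, which is one reason the inference questions discussed above cut across scientific and legal roles.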