Search results: Found 2

Quantum Nonlocality

Author:
ISBN: 9783038979487 9783038979494 Year: 2019 Pages: 238 DOI: 10.3390/books978-3-03897-949-4 Language: English
Publisher: MDPI - Multidisciplinary Digital Publishing Institute
Subject: Science (General) --- Physics (General)
Added to DOAB on: 2019-06-26 08:44:06
License: CC BY-NC-ND


Abstract

This book presents the current views of leading physicists on the bizarre property of quantum theory: nonlocality. Einstein famously derided it as “spooky action at a distance”, which, together with quantum randomness, prevented him from accepting quantum theory. The contributions in the book describe in detail the bizarre aspects of nonlocality, such as Einstein–Podolsky–Rosen steering and quantum teleportation—a phenomenon that cannot be explained within the framework of classical physics, due to its foundations in quantum entanglement. The contributions describe the role of nonlocality in the rapidly developing field of quantum information. Nonlocal quantum effects in various systems, from solid-state quantum devices to organic molecules in proteins, are discussed. The most surprising papers in this book challenge the concept of the nonlocality of Nature, and look for possible modifications, extensions, and new formulations—from retrocausality to novel types of multiple-world theories. These attempts have not yet been fully successful, but they offer hope of modifying quantum theory in line with Einstein’s vision.
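To make the abstract's claim concrete — that these correlations "cannot be explained in the framework of classical physics" — here is a minimal numerical check of the CHSH form of Bell's theorem (both Bell's theorem and the Bell test appear in the keywords below). The singlet state and the measurement angles are the standard textbook choices, not taken from this book:

```python
import numpy as np

# Pauli matrices for spin measurements in the X-Z plane
Z = np.array([[1, 0], [0, -1]], dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)

def spin(theta):
    """Observable for a spin measurement at angle theta in the X-Z plane."""
    return np.cos(theta) * Z + np.sin(theta) * X

# Singlet (maximally entangled) state (|01> - |10>) / sqrt(2)
psi = np.array([0, 1, -1, 0], dtype=complex) / np.sqrt(2)

def E(alpha, beta):
    """Correlation <psi| A(alpha) (x) B(beta) |psi>; equals -cos(alpha - beta)."""
    return np.real(psi.conj() @ np.kron(spin(alpha), spin(beta)) @ psi)

# CHSH combination with the angles that maximize the quantum violation
a1, a2 = 0.0, np.pi / 2
b1, b2 = np.pi / 4, 3 * np.pi / 4
S = E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2)

print(abs(S))  # ~2.828 (= 2*sqrt(2)), above the local-hidden-variable bound of 2
```

Any local-hidden-variable model obeys |S| ≤ 2, while the quantum value here reaches Tsirelson's bound 2√2; the hypothetical PR box mentioned in the keywords would reach the no-signalling maximum of 4.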

Keywords

quantum nonlocality --- quantum mechanics --- Stern–Gerlach experiment --- quantum measurement --- pre- and post-selected systems --- retro-causal channel --- channel capacity --- channel entropy --- axioms for quantum theory --- PR box --- nonlocal correlations --- classical limit --- retrocausality --- quantum correlations --- quantum bounds --- nonlocality --- Tsallis entropy --- ion channels --- selectivity filter --- non-linear Schrödinger model --- biological quantum decoherence --- parity measurements --- entanglement --- pigeonhole principle --- controlled-NOT --- semiconductor nanodevices --- quantum transport --- density-matrix formalism --- Wigner-function simulations --- nonlocal dissipation models --- steering --- entropic uncertainty relation --- general entropies --- Bell’s theorem --- Einstein–Podolsky–Rosen argument --- local hidden variables --- local realism --- no-signalling --- parallel lives --- local polytope --- communication complexity --- optimization --- KS Box --- non-contextuality inequality --- discrete-variable states --- continuous-variable states --- quantum teleportation of unknown qubit --- hybrid entanglement --- collapse of the quantum state --- Bell test --- device-independent --- p-value --- hypothesis testing --- EPR steering --- uncertainty relations

New Developments in Statistical Information Theory Based on Entropy and Divergence Measures

Author:
ISBN: 9783038979364 9783038979371 Year: 2019 Pages: 344 DOI: 10.3390/books978-3-03897-937-1 Language: English
Publisher: MDPI - Multidisciplinary Digital Publishing Institute
Subject: Social Sciences --- Sociology --- Statistics
Added to DOAB on: 2019-06-26 08:44:06
License: CC BY-NC-ND


Abstract

This book presents new and original research in Statistical Information Theory, based on minimum divergence estimators and test statistics, from both a theoretical and an applied point of view, covering different statistical problems with special emphasis on efficiency and robustness. Divergence statistics based on maximum likelihood estimators, as well as Wald’s statistics, likelihood ratio statistics, and Rao’s score statistics, share several optimal asymptotic properties, but are highly non-robust under model misspecification or in the presence of outlying observations. It is well known that a small deviation from the underlying model assumptions can have a drastic effect on the performance of these classical tests. Specifically, this book presents a robust version of the classical Wald test, for testing simple and composite null hypotheses in general parametric models, based on minimum divergence estimators.
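The robustness idea in the abstract — and the minimum density power divergence estimator that appears in the keywords below — can be illustrated with a minimal sketch. It assumes a normal location model with known scale; the function name `mdpde_location`, the tuning value α = 0.5, and the data are illustrative, not the book's own constructions. For this model the estimating equation reduces to a weighted mean whose weights exponentially downweight outliers, solvable by fixed-point iteration:

```python
import numpy as np

def mdpde_location(x, alpha=0.5, sigma=1.0, iters=100):
    """Minimum density power divergence estimate of a normal location
    (known scale sigma), via the weighted-mean fixed-point iteration.
    alpha -> 0 recovers the sample mean (the MLE); larger alpha
    downweights outlying observations more aggressively."""
    mu = np.median(x)  # robust starting point
    for _ in range(iters):
        w = np.exp(-alpha * (x - mu) ** 2 / (2 * sigma ** 2))
        mu = np.sum(w * x) / np.sum(w)
    return mu

rng = np.random.default_rng(0)
clean = rng.normal(0.0, 1.0, size=95)
data = np.concatenate([clean, np.full(5, 10.0)])  # 5% gross outliers at 10

print(np.mean(data))         # the MLE is dragged toward the outliers
print(mdpde_location(data))  # stays near the true location 0
```

The contrast between the two printed values shows the abstract's point: a small fraction of contamination shifts the likelihood-based estimate substantially, while the divergence-based estimate barely moves.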

Keywords

sparse --- robust --- divergence --- MM algorithm --- Bregman divergence --- generalized linear model --- local-polynomial regression --- model check --- nonparametric test --- quasi-likelihood --- semiparametric model --- Wald statistic --- composite likelihood --- maximum composite likelihood estimator --- Wald test statistic --- composite minimum density power divergence estimator --- Wald-type test statistics --- general linear model --- hypothesis testing --- influence function --- Wald-type test --- log-linear models --- ordinal classification variables --- association models --- correlation models --- minimum penalized φ-divergence estimator --- consistency --- asymptotic normality --- goodness-of-fit --- bootstrap distribution estimator --- thematic quality assessment --- relative entropy --- logarithmic super divergence --- robustness --- minimum divergence inference --- generalized Rényi entropy --- minimum divergence methods --- single index model --- model assessment --- statistical distance --- non-quadratic distance --- total variation --- mixture index of fit --- Kullback–Leibler distance --- divergence measure --- φ-divergence --- relative error estimation --- robust estimation --- information geometry --- centroid --- Bregman information --- Hölder divergence --- indoor localization --- efficiency --- Bayesian nonparametric --- Bayesian semi-parametric --- asymptotic property --- minimum disparity methods --- Hellinger distance --- Bernstein–von Mises theorem --- measurement errors --- robust testing --- two-sample test --- misspecified hypothesis and alternative --- 2-alternating capacities --- composite hypotheses --- corrupted data --- least-favorable hypotheses --- Neyman–Pearson test --- divergence-based testing --- Chernoff–Stein lemma --- compressed data --- representation formula --- iterated limits --- location-scale family
