Proceedings Volume 8121

The Nature of Light: What are Photons? IV


View the digital version of this volume at the SPIE Digital Library.

Volume Details

Date Published: 28 September 2011
Contents: 15 Sessions, 57 Papers, 0 Presentations
Conference: SPIE Optical Engineering + Applications 2011
Volume Number: 8121

Table of Contents


  • Front Matter: Volume 8121
  • Photon Counting Stat and QM-CM Dichotomy
  • Gravity, Relativity, and EM Waves
  • Revisiting Derivations and QM Concepts I
  • Revisiting Derivations and QM Concepts II
  • Waves, Photons, and Computing Logics
  • Space as a Medium and its Properties
  • Diverse Photon Models I
  • Diverse Photon Models II
  • Diverse Photon Models III
  • Superposition and Interaction Process Models I
  • Superposition and Interaction Process Models II
  • Being Aware of Our Diverse Epistemologies I
  • Being Aware of Our Diverse Epistemologies II
  • Being Aware of Our Diverse Epistemologies III
Front Matter: Volume 8121
Front Matter: Volume 8121
This PDF file contains the front matter associated with SPIE Proceedings Volume 8121, including the Title Page, Copyright information, Table of Contents, Introduction, and the Conference Committee listing.
Photon Counting Stat and QM-CM Dichotomy
The singlet state and Bell-inequality tests
It is observed that a critical aspect of tests of Bell inequalities is the use of entities considered to be in the singlet state. This state is known to require extra-logical consideration to render it compatible with the currently most popular interpretation of Quantum Theory. We show that the critical structure of this state for the analysis of these tests can be spoofed by feasible classical effects that thus far have not been absolutely precluded. Finally, we present statistical analysis showing that selecting for valid pairs of correlated signals by reducing the time offset or window width defining acceptable coincidences actually, and perversely, supports the spoof mechanism.
Towards an event-based corpuscular model for optical phenomena
H. De Raedt, F. Jin, K. Michielsen
We discuss an event-based corpuscular model of optical phenomena that does not require the knowledge of the solution of a wave equation of the whole system and reproduces the results of Maxwell's theory through a series of cause-and-effect processes, starting with the emission and ending with the detection of a particle. Event-based models of a single-photon detector and of light propagation through an interface of two dielectrics are used as modular building blocks to give a unified, corpuscular description of many optical phenomena. The approach is illustrated by applications to Wheeler's delayed choice, Einstein-Podolsky-Rosen-Bohm and Hanbury Brown-Twiss experiments.
A modified Mach-Zehnder experiment to test the applicability of quantum theory to single-particle experiments
K. Michielsen, Th. Lippert, M. Richter, et al.
We propose a modified single-particle Mach-Zehnder interferometer experiment in which the path length of one arm may change (randomly or systematically) according to the value of an external two-valued variable x, for each passage of a particle through the interferometer. Quantum theory predicts an interference pattern that is independent of the sequence of the values of x. On the other hand, corpuscular models that reproduce the results of quantum optics experiments carried out to date show a reduced visibility and a shift of the interference pattern depending on the details of the sequence of the values of x. The key question to be answered in a real laboratory experiment is: Which interference pattern is observed? Despite the general belief that quantum theory may be used to describe all single-particle experiments, this is an interesting question to answer, since in the proposed experiment the experimental conditions not only change continuously but may also have causal effects on the passage of the photons through the interferometer. The proposed experiment can be used to determine to what extent quantum theory provides a description of observed events beyond the usual statistical level.
Controversy among giants: Young's experiment and loss of fringe visibility at low photon-count levels
An ideal beam splitter model for an absorber presented by Leonhardt in his book Measuring the Quantum State of Light (Cambridge University Press, 1997) has intriguing implications for the simple Young's fringe experiment in the photon-counting regime. Specifically, it suggests that different results will be obtained depending on whether the light forming the fringes is attenuated at the source or at the slits.
Gravity, Relativity, and EM Waves
Gauss's Law for gravity and observational evidence reveal no solar lensing in empty vacuum space
E. H. Dowdye Jr.
Findings show that the rays of starlight are lensed primarily in the plasma rim of the sun and hardly at all in the vacuum space just slightly above the rim. Since the lower boundary of this vacuum space is only a fraction of a solar radius above the solar plasma rim, it is exposed to virtually the same gravitational field. The thin plasma atmosphere of the sun appears to represent an indirect interaction involving an interfering plasma medium between the gravitational field of the sun and the rays of starlight. The very same light bending equation obtained by General Relativity was derived from classical assumptions of a minimum energy path of a light ray in the plasma rim, exposed to the gravitational gradient field of the sun. The resulting calculation was found to be independent of frequency. An intense search of the star-filled skies reveals a clear lack of lensing among the countless stars, many of which are candidates for gravitational lensing according to the assumptions of General Relativity. Assuming the validity of the light bending rule of General Relativity, the sky should be filled with images of Einstein rings. Moreover, a lack of evidence for gravitational lensing is clearly revealed in the time resolved images of the rapidly moving stellar objects orbiting about Sagittarius A*.
The high velocity version of classical mechanics
Randy T. Dorn
A good understanding of the actual mechanism for the attraction between an electron and positron is necessary for the effective study of electron-positron phenomena such as annihilation and pair production. This "action at a distance" force has mathematical descriptions, but the underlying phenomenon is really not well understood. Our intuitive understanding of how force is delivered through the action of an impulse comes from our everyday experience and is described by Newton's Laws. If we extend this classical mechanical line of reasoning to these more mysterious forces, it leads to the derivation of a high velocity version of F = ma. The basis of this model is Newton's notion that gravity could be attributed to multiple impacts of invisible bodies. In this model it is assumed that such an acceleration field is made up of tiny bodies that travel at the speed of light and that these bodies deliver energy to accelerated particles by elastic collisions. The result is a mathematical model comparable to relativistic equations. This similarity leads to the conclusion that it is reasonable to rearrange and interpret the relativistic equations as a velocity dependent force. There is no reason to change the classical definitions of mass, momentum and energy for the physics that has heretofore been described by relativity.
Alternative realization for the composition of relativistic velocities
The reciprocity principle requests that if an observer, say in the laboratory, sees an event with a given velocity, another observer at rest with the event must see the laboratory observer with minus the same velocity. The composition of velocities in the Lorentz-Einstein scheme does not fulfill the reciprocity principle because the composition rule is neither commutative nor associative. In other words, the composition of two non-collinear Lorentz boosts cannot be expressed as a single Lorentz boost but requires in addition a rotation. The Thomas precession is a consequence of this composition procedure. Different proposals such as gyro-groups have been made to fulfill the reciprocity principle. An alternative velocity addition scheme is proposed consistent with the invariance of the speed of light and the relativity of inertial frames. An important feature of the present proposal is that the addition of velocities is commutative and associative. The velocity reciprocity principle is then immediately fulfilled. This representation is based on a transformation of a hyperbolic scator algebra. The proposed rules become identical with the special relativity addition of velocities in one dimension. They also reduce to the Galilean transformations in the low velocity limit. The Thomas gyration needs to be revised in this nonlinear realization of the special relativity postulates. The deformed Minkowski metric presented here is compared with other deformed relativity representations.
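As a minimal numerical illustration of the one-dimensional limit mentioned in the abstract (the standard special-relativity composition law, not the authors' scator construction), collinear velocity addition is already commutative and associative and keeps speeds below c:

```python
def compose(u, v, c=1.0):
    # standard 1D special-relativity velocity composition (velocities in units of c)
    return (u + v) / (1.0 + u * v / c ** 2)

a, b, d = 0.6, 0.7, 0.9
print(compose(a, b) - compose(b, a))                          # commutativity check
print(compose(compose(a, b), d) - compose(a, compose(b, d)))  # associativity check
print(compose(0.9, 0.9))                                      # stays below c
```

The non-commutativity and Thomas rotation discussed in the abstract only appear for non-collinear boosts, which is why this 1D check succeeds.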
Relativity: a pillar of modern physics or a stumbling block
Gurcharn S. Sandhu
Currently, the theory of Relativity is regarded as one of the main pillars of Modern Physics, essentially due to its perceived role in high energy physics, particle accelerators, relativistic quantum mechanics, and cosmology. Since the founding assumptions or postulates of Relativity and some of the resulting consequences confound logic and common sense, a growing number of scientists are now questioning the validity of Relativity. The advent of Relativity has also ruled out the existence of the 19th century notion of an ether medium or physical space as the container of physical reality. Thereby, the Newtonian notions of absolute motion, absolute time, and an absolute reference frame have been replaced with the Einsteinian notions of relative motion, relative time, and inertial reference frames in relative motion. This relativity dominated viewpoint has effectively abandoned any critical study or advanced research into the detailed properties and processes of physical space for the advancement of Fundamental Physics. In this paper both the special and the general theory of relativity are critically examined for their current relevance and future potential. We find that even though Relativity appears to be a major stumbling block in the progress of Modern Physics, the issue needs to be finally settled by a viable experiment [Phys. Essays 23, 442 (2010)] that can detect absolute motion and establish a universal reference frame.
Shapiro delay: a frequency dependent transit time effect, not a space time effect
Shahin Ghazanshahi, Edward H. Dowdye Jr.
Irwin I. Shapiro first noticed in 1964 that the transit time required for a microwave signal to propagate through space, arrive at a satellite orbiting Venus or Mercury, and return past the sun to be received at the observatory on Earth had a measurable time delay that varied as a function of the impact parameter of the microwave beam relative to the sun. The delays were observed to be on the order of hundreds of microseconds when the impact parameter of the microwave beam was at a minimum. These measurements permitted a precise determination of the electron density profile of the solar wind as a function of the radial distance r from the sun. The electron density profile of the solar wind is found to behave very nearly as an inverse square of the radial distance r from the sun, and the solar wind is found to engulf the outermost planets of the solar system. The bulk of the measurements were done using microwave frequencies from 500 MHz to 8.8 GHz. Significant findings of this research reveal that, for all microwave signals propagating in the solar wind atmosphere of the solar system, the waves are subject to a frequency dependent plasma index of refraction n that exceeds unity, i.e., n > 1.0. For optical, IR and UV wavelengths the plasma index of refraction differs from unity by less than one part in 10^10, so these wavelengths are virtually unaffected by the solar wind electron density. As a consequence of these findings, the Shapiro delay cannot be related to a space-time effect of General Relativity, which is independent of frequency.
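The frequency dependence described here can be sketched with the textbook cold-plasma dispersion relation; the group refractive index exceeds unity and its excess falls off as 1/f². The electron density below is an illustrative placeholder, not a value taken from the paper, and `plasma_frequency_hz` and `group_index` are hypothetical helper names:

```python
import math

def plasma_frequency_hz(n_e_cm3):
    # f_p ~ 8980 * sqrt(n_e) Hz with n_e in electrons/cm^3 (cold electron plasma)
    return 8980.0 * math.sqrt(n_e_cm3)

def group_index(f_hz, n_e_cm3):
    # group refractive index 1/sqrt(1 - (f_p/f)^2): exceeds unity, so signals
    # are delayed, with the excess scaling as 1/f^2
    x = (plasma_frequency_hz(n_e_cm3) / f_hz) ** 2
    return 1.0 / math.sqrt(1.0 - x)

n_e = 1.0e4  # electrons/cm^3, illustrative near-sun solar-wind value (assumption)
for f, label in [(2.3e9, "S-band microwave"), (5.0e14, "optical")]:
    print(f"{label}: n_group - 1 = {group_index(f, n_e) - 1.0:.1e}")
```

The microwave excess comes out roughly ten orders of magnitude larger than the optical one for the same density, consistent with the abstract's contrast between the two regimes.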
Explanation of relativistic phenomena on the basis of interactions of particle energy, applied energy, and field energy
Viraj Fernando
This paper formulates a coherent theory of the occurrence of relativistic phenomena in their interconnection in a feedback loop, unlike other theories, by tracing back concepts that Newton held but suppressed when he developed his mechanics in the Principia. Despite Newtonian mechanics being based on a closed system, Newton indicated in the General Scholium that reality is an open system, where 'a certain most subtle spirit' participates in and directs all interactions, from the motion of bodies, to the motion of light, to how the human brain operates. We have identified this 'most subtle spirit' as the non-empirical 'universal governing field' and hold that no empirical interaction in this universe can occur without exchange of energy between the empirical interactants and the governing field. By analyzing the energy-momentum equation, we have demonstrated that everything empirical has a non-empirical substratum, identified with Spinoza's primitive substance, which binds everything in the universe to the field and through it to one another. The Algorithm of Motion is founded on the Pythagorean character of the energy-momentum equation, which is applicable to all velocities 0 < v < c. By the application of this Algorithm, the principal relativistic phenomena are explained, including an accounting for the time change in a GPS clock due to orbital motion. By Leibniz' Principle of Relativity, we find the physical basis of the Lorentz transformation. How gravitation occurs is explained, and validated by accounting for the gravitational time change in a GPS clock. This theory ends the prevailing artificial schism of physics, with one theory valid for slow motion and the other valid for fast motion.
Revisiting Derivations and QM Concepts I
Interplay between theories of quantum and classical signals: classical representation of entanglement
Andrei Khrennikov
The idea that quantum mechanics (QM) is simply a version of classical field theory is very old. Recently this idea has been realized in a new mathematical framework - on the basis of the theory of random fields (L2-valued random variables). Surprisingly (at least for an orthodox Copenhagenist), fundamental predictions of QM can be reproduced on the basis of a purely wave model (prequantum classical statistical field theory, PCSFT). In particular, all quantum correlations (including correlations of composite systems in entangled states) can be represented as correlations of classical random signals. These signals fluctuate at a space-time scale essentially finer than the scale of quantum measurements. At the moment we are not able to monitor such signals. However, one can expect that increasing the precision of measurements will provide such a possibility. In this paper we show that bosonic and fermionic correlations can be obtained in the classical field framework. Finally, we stress that QM can be reduced to a theory of classical random fields only in the presence of a relatively strong background field.
Planck's constant h not only governs atomic shell energies, moreover, is also responsible for the neutrons and protons internal structure (charge and size)
Planck's constant h is made responsible for the neutron's and proton's internal structure. Heavy discrepancies between the neutron's inherent magnetic flux and the smallest magnetic flux in physics, Φo, pose this problem. Solving this oxymoron without juggling natural constants led to a new model for protons and neutrons; no quark model was used. Basic results are: The neutron consists of a highly charged (Q·e) central core mass X (292.5 MeV/c2) surrounded by tetrahedrally grouped πo mesons. Nuclear forces are repelling. Coulomb forces between the core and the mesons keep the system in balance. The neutron system keeps its balance by bubbling out electrons from the core. At Q = 208 the free neutron becomes a proton: an electron is emanated to the outside world, an internal charge asymmetry occurs leading to confiscation of core charges, and the system shrinks. At Q = 47 the proton system comes to rest. The magnetic moment of the proton could be worked out analytically from the neutron's value. The naked core mass X could be a cue to the puzzle of "dark matter".
The physical origin of the uncertainty theorem
Albrecht Giese
The uncertainty principle is an important element of quantum mechanics. It deals with certain pairs of physical parameters which cannot be determined to an arbitrary level of precision at the same time. According to the so-called Copenhagen interpretation of quantum mechanics, this uncertainty is an intrinsic property of the physical world. This paper intends to show that there are good reasons for adopting a different view. According to the author, the uncertainty is not a property of the physical world but rather a limitation of our knowledge about the actual state of a physical process. This view conforms to the quantum theory of Louis de Broglie and to Einstein's interpretation.
Microscope and spectroscope results are not limited by Heisenberg's Uncertainty Principle!
A review of many published experimental and theoretical papers demonstrates that the resolving powers of microscopes, spectroscopes and telescopes can be enhanced by orders of magnitude beyond the old classical limits by various advanced techniques, including de-convolution of the CW response function of these instruments. Heisenberg's original analogy of the limited resolution of a microscope, used to support his mathematical uncertainty relation, is no longer justifiable today. Modern techniques of detecting single isolated atoms through fluorescence also override this generalized uncertainty principle. Various nano-technology techniques are also making atoms observable and their locations precisely measurable. Even the traditional time-frequency uncertainty relation or bandwidth limit δν δt ≥ 1 can be circumvented while doing spectrometry with short pulses by deriving and de-convolving the pulse-response function of the spectrometer, just as we do for CW input.
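For reference, the conventional time-bandwidth bound the abstract argues around is an order-unity product; for a transform-limited Gaussian pulse the intensity-FWHM product has the closed form 2 ln 2 / π. The sketch below simply evaluates that textbook value (it is not the paper's deconvolution method, and the pulse duration is an arbitrary illustrative choice):

```python
import math

# Transform-limited Gaussian pulse: field E(t) = exp(-t^2 / (2 sigma^2))
sigma = 10e-15  # 10 fs field standard deviation, illustrative

dt_fwhm  = 2.0 * math.sqrt(math.log(2.0)) * sigma        # intensity FWHM in time
dnu_fwhm = math.sqrt(math.log(2.0)) / (math.pi * sigma)  # intensity FWHM in frequency

print(f"time-bandwidth product = {dt_fwhm * dnu_fwhm:.3f}")  # 2 ln 2 / pi ~ 0.441
```

Note the product is independent of sigma: shortening the pulse broadens the spectrum in exact proportion, which is the constraint the abstract's deconvolution argument targets.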
Arising of entangled photon in the high finesse nanocavity
Vladislav Cheltsov, Anton Cheltsov
This work is a continuation of papers presented at the Optics and Photonics Symposium 2009 and is devoted to the case of a high finesse nanocavity with average photon escape rate Γ = ηc/R << g, the coupling constant. The case is of special interest as a possible candidate for a qubit in quantum computers. The probability distribution to find the photon in (ωk, t)-space, investigated in the interval 0 ≤ Γ/4g << 1, has a triplet structure with a very low central peak and satellites at frequencies ≈ (ωa ± g). The latter emerge as a result of emission from the two upper atomic split sublevels. The peak is produced by ensuing reabsorptions of the satellites by the atom through its upper sublevels. Oscillating as t²·cos(gt) and decaying fast, the peak is accompanied by the simultaneously arising satellites. When the peak disappears the satellites become stable. The stability is quenched by the continuum of final states. The profile of the structure, consisting of two identical components, has the time dependence Γt·exp(−Γt/4), with maximum attained for Γ/4g = 0.05, and the width of the satellites is an order of magnitude less than the distance between them. These components with frequencies (ωa ± g) have average photon energies equal to 1/2(ωa ± g), where the factor 1/2 accounts for the normalization condition. The satellite amplitudes reach maximum at Γ/4g = 0.05, and the average photon cavity lifetime is proportional to 4lnΓ/Γ. We name the structure an "entangled photon."
Revisiting Derivations and QM Concepts II
Understanding the masses of elementary particles: a step towards understanding the massless photon?
A so far unnoticed simple explanation of elementary particle masses is given by m = N · melectron/α, where α (≈ 1/137) is the fine structure constant. On the other hand, photons can be described by two oppositely oscillating clouds of e/√α elementary charges. Such a model describes a number of features of the photon in a quantitatively correct manner. For example, the energy of the oscillating clouds is E = hν, the spin is 1 and the spatial dimension is λ/2π. When the charge e/√α is assigned to the Planck mass mPl, the resulting charge density is e/(mPl√α) = 8.62 × 10^-11 C/kg. This is identical to √(G/ko), where G is the gravitational constant and ko the Coulomb constant. When one assigns this very small charge density to any matter, gravitation can be completely described as Coulomb interaction between such charges of the corresponding masses. Thus, there is a tight quantitative connection between the photon, nonzero rest masses and gravitation / Coulomb interaction.
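The stated equality e/(mPl√α) = √(G/ko) is in fact an exact identity, since mPl = √(ħc/G) and e² = α ħc/ko; substituting gives e/(mPl√α) = √(ħc/ko)/√(ħc/G) = √(G/ko). A plain arithmetic check with CODATA values (independent of the paper's model) confirms the quoted 8.62 × 10^-11 C/kg:

```python
import math

# CODATA 2018 values (SI units)
e     = 1.602176634e-19   # elementary charge, C
m_Pl  = 2.176434e-8       # Planck mass, kg
alpha = 7.2973525693e-3   # fine-structure constant
G     = 6.67430e-11       # gravitational constant, m^3 kg^-1 s^-2
k0    = 8.9875517923e9    # Coulomb constant, N m^2 C^-2

lhs = e / (m_Pl * math.sqrt(alpha))  # charge-to-mass ratio from the abstract
rhs = math.sqrt(G / k0)

print(f"e/(m_Pl*sqrt(alpha)) = {lhs:.4e} C/kg")
print(f"sqrt(G/k0)           = {rhs:.4e} C/kg")
```

Both expressions evaluate to about 8.62e-11 C/kg, agreeing to within the rounding of the tabulated constants.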
Diffraction described by virtual particle momentum exchange: the "diffraction force"
Particle diffraction can be described by an ensemble of particle paths determined through a Fourier analysis of a scattering lattice where the momentum exchange probabilities are defined at the location of scattering, not the point of detection. This description is compatible with optical wave theories and quantum particle models and provides deeper insight into the nature of quantum uncertainty. In this paper the Rayleigh-Sommerfeld and Fresnel-Kirchhoff theories are analyzed for diffraction by a narrow slit and a straight edge to demonstrate the dependence of particle scattering on the distance of virtual particle exchange. The quantized momentum exchange is defined by the Heisenberg uncertainty principle and is consistent with the formalism of QED. This exchange of momentum manifests the "diffraction force" that appears to be a universal construct, as it applies to neutral and charged particles. This analysis indicates virtual particles might form an exchange channel that bridges the space of momentum exchange.
Quantum points/patterns, Part 1: from geometrical points to quantum points in a sheaf framework
We consider a generalization of the theory of quantum states, based on an analysis of long-standing problems and the unsatisfactory situation with the possible interpretations of quantum mechanics. We demonstrate that the consideration of quantum states as sheaves can provide, in principle, a deeper understanding of some phenomena. The key ingredients of the proposed construction are the families of sections of sheaves with values in the category of the functional realizations of infinite-dimensional Hilbert spaces with special (multiscale) filtration. Three different symmetries, kinematical (on space-time), hidden/dynamical (on sections of sheaves), and unified (on the filtration of the full scale of spaces), are generic objects generating the full zoo of quantum phenomena.
Quantum points/patterns, Part 2: from quantum points to quantum patterns via multiresolution
It is obvious that we still do not have a unified framework covering the zoo of interpretations of Quantum Mechanics, nor a satisfactory understanding of the main ingredients of phenomena like entanglement. The starting point is the idea of properly describing the key ingredient of the area, namely point/particle-like objects (physical quantum points/particles or, at least, structureless but quantum objects), and of replacing point (wave) functions by sheaf wave functions (Quantum Sheaves). In such an approach Quantum States are sections of the coherent sheaves, or contravariant functors from the kinematical category describing space-time to another one, the Quantum Dynamical Category, properly describing the complex dynamics of Quantum Patterns. The objects of this category are filtrations on the functional realization of the Hilbert space of Quantum States. In this Part 2, the sequel of Part 1, we present a family of methods which can describe important details of complex behaviour in quantum ensembles: the creation of nontrivial patterns, localized, chaotic, entangled or decoherent, from the fundamental basic localized (nonlinear) eigenmodes (in contrast with orthodox Gaussian-like modes) in various collective models arising from the quantum hierarchies described by Wigner-like equations.
Waves, Photons, and Computing Logics
Wave-particle duality? not in optical computing
Metaphysics has only one absolute requirement: it must account for the known physics. But many metaphysics account for light, and they cannot all be right. We have only one metaphysical principle that is widely accepted (Einstein's minimum simplicity rule) and it gives no single answer. Even if we could enforce it, how would we prove its validity without a (meta)^3 principle? People like me who work with light are never confused about whether we are dealing with a particle or a wave. I find it useful to view light in terms even broader than the usual wave-particle description. I add a third kind of wave that is not measurable but also not restricted by the physics of the measurable. I find it difficult to account for light any other way.
Cryptography and system state estimation using polarization states
Subhash Kak, Pramode Verma, Greg MacDonald
We present new results on cryptography and system state estimation using polarization states of photons. Current quantum cryptography applications are based on the BB84 protocol which is not secure against photon siphoning attacks. Recent research has established that the information that can be obtained from a pure state in repeated experiments is potentially infinite. This can be harnessed by sending a burst of photons confined to a very narrow time window, each such burst containing several bits of information. The proposed method represents a new way of transmitting secret information. While polarization shift-keying methods have been proposed earlier, our method is somewhat different in that it proposes to discover the polarization state of identical photons in a burst from a laser which codes binary information. We also present results on estimating the state of a system based on the polarization of the received photons which can have applications in intrusion detection.
Space as a Medium and its Properties
The extraterrestrial Casimir Effect
Riccardo C. Storti
Application of the Electro-Gravi-Magnetic (EGM) Photon radiation method to the Casimir Effect (CE) suggests that the experimentally verified (terrestrially) neutrally charged parallel-plate configuration force may differ within extraterrestrial gravitational environments from the gravitationally independent formulation by Casimir. Consequently, the derivation presented herein implies that a gravitationally dependent CE may become an important design factor in nanotechnology for extraterrestrial applications (ignoring finite conductivity + temperature effects and evading the requirement for Casimir Force corrections due to surface roughness).
Doppler phenomena determined by photon-cosmic field interactions
Viraj Fernando
This paper concerns a new field of study: Energy Mechanics. It is discerned that all systems are open and that there is an ingress and egress of energy between particles or photons and the cosmic governing field in all interactions without exception. All phenomena are determined by nature's algorithms, which regulate the ingress and egress of energy from and to the cosmic governing field. All interactions, whether gravitational or electrical, motions of particles or of photons, follow the same or similar patterns with the mediation of the very same field. Doppler shifts, aberration, and Fresnel's formula for the motion of light are all explained as photon-cosmic-field interactions determined by algorithms. What this means is that we have stumbled onto the "Unified Field" that Einstein unsuccessfully and desperately searched for over more than 70% of his adult life (the last 35 years of his life).
The constancy of "c" everywhere requires the cosmic space to be a stationary and complex tension field
C. Roychoudhuri, A. Michael Barootkoob, Michael Ambroselli
Atoms and molecules that emit light do not impart the ultimate velocity "c" on the emitted photon wave packets. Their propensity for perpetually propagating at this highest velocity in every possible direction must be leveraging a sustaining complex cosmic tension field (C2TF; the ether of past centuries), which constitutes space itself and is hence stationary. Then the null results of Michelson-Morley experiments, the positive and null results of Fresnel-drag experiments, and the positive Bradley telescope aberration should be explained without a drag of the C2TF by the Earth. We support this previously rejected hypothesis through various self-consistent arguments and experiments. We present a null result for longitudinal Fresnel drag, in contrast to Fizeau's positive result, since we did not introduce any relative velocity between the light source and the phase-delay-introducing material in our interferometer. We also propose that the C2TF has a built-in weak dissipative property towards electromagnetic waves, so their frequency decreases very slowly with the distance of propagation through the C2TF. This hypothesis would eliminate the need for an expanding universe. We recast the Hubble constant to accommodate the required Doppler shifts. The observable manifest universe consists only of EM waves and material particles. For the C2TF to provide the unifying substrate for a new field theory, we need to hypothesize that all stable particles are localized, complex, 3D non-linear, resonant but harmonic undulations of the C2TF. The non-linear strengths of the localized resonant undulations also introduce spatially extended but distance-dependent distortions around the sites of the resonances. These distortions are effectively different kinds of potential gradients manifest on the substrate of the C2TF, giving rise to the various forces, of which we now recognize four.
The origin of mass is purely the inertia of movement of these resonances along the different potential gradients they experience. We further assert that the notion of self-interference, either for EM waves or for particles, proposed in support of the hypothesis of wave-particle duality, is logically inconsistent with our currently successful mathematics, and hence we should abandon this unnecessary duality hypothesis within the formalism of current QM.
The necessity of two fields in wave phenomena
Wave phenomena involve perturbations whose behavior is equivalent in space and time. The perturbations may be of very different nature, but they all have to conform to the notion of a field, that is, a scalar or vector quantity defined for all points in space. Some wave phenomena are described in terms of only one field, for example water waves, where the perturbation is the level above or below the equilibrium position. Nonetheless, electromagnetic waves require the existence of two fields. I shall argue that, in fact, all wave phenomena involve two fields, although we sometimes perform the description in terms of only one field. To this end, the concept of cyclic or dynamical equilibrium will be put forward, whereby the system continuously moves between two states in which it exchanges two forms of energy. In a mechanical system these may be, for example, kinetic and potential energy. Differential equations that form an Ermakov pair require the existence of two linearly independent fields. These equations possess an invariant. For the time dependent harmonic oscillator, such an invariant exists only for time dependent potentials that are physically attainable. According to this view, two fields must be present in any physical system that exhibits wave behavior. In the case of gravity, if it exhibits wave behavior, there must be a complementary field that also carries energy. It is also interesting that the complex cosmic tension field proposed by Chandrasekhar involves a complex field, because complex functions formally describe two complementary fields.
Diverse Photon Models I
The nature of the photon in the view of a generalized particle model
Albrecht Giese
The particle model presented here is able to explain the structure of leptons and quarks without reference to quantum mechanics. In particular, it explains quantitatively the existence of inertial mass without any use of a Higgs field. An essential difference from the Standard Model of present-day particle physics is that in the model presented, particles are viewed as extended rather than point-like. In addition, it becomes apparent that the strong force is the universal force effective in all particles.
The birth of a photon
Quantum mechanics gives no information on the details of the emission of a photon from a single atom or molecule, nor on the physical size of a photon. Experiments to answer such questions have, at least in principle, become possible only recently with the advent of single-molecule photon sources. If diffraction of light at wavelength-sized apertures is already a property of individual photons, some clear statements on emission can be made: the energy of an individual photon, which at the start of the process is concentrated in an Angstrom-sized atom or molecule, is diluted to micrometer dimensions, i.e. by many orders of magnitude. Since single photons can still be detected at least 18 m after their emission, the dilution must be terminated at some point. A photon model presented in an accompanying paper suggests a photon size of λ/2π and explains what stops and reverses the expansion process, thus starting an oscillation.
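For orientation, the size scale quoted above is easy to evaluate: at a visible wavelength of 500 nm, λ/2π is roughly 80 nm, about three orders of magnitude above Ångström dimensions, consistent with the "many orders of magnitude" dilution mentioned. A minimal arithmetic sketch (the wavelength choice is illustrative):

```python
import math

wavelength = 500e-9     # visible light, m (illustrative choice)
atom_size = 1e-10       # ~1 Angstrom, m
photon_size = wavelength / (2 * math.pi)   # size suggested by the accompanying paper

dilution = photon_size / atom_size         # linear dilution factor, atom -> photon
print(f"lambda/(2*pi) = {photon_size*1e9:.1f} nm, linear dilution ~ {dilution:.0f}x")
```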
Non-equilibrium mechanisms of light in the microwave region
Quantum mechanics and quantum chemistry have taught for more than 100 years that "photons" associated with microwaves cannot exert photochemical effects because their "photon energies" are smaller than chemical bond energies. Those quantum theories have been strongly contradicted within the last few decades by physical experiments demonstrating non-equilibrium photochemical and photomaterial activity by microwaves. Reactions among scientists to these physical models and proofs have ranged from disbelief and denial to acceptance of the phenomena and demands for revisions to quantum theory. At the previous "Nature of Light" meeting, an advance in the foundations of quantum mechanics was presented. Those discoveries have revealed the source of these conflicts between quantum theory and microwave experiments. Critical variables and constants were missing from quantum theory due to a minor mathematical inadvertence in Planck's original quantum work. As a result, erroneous concepts were formed nearly a century ago regarding the energetics and mechanisms of lower-frequency light, such as in the microwave region. The new discoveries have revealed that the traditional concept of "photons" mistakenly attributed elementary-particle status to what is actually an arbitrarily time-based collection of sub-photonic elementary particles. In a mathematical dimensional sense, those time-based energy measurements cannot be equivalent to bond energies as historically believed. Only an "isolated quantity of energy", as de Broglie referred to it, can be equivalent to a bond energy. With the aid of the new variables and constants, the non-equilibrium mechanisms of light in the microwave region can now be described. They include resonant absorption, splitting-frequency stimulation leading to electronic excitation, and resonant acoustic transduction. Numerous practical engineering applications can be envisioned for non-equilibrium microwaves.
Diverse Photon Models II
Creation and fusion of photons
Andrew Meulenberg Jr.
We seek physical mechanisms underlying a model of the interaction of light with light (and with matter) by examining the process of photon creation. A model of the atomic orbitals and optical transitions is described using a classical argument that does not presume the integer values of electron-orbital angular momentum proposed by Bohr. Assuming only the known properties of light (e.g., Eγ = hν and Lγ = h/2π), the electron-orbital energies about a nucleus (including a ground state) are predicted and calculated in an intuitive (and mathematically simple) manner. This first-order model considers neither electron spin nor relativistic effects. A ground state is predicted based on the reduced probability of coupling net energy from a driver into a lower-frequency oscillator. The long-term stability of the ground state is further explained in terms of the angular-momentum requirements of the photon. A simple derivation of the radius and angular momentum of photons is provided. It is proposed that, when a collinearly propagating photon density gets high enough, their fusion back into electromagnetic fields (spherical or plane wave) is a process similar to their formation, but in reverse. The similarities and differences will be described assuming a surface-tension-like mechanism necessary for the existence of photons.
Birth of a two body photon
Randy T. Dorn
The two body photon model assumes that an electron and a positron have been accelerated to the speed of light. The tenets of relativity theory would lead one to believe this to be impossible. It has long been thought that a body cannot be accelerated to the speed of light because the relativistic mass or momentum would become infinite. This conceptual problem is addressed with the realization that it is not necessary to resort to the concept of a velocity dependent mass. Instead, the force should be considered to be velocity dependent. The relativistic equations of motion can be rearranged and interpreted such that the force varies with velocity instead of the mass. When the velocity reaches the speed of light, instead of dividing by zero and interpreting the equations as implying a nonphysical infinite mass, the equations show a finite mass with an applied force of zero. The equations can still take on the indeterminate form of 0/0, but standard mathematical analysis of this situation will show that the equations predict that a body can reach the speed of light. Furthermore, under the influence of an inverse square force, a body can be accelerated to the speed of light over very short distances and provide the initial conditions necessary for the birth of a two body photon.
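The rearrangement described above corresponds, in standard relativity, to the statement that for a force parallel to the velocity, dp/dt = F with p = γmv is equivalent to m dv/dt = F(1 − v²/c²)^(3/2): a constant mass acted on by a velocity-dependent effective force that vanishes as v approaches c. A minimal numeric sketch of that rearrangement (mass and force values are illustrative; this shows only the standard equation of motion, not the paper's further claims about reaching c):

```python
import math

c = 299_792_458.0     # speed of light, m/s
m = 9.109e-31         # electron rest mass, kg
F = 1e-15             # constant applied force, N (illustrative)

# Standard rearrangement for force parallel to velocity:
#   dp/dt = F with p = gamma*m*v   =>   m*dv/dt = F*(1 - v^2/c^2)^(3/2)
# i.e. constant mass, velocity-dependent effective force.
v, t, dt = 0.0, 0.0, 1e-10
for _ in range(3000):
    a = (F / m) * (1.0 - (v / c) ** 2) ** 1.5   # effective acceleration
    v += a * dt
    t += dt

print(f"after t = {t*1e9:.0f} ns: v/c = {v/c:.3f}")   # below c, approaching it
```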
A QED-compatible wave theory of light, electrons, and their interactions
The photoelectric effect, the Compton effect, and now anticorrelation experiments have been claimed to prove that light consists of particles flying through a void. However, the particle model is decisively falsified by the known wave qualities of electromagnetic radiation. Quantum electrodynamics also uses spreading wave amplitudes, not flying particles. The evidence indicates that light is a wave and that electrons are electromagnetic wave-structures that absorb and emit light in discrete wave-packets. These wave-quanta are emitted directionally and then spread in space in proportion to their wavelength. Their waves superpose with all other waves in space. In low-intensity experiments, the waves that an electron in a photodetector absorbs come from the superposition of the source wave-quanta and background radiation. The importance of background radiation is evidenced by "dark counts". Treating electrons and the quanta they absorb and emit as particles, when they are composed of waves, is the source of all the paradoxes, unreality and confusion in quantum mechanics. The word "photon" should be replaced by "quantum" in the description of this particular electronic process. The conceptual model presented here explains the known phenomena without producing paradoxes and unifies quantum and classical electromagnetics.
The conservation of light's energy, mass, and momentum
An advance in the foundations of quantum mechanics was presented at the previous "Nature of Light" meeting which brings new insights to the conservation of light's energy, mass and momentum. Those discoveries suggest that the "photon" is a time-based collection of sub-photonic elementary light particles. Incorporation of this new understanding into quantum mechanics has allowed the determination of universal constants for the energy, mass, and momentum of light. The energy constant for light is 6.626 × 10⁻³⁴ J/osc, meaning the energy of a single oscillation of light is constant irrespective of the light's frequency or wavelength. Likewise, the mass and momentum of a single oscillation of light are constant, regardless of changes to either time or space. A realistic understanding of the conservation of energy, mass and momentum for both matter and light in a single conservation law is now possible. When a body with mass absorbs or emits light, its energy, mass and momentum change in quantized amounts according to the relationship ΔE = Nh~ = Nm₀c² = Nρ₀c, where N is the number of oscillations absorbed or emitted by the body and h~, m₀, and ρ₀ are the constant energy, mass and momentum of an oscillation. Implications extend from general relativity and gravity to space sails and light-driven nanomotors.
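Taking the abstract's framework at face value (this is the paper's proposal, not standard quantum mechanics), the per-oscillation mass and momentum constants follow from the energy constant as m₀ = h~/c² and ρ₀ = h~/c, so the three forms of the stated relationship are equal by construction. A minimal arithmetic sketch with illustrative numbers:

```python
h_osc = 6.626e-34     # the abstract's energy constant, J per oscillation
c = 2.998e8           # speed of light, m/s

# Per-oscillation constants implied by  Delta E = N*h~ = N*m0*c^2 = N*rho0*c
m0 = h_osc / c**2     # mass per oscillation, kg/osc
rho0 = h_osc / c      # momentum per oscillation, kg*m/s per osc

# Example: green light, nu = 6e14 Hz, observed for one second -> N = 6e14 oscillations
N = 6e14
dE = N * h_osc
print(f"m0 = {m0:.3e} kg/osc, rho0 = {rho0:.3e} kg*m/s/osc, Delta E = {dE:.3e} J")
```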
Analysis of spectrometric data and detection processes corroborate photons as diffractively evolving wave packets
Negussie Tirfessa, Chandrasekhar Roychoudhuri
In a previous paper, we proposed that photons are diffractively evolving classical wave packets: propagating undulations of the Complex Cosmic Tension Field (C2TF) after the excess energies are released by atomic or molecular dipoles as perturbations of the C2TF. The carrier frequency of the pulse exactly matches the quantum condition ΔE_mn = hν_mn, and the temporal envelope function creates the Lorentzian broadening of the measured spectral lines. In this paper we present both semiclassical and QM theories of emission and compare the QM-prescribed natural line width of emitted spectra with Doppler-free laser absorption spectra to further validate our photon model.
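The connection asserted above between the temporal envelope and the Lorentzian line shape is standard Fourier analysis: the power spectrum of a damped oscillation whose intensity decays with lifetime τ is a Lorentzian of FWHM 1/(2πτ). A minimal numeric check (carrier and lifetime values are illustrative, scaled down from optical frequencies for convenience):

```python
import numpy as np

nu0 = 50.0     # carrier frequency, Hz (illustrative; the physics scales to any band)
tau = 0.5      # intensity lifetime, s  ->  expected FWHM = 1/(2*pi*tau)
fs = 1000.0    # sampling rate, Hz
t = np.arange(0.0, 200.0, 1.0 / fs)   # window long enough for the decay to finish

# Damped oscillation: amplitude ~ exp(-t/(2*tau)), so intensity decays with lifetime tau
signal = np.exp(-t / (2.0 * tau)) * np.cos(2.0 * np.pi * nu0 * t)

spectrum = np.abs(np.fft.rfft(signal)) ** 2
freqs = np.fft.rfftfreq(t.size, 1.0 / fs)

# Full width at half maximum of the resulting spectral line
half = spectrum.max() / 2.0
above = freqs[spectrum >= half]
fwhm = above.max() - above.min()
print(f"measured FWHM = {fwhm:.4f} Hz, expected 1/(2*pi*tau) = {1/(2*np.pi*tau):.4f} Hz")
```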
Diverse Photon Models III
Studies on reaction kinetics under coherent microwave irradiations
M. Sato, J. Fukushima, K. Kashimura, et al.
Monochromatic, single-phase electromagnetic fields generate ordered motions of the electrons and ions in solid and liquid materials. This coherent motion stores kinetic energy. If the stored kinetic energy is large enough to destroy the crystal structure, phase deformations happen at temperatures far below those expected in the thermal-equilibrium state. If the coherent motion could couple to optical light generated by thermal motions of the material, it would excite stimulated Brillouin emission. The stimulated emission supplies energy under thermally non-equilibrium conditions.
Light's infinitely variable energy velocities in view of the constant speed of light
The discovery of Einstein's hidden variables was presented at the previous "Nature of Light" meeting, and revealed that Max Planck's famous quantum formula was incomplete. The complete quantum formula revealed a previously hidden energy constant for light, 6.626 × 10⁻³⁴ J/osc (the energy quantum of a single oscillation of light), and a measurement-time variable. The "photon" is a time-based collection of sub-photonic elementary particles, namely single oscillations of light. An understanding of the constant speed of light, as well as the relative and additive velocities of light's energy quantum, is now possible. What emerges is a remarkably fresh and yet classical perspective. Einstein's three-dimensional light-quantum model, applied to the recently discovered energy constant, suggests that the constant energy of an oscillation of light is distributed along its wavelength and is absorbed and emitted as a whole quantum. In a vacuum, light's oscillations travel at the constant (Lorentzian) speed of light regardless of their wavelength. The time rate (velocity) with which the whole energy quantum of an oscillation is absorbed or emitted varies with its wavelength: the longer the wavelength, the longer it takes for the entire oscillation energy to be absorbed. Light's infinitely variable energy velocities are consistent with the Galilean principle of relative and additive velocities. A realistic mechanism for superluminal absorption and emission becomes apparent, and a new corollary is found: light propagates from every transmitter at the same speed, but reaches receivers at different frequencies, depending on the relative speed of the transmitter and receiver.
Investigation concerning the radiation behaviour of an elementary dipole transition
The irradiance of an atomic dipole transition, screened at microscopic distances from its origin, reveals interesting details not always evident when dealing with light phenomena. The basis of this investigation is purely classical: the Hertz vector formalism was used (Born & Wolf). The special features of the electrodynamic radiation behaviour of such an atomic transition became evident only when commonly made simplifications were suspended. However, the complexity of the resulting equations forced us to treat the problem numerically. All computations were done for a dipole elongation of 0.1 Å with an oscillation frequency corresponding to the YAG-laser wavelength, λY = 1.064 μm. Strikingly, a Fourier analysis of the irradiance (Poynting vector) does not replicate this frequency; rather, it reveals harmonics. Up to ~0.1 μm the fourth harmonic dominates; the second harmonic also appears, albeit at a minor amount. Beyond 0.1 μm the fourth and second harmonics exchange their roles. Up to 100 nm from the dipole centre, sixth and eighth harmonics are also present, but at minor strengths. Outside the source centre the optical field is perceived as a light wave and, practically, instead of the presumed YAG frequency we measure double this frequency, namely green light. At distances below 0.1 μm the fourth harmonic prevails, being capable of driving two-photon absorption.
Virtual and real photons
Andrew Meulenberg Jr.
Maxwell did not believe in photons. However, his equations lead to electromagnetic field structures that are considered photonic by Quantum ElectroDynamics (QED). They are complete, relativistically correct, and unchallenged after nearly 150 years. However, even though his far-field solution has been considered the basis for photons, as the solutions stand and are interpreted, they are better fitted to the concept of virtual photons than to real ones. A comparison between static-charge fields, near-field coupling, and photonic radiation will be made and the distinctions identified. The question of similarities in, and differences between, the two will be addressed. Implied assumptions in Feynman's "Lectures" could lead one to believe that he had provided a general classical-electrodynamics proof that an orbital electron must radiate. While his derivation is correct, two of the conditions defined do not always apply in this case. As a result, the potential for misinterpretation of his proof (as he himself did earlier) in this particular case has some interesting implications. He did not make the distinction between radiation from a bound electron driven by an external alternating field and one falling in a nuclear potential. Similar failures lead to misinterpreting the differences between virtual and real photons.
Superposition and Interaction Process Models I
Evidence for unmediated momentum transfer between light waves
W. R. Hudgins, A. Meulenberg, S. Ramadass
Dowling and Gea-Banacloche (1992) proved mathematically that "...under certain circumstances, it is possible to consistently interpret interference as the specular reflection of two electromagnetic waves off each other..." Combining experiment, model, and logic, we confirm this statement. Contrary to the supposition that electromagnetic waves/photons cannot interact, it is possible to interpret the results as indicating that identical out-of-phase waves and opposite-polarity photons repel or, at least, cannot pass through each other. No energy is detected in the dark/null zones of wave interference. Because energy appears to be absent, the exchange of momentum through the null zone must redirect/repel light waves into bright zones. Our Zero-Slit Experiment (ZSX) provides diffraction-free interference in air between two portions of a divided laser beam. This experiment was initially an attempt to completely cancel a section of these two beams by recombining them in air 180° out of phase. We have reduced interference patterns close to a double bright zone (with 3 null zones), but no further. Within the limits of laser-beam spreading, we have studied the resulting interference patterns and compared them with models of collisions between identical particles. If the model of opposite phases repelling is valid, it may be possible to distinguish light from other bosons. An EM field-line model of the photon is presented to explain the interactions needed to produce momentum transfer.
Why does the wave-particle dualism appear to become evident particularly at optical wavelengths?
At radio wavelengths, "photons" are not really needed; the description of this part of the electromagnetic spectrum as waves is completely satisfying. In turn, in the range of cosmic radiation, the description of photons as "particles" is sufficient. However, this may not be intrinsic basic physics but a consequence of the choice of detectors. Radio detectors are solid metal rods (antennae). The energy density of a radio photon is much smaller than that of the antenna, and its wavelength much larger. In the range of cosmic radiation, the detectors are essentially single atoms; the energy density of such a photon is much higher, its wavelength much smaller. The primary process of photon detection at optical wavelengths usually occurs at a single atom, a limited group of atoms, or a band in a solid-state detector. There, the energy density is comparable to the energy density of the detected photon. Depending on the detailed conditions, a wave or a beam of particles is perceived.
Nature of EM waves as observed and reported by detectors for radio, visible, and gamma frequencies
Michael Ambroselli, Peter Poulos, Chandrasekhar Roychoudhuri
Characteristic transformations measured by detectors indicate what conclusions we can draw about the signal we study. This paper will review the models behind the interaction processes experienced by detectors sensing radio, visible and gamma-ray signals and analyze why the conclusions about the nature of EM-waves should necessarily appear to be different and contradictory. The physical interaction processes between the EM-waves and the material in the different detectors - the LCR circuit, photo and gamma detectors - differ considerably. So drawing a definitive conclusion regarding the nature of EM-waves is fraught with pitfalls. We discuss how to bring conceptual continuity among these different interpretations of the interaction processes by imposing some logical congruence upon them.
Physical processes behind a Ti:Sa femtosecond oscillator
M. Fernández-Guasti, E. Nava, F. Acosta, et al.
The sharply peaked comb structure that arises from a mode-locked cavity is usually explained in terms of the superposition of monochromatic waves whose wavelengths fit an integer number of times into the round trip of the cavity. However, the non-interaction (or non-interference) of waves implies that these wave-fields cannot sum themselves to reorganize either their amplitudes or their energies. The summation has to be carried out either by a nonlinear medium whose output involves wave-mixing, and/or by the detector. The output of a femtosecond Titanium-Sapphire oscillator is analyzed with the above framework in mind. The spectrum is obtained in mode-locked and non-mode-locked operation via a grating spectrometer for different cavity detunings. The time dependence is measured via a fast photodiode to record the repetition rate. A frequency-resolved optical gating (FROG) device is used to resolve the temporal shape of the femtosecond pulses. The data is examined from two viewpoints: the superposition process is carried out (a) by the field amplitudes themselves, or (b) by some interacting material dipoles.
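The comb-superposition picture referred to above is easy to reproduce numerically: summing N phase-locked modes spaced by the repetition rate yields a pulse train with period equal to the round-trip time and peak intensity N². Where that summation physically happens (in a nonlinear medium or in the detector) is precisely the question the paper raises; the sketch below is only the conventional mathematical picture, with illustrative parameters:

```python
import numpy as np

rep_rate = 100e6   # mode spacing = cavity repetition rate, Hz (illustrative)
n_modes = 51       # number of phase-locked modes (optical carrier omitted: envelope only)

t = np.linspace(0.0, 3.0 / rep_rate, 6001)   # three round-trip periods
field = np.zeros_like(t, dtype=complex)
for k in range(n_modes):
    field += np.exp(2j * np.pi * (k - n_modes // 2) * rep_rate * t)

intensity = np.abs(field) ** 2
# Locked phases -> a pulse train with period 1/rep_rate and peak intensity N^2;
# the same N modes with random phases would average to only ~N everywhere.
print(f"peak intensity = {intensity.max():.0f} (N^2 = {n_modes**2})")
```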
Re-interpreting "coherence" in light of Non-Interaction of Waves, or the NIW-Principle
The autocorrelation theorem, or Wiener-Khintchine theorem, plays a pivotal role in optical coherence theory. Its proof derives from the time-frequency Fourier theorem. The derivation requires either dropping the cross-products (interference terms) between the field amplitudes corresponding to different frequencies, or taking the time integration over the entire duration of the signal. The physical interpretation of these mathematical steps implies either (i) non-interference (non-interaction) between different frequencies, or (ii) that the registered data is valid for interpretation when the detector is set for long time integration. We have already proposed the generic principle of Non-Interaction of Waves (NIW), or absence of interference between light beams. The hypothesis of non-interaction between different frequencies was used by Michelson to frame the theory behind his Fourier transform spectroscopy, which is correct only when the detector possesses a long integrating time constant, like a human eye, a photographic plate, or a photodetector circuit with a long LCR time constant. A fast detector gives a heterodyne signal. So the correlation factor derived by the prevailing coherence theory, and measured through fringe visibility, is essentially a quantum property of the detecting molecules compounded by the rest of the follow-on instrumentation. Low-visibility fringes (a low correlation factor) do not reflect an intrinsic property of light alone; they are a joint light-matter response characteristic. We therefore re-define coherence by referring directly to the key characteristics of the light beams being analyzed: (i) spectral correlation (presence of multiple frequencies), (ii) temporal correlation (time-varying amplitude of light), (iii) spatial correlation (independent multi-point sources), and (iv) complex correlation (a mixture of the previous characteristics).
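Operationally, the Wiener-Khintchine theorem invoked above states that the autocorrelation of a signal and its power spectrum form a Fourier-transform pair. A minimal numeric check with a two-frequency beat signal (parameters illustrative):

```python
import numpy as np

fs = 1000.0
t = np.arange(0.0, 4.0, 1.0 / fs)
# A two-frequency field amplitude (real), i.e. a beat signal
x = np.cos(2 * np.pi * 60.0 * t) + 0.5 * np.cos(2 * np.pi * 75.0 * t)

n = t.size
# Circular autocorrelation computed directly ...
acf_direct = np.array([np.dot(x, np.roll(x, -k)) for k in range(n)]) / n
# ... and via the power spectrum (Wiener-Khintchine: ACF and |FFT|^2 are an FT pair)
acf_wk = np.real(np.fft.ifft(np.abs(np.fft.fft(x)) ** 2)) / n

print(f"max |difference| = {np.abs(acf_direct - acf_wk).max():.2e}")
```

The two routes agree to floating-point precision; the paper's point is about which physical system, if any, performs this integration.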
Superposition and Interaction Process Models II
Appreciating the principle of Non-Interaction of Waves (NIW-principle) by modeling Talbot diffraction patterns at different planes
We present an approach to demonstrating the Non-Interaction of Waves (NIW) principle by showing that dark fringes in the near-field Talbot diffraction patterns are not devoid of energy. We believe that a detector is simply incapable of absorbing any energy at the dark-fringe locations because the resultant of the induced stimulations on a local detecting dipole due to all the E-vectors is zero; the joint stimulation is strongest at the bright-fringe locations. The amplitude (and hence potentially detectable energy) flow through the dark-fringe locations is demonstrated by obstructing the bright-fringe locations at the half-Talbot plane with a grating identical to the one that generated this diffraction image. Then, by propagating the transmitted complex amplitudes through the dark fringes, we show that the Talbot plane can still receive more energy than could have been recorded from those same dark-fringe locations at the half-Talbot plane.
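For readers unfamiliar with the geometry, the Talbot self-imaging that this demonstration builds on follows from standard paraxial diffraction: a grating of period d reproduces itself at the Talbot distance z_T = 2d²/λ, with a laterally shifted image at the half-Talbot plane. A minimal angular-spectrum sketch (parameters illustrative; this reproduces the standard result independently of the NIW interpretation):

```python
import numpy as np

lam = 500e-9    # wavelength, m (illustrative)
d = 50e-6       # grating period, m
zT = 2 * d**2 / lam                     # Talbot self-imaging distance (= 1 cm here)

periods, samples = 32, 128              # grating periods in window, samples per period
n = periods * samples
L = periods * d
grating = ((np.arange(n) // (samples // 2)) % 2 == 0).astype(float)  # binary grating

def propagate(u, z):
    """Paraxial angular-spectrum propagation over distance z."""
    fx = np.fft.fftfreq(n, L / n)
    H = np.exp(-1j * np.pi * lam * z * fx**2)   # Fresnel transfer function
    return np.fft.ifft(np.fft.fft(u) * H)

img = np.abs(propagate(grating, zT)) ** 2
err = np.abs(img - grating).max()       # intensity of a 0/1 grating equals itself
print(f"z_T = {zT*100:.2f} cm, self-image max error = {err:.2e}")
```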
Visualizing the mode evolution process in passive and active cavities based on the NIW-Principle
This paper presents results of computer models depicting the evolution of diffractive processes through passive and active cavities (a traditional stable resonator and a single-mode fiber) as the number of passes (or the length of propagation) increases. The objective is to visualize how the spatially stable eigenmodes evolve with propagation. Our core assumptions are the validity of the Huygens-Fresnel hypothesis of secondary wavelets and the Non-Interaction of Waves (NIW) principle recently articulated in this conference series. The NIW principle implies that even the diffracted secondary wavelets propagate conjointly but without re-arranging their spatial energy distribution unless one inserts some interacting material medium within the diffracting beam. Accordingly, we anticipate that the evolution of the measurable energy distribution in the diffraction pattern will be different in the presence of a gain medium whose gain profile varies in the direction orthogonal to the cavity axis. We also anticipate that a cavity with a high gain profile will generate the stable spatial eigenmode in fewer passes through the cavity than one with lower gain, or no gain. We will also present the mode evolution process when the seed signal is a pulse shorter than the cavity length. We believe this paper will provide useful insights to students encountering spatially well-defined Gaussian laser modes for the first time.
Coherence and frequency spectrum of a Nd:YAG laser: generation and observation devices
The coherence of a CW Nd:YAG laser is analyzed using a Michelson interferometer. Fringe contrast is measured as the path difference is varied by changing the length of one arm. The fringe contrast, as expected, is maximum when there is no path difference between the arms. However, the fringe contrast does not decrease monotonically: it decreases and then increases several times before fading away. This behaviour is reminiscent of fringe contrast depending on aperture and the uncovering of Fresnel zones. In order to evaluate the mode structure it is necessary to consider the geometric parameters and Q factor of the cavity, the gain curve of the medium, and the type of broadening. The non-interference of waves principle requires that competition between two (or more) modes, or their interference, can only take place through nonlinear interaction with matter. It is therefore also important to consider the setup and the type of detectors employed to monitor the frequency and/or time dependence. Inasmuch as speckle is recognized as an interference phenomenon taking place at the detector plane, say the retina, the role of the sensing element in the detection of mode beats should also be decisive.
Visualizing superposition process and appreciating the principle of non-interaction of waves
Michael Ambroselli, Chandrasekhar Roychoudhuri
We demonstrate the dynamic evolution of superposition effects to underscore the importance of visualizing interaction processes. The model recognizes the principle of Non-Interaction of Waves. Recordable linear fringes, bisecting the Poynting vectors of the two crossing beams, have time-evolving amplitude patterns in the bright fringes because the two superposed E-vectors oscillate through zero values while staying locked in phase. If a detector registers steady, stable bright fringes, it must do so by time integration. The QM recipe of modeling energy exchange by taking the square modulus of the sum of the complex amplitudes has this time integration built into it. Thus, we will also underscore the importance of assigning proper physical processes to the mathematical relationships: the algebraic symbols should represent physical parameters of the interactants, and the mathematical operators connecting the symbols should represent allowed physical interaction processes and the guiding force.
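The fringe geometry described above is that of two plane waves crossing at half-angle θ: the time-averaged intensity forms cos² fringes of spacing Λ = λ/(2 sin θ), even though the instantaneous E-field in a bright fringe oscillates through zero, which is the time-integration point the paper makes. A minimal sketch of the time-averaged pattern (wavelength and angle are illustrative):

```python
import numpy as np

lam = 633e-9                          # He-Ne wavelength, m (illustrative)
theta = np.deg2rad(1.0)               # half-angle between the crossing beams
spacing = lam / (2 * np.sin(theta))   # expected fringe spacing Lambda

x = np.linspace(0.0, 5 * spacing, 5000)
k = 2 * np.pi / lam

# Time-averaged intensity of two unit-amplitude plane waves crossing at +/- theta:
#   <|E1 + E2|^2> = 2 + 2 cos(2 k x sin(theta))  -> cos^2 fringes along the bisector
intensity = 2 + 2 * np.cos(2 * k * np.sin(theta) * x)

# Recover the spacing from adjacent interior maxima
interior = (intensity[1:-1] >= intensity[:-2]) & (intensity[1:-1] > intensity[2:])
peaks = x[1:-1][interior]
print(f"expected spacing = {spacing*1e6:.2f} um, "
      f"measured = {(peaks[1] - peaks[0])*1e6:.2f} um")
```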
Being Aware of Our Diverse Epistemologies I
The nature of light in Indian epistemology
Indian epistemology is of interest to the physicist for its framework of reality includes observers in a fundamental manner. The nature of light in this epistemology is presented with a background on the principal ideas related to space, time and matter. In the Indian physics tradition of Vaiseshika, observables arise as a consequence of motion of which two kinds, intrinsic and extrinsic, are postulated. The atom in itself is not observable for it is taken to be an abstraction with a potential to acquire various attributes based on multiplied arrangements and vibrations. Other Indian approaches to epistemology privilege the subject over matter. In contrast to Western approaches where it is an epiphenomenon, consciousness is taken to have a real existence.
Two-slit interference and wave-particle duality for single photons from Observer's Mathematics point of view
When we consider and analyze physical events with the purpose of creating corresponding mathematical models, we often assume that the mathematical apparatus used in modeling, at least the simplest mathematical apparatus, is infallible. In particular, this relates to the use of "infinitely small" and "infinitely large" quantities in arithmetic and the use of the Newton-Cauchy definitions of limit and derivative in analysis. We believe that is where the main problem lies in the contemporary study of nature. We have introduced a new concept of Observer's Mathematics (see www.mathrelativity.com). Observer's Mathematics creates new arithmetic, algebra, geometry, topology, analysis and logic which do not contain the concept of continuum, but locally coincide with the standard fields. We proved the following theorems: 1) Theorem A (two-slit interference). Let Ψ1 be the wave from slit 1, Ψ2 the wave from slit 2, and Ψ = Ψ1 + Ψ2. Then the probability that Ψ is a wave equals 0.5. 2) Theorem B (wave-particle duality for single photons). If v is small enough, then λ is a random variable.
Two types of arguing in physics: a critical discussion
Arguments in physics regarding light are often based either on mechanistic concepts or on a "which way" discussion where the question of "indistinguishability" is crucial. The latter kind of argument is based on the concept of an indivisible, point-like photon, a concept stemming historically from the most common explanation of the photoelectric effect. There seems, however, to be an important lack of consistency between arguments based on indivisible particle-like photons and the actual quantum formalism for calculating detailed outcomes of various experiments. Crucial parts of the actual calculations of diffraction and interference phenomena are in fact very similar in the classical and the quantum descriptions, being based either on classical electromagnetic fields or on quantum fields; it is the interpretations that differ. It would be interesting to downgrade the concept of an indivisible particle-like photon and upgrade the importance of the quantum field description we really use in the detailed calculations. It is our impression that we could thereby avoid quite a few of the weird conclusions we live with today. A few examples are given.
Simple alternative model of the dual nature of light and its Gedanken experiment
François Hénault
This paper presents a simple alternative model of the dual nature of light, based on the deliberate inversion of the original statement from P. A. M. Dirac: "Each photon interferes only with itself. Interference between different photons never occurs." Such an inversion implies that photons and light quanta are considered different classes of objects, but it remains apparently compatible with results reported from several recent experiments. A Gedanken experiment with the potential capacity to test the proposed model in the single-photon regime is described, and its possible outcomes are discussed. The proposed setup could also be utilized to assess the modern interpretation of the principle of complementarity.
Being Aware of Our Diverse Epistemologies II
Beyond relativity and quantum mechanics: space physics
Albert Einstein imposed an observer-based epistemology upon physics. Relativity and Quantum Mechanics limit physics to describing and modeling the observer's sensations and measurements. Their "underlying reality" consists only of ideas that serve to model the observer's experience. These positivistic models cannot be used to form physical theories of Cosmic phenomena. To do this, we must again remove the observer from the center of physics. When we relate motion to Cosmic space instead of to observers and we attempt to explain the causes of Cosmic phenomena, we are forced to admit that Cosmic space is a substance. We need a new physics of space. We can begin by replacing Relativity with a modified Lorentzian-Newtonian model of spatial flow, and Quantum Mechanics with a wave-based theory of light and electrons. Space physics will require the reinterpretation of all known phenomena, concepts, and mathematical models.
Did Michelson and Morley test the wrong phenomenon?
"Our Universe does not have a real cosmic medium". "Light is fundamentally dual, comprising a wave aspect and a particle aspect". These two statements are jointly present in most physics textbooks, but they are logically incompatible. The argument against mediation is based on the outcome of the Michelson-Morley experiment. It only works under the assumption that light is a running wave in the ether, similar to a sound wave in the air. Without such an assumption, invariance only proves that the speed of light is not the speed of a wave. If light has a dual nature, then the speed of light is the speed of the particle aspect. In quantum mechanics, it can be described as the speed of the photon. Classical models would suggest that the speed of light is equal to the rate of mutual generation of electric and magnetic fields across space. Alternatively, the speed of light can be described as the speed of a train of fields, propagating like a stream of particles. There is no room for running waves in these models, except as a constituent element of static fields. For a coherent account of our knowledge, the second statement should be preserved, and the first one should be rejected.
Being Aware of Our Diverse Epistemologies III
Epistemology of quantum mechanics: the Växjö viewpoint
Andrei Khrennikov
This paper summarizes the experience of the Växjö series of conferences, the longest-running series of conferences on the foundations of quantum mechanics. One of the main lessons of this series is that the present state of development of quantum theory does not exclude the possibility of elaborating a local realistic interpretation. One such interpretation, the Växjö interpretation, combines realism and contextuality, and it has become recognized worldwide.
Experiment versus theory: do physicists still know the difference?
Physics as an experimental science has two facets. It plays an essential and indisputable role in the development of modern technologies, providing quantitative bases in the form of operational definitions of interactions and interactants. Physics also attempts to provide a coherent and concise description of the dynamics of the physical universe at macroscopic and microscopic scales. While its accomplishments in technological enterprises are a source of envy for other disciplines, physics has much less to celebrate in conceptual clarity as it attempts to describe the physical universe. As physicists engage intensely in the pursuit of fundamental discoveries in experiment and propound all-encompassing theories and models, the boundaries between experiment and theory become blurred. In modern times, theoretical assumptions are very much part of the preparation of experiments and the interpretation of measurement results. One must therefore question the very meaning of testing an experiment against theoretical predictions. In this paper, we examine this question with illustrative examples from the foundations of physics, cosmology, and particle physics.
The coming revolution in physics
Ted Silverman
The foundations of Relativity and Quantum theory are compromised by profound conceptual difficulties that have blocked the unification of these 'twin pillars' of physical science, and thus a fundamental understanding of gravity, light, and physical phenomena generally. Current efforts to circumvent these difficulties without directly addressing them, as exemplified by the approaches that go by the name of String Theory, are unlikely to lead to long-term success. Moreover, it is found that quantization-of-action and Lorentz invariance are general, emergent properties of dynamical systems comprising periodic fluctuations in the underlying degrees of freedom. Therefore, mathematical models that "wire in" these attributes, instead of deriving them from the underlying model, do not furnish a theoretical foundation for their interpretation. With these considerations in view, a framework for a quantum theory of gravity and a new understanding of light is presented and discussed.
Appreciation of the nature of light demands enhancement over the prevailing scientific epistemology
Based on attempts to resolve the various self-contradictory assumptions behind the prevailing belief in single-photon interference, we have analyzed the process steps behind our experimental measurements and named this process the Interaction Process Mapping Epistemology (IPM-E). It has helped us recognize that the quantum mechanical Measurement Problem has a far more universal and deeper root in nature. Our scientific theorization process suffers from a Perpetual Information Challenge (PIC), which cannot be overcome by elegant and/or sophisticated mathematical theories alone. Iterative, imaginative application of IPM-E must be used as a metaphorical analytic continuation to fill in the missing information. IPM-E has also guided us to recognize the generic NIW-principle (Non-Interaction of Waves) in the linear domain, which is not explicitly recognized in current books and literature. Superposition effects become manifest through light-matter interactions. A detecting dipole is stimulated by multiple superposed beams; it sums the simultaneous stimulations into a single resultant undulation, which then guides the resultant energy exchange. The consequent transformation in the detector corresponds to the observed fringes. These fringes represent neither interference of light nor the selective arrival or non-arrival of photons at the detector; photons do not possess any force of mutual interaction that could generate such a redistribution. Implementing IPM-E requires us to recognize the propensity for subjective interpretation with which we are burdened due to our evolutionary successes.
Based on attempts to resolve the problem of various self contradictory assumptions behind the prevailing belief on single photon interference, we have analyzed the process steps behind our experimental measurements and named the process as the Interaction Process Mapping Epistemology (IPM-E). This has helped us recognize that the quantum mechanical Measurement Problem has a much universal and deeper root in nature. Our scientific theorization process suffers from a Perpetual Information Challenge (PIC), which cannot be overcome by elegant and/or sophisticated mathematical theories alone. Iterative imaginative application of IPM-E needs to be used as a metaphorical analytical continuation to fill up the missing information gaps. IPM-E has also guided us to recognize the generic NIW-principle (Non-Interaction of Waves) in the linear domain, not explicitly recognized in current books and literature. Superposition effects become manifest through light-matter interactions. Detecting dipoles gets stimulated by multiple superposed beams; it sums the simultaneous multiple stimulations into a single resultant undulation, which then guides the resultant energy exchange. The consequent transformation in the detector corresponds to observed fringes. They neither represent interference of light; nor represent selective arrival or non-arrival of photons on the detector. Photons do not possess any force of mutual interaction to generate their redistribution. Implementation of IPM-E requires us to recognize our subjective interpretation propensity with which we are burdened due to our evolutionary successes.