
Wave–particle duality


Illustrative image of wave–particle duality, showing how the same phenomenon can be perceived in two different ways.

Wave–corpuscle duality, also called wave–particle duality, resolved an apparent paradox by showing that light can possess both particle properties and wave properties. According to classical physics there are clear differences between a wave and a particle: a particle occupies a place in space and has mass, while a wave extends through space, characterized by a definite velocity and zero mass. Today wave–particle duality is regarded as a concept of quantum mechanics according to which there are no fundamental differences between particles and waves: particles can behave like waves and vice versa (Stephen Hawking, 2001). This is a fact that has been confirmed experimentally on many occasions. It was introduced by Louis-Victor de Broglie, a French physicist of the early twentieth century. In 1924, in his doctoral thesis, he proposed the existence of matter waves, that is, that all matter has a wave associated with it. This revolutionary idea, founded on the analogy that radiation has an associated particle, a property already demonstrated at the time, aroused little interest, despite the soundness of its reasoning, since there was no evidence for it. Einstein, however, recognized its importance, and five years later, in 1929, De Broglie received the Nobel Prize in Physics for his work. His thesis stated that the wavelength of the wave associated with matter was

λ = h / p,

where h is Planck's constant and p is the momentum of the particle of matter.


Contents

1 History
2 Huygens and Newton
3 Fresnel, Maxwell and Young
4 Einstein and photons
5 De Broglie
6 Wave nature of larger objects
7 Theory and philosophy
8 Applications
9 See also
10 References
10.1 Notes

History

By the end of the nineteenth century, thanks to atomic theory, it was known that all matter is made up of elementary particles called atoms. Electricity was first thought of as a fluid, but Joseph John Thomson showed, in his experiments with cathode rays, that it consists of a stream of particles called electrons. All these discoveries led to the idea that a large part of Nature is composed of particles. At the same time, waves and their phenomena, such as diffraction and interference, were well understood. Light was therefore believed to be a wave, as Young's double-slit experiment and effects such as Fraunhofer diffraction had demonstrated. With the arrival of the twentieth century, however, problems appeared with this view. The photoelectric effect, as analyzed by Albert Einstein in 1905, showed that light also possesses particle properties. Later, electron diffraction was predicted and demonstrated experimentally, so electrons possessed properties that had been attributed both to particles and to waves.

This apparent conflict between particle and wave properties was resolved by the establishment of quantum mechanics in the first half of the twentieth century. Quantum mechanics serves as a unified framework for understanding that all matter can have both wave and particle properties. Every particle in nature, be it a proton, an electron, an atom or anything else, is described by a differential equation, generally the Schrödinger equation. The solutions to these equations are known as wave functions, since they are inherently wave-like in form. They can diffract and interfere, leading to the wave effects already observed. Moreover, the wave functions are interpreted as describing the probability of finding a particle at a given point in space: if one looks for a particle, the probability of finding it is given by the squared modulus of the wave function. In the macroscopic world the wave properties of objects are not observed because the wavelengths involved, as for a person, are far too small: the wavelength is, in essence, Planck's constant h, an extremely small number, divided by the object's momentum.
Huygens and Newton

Light as wave and corpuscle: two different theories converge thanks to quantum physics.

The first comprehensive theories of light were put forward by Christiaan Huygens, who proposed a wave theory of light and, in particular, showed that each point of an advancing wavefront is in fact the centre of a new disturbance and the source of a new train of waves. His theory, however, had weaknesses on other points and was soon overshadowed by Isaac Newton's corpuscular theory.

Building on the premises of his contemporaries, Newton proposed that light is formed of small particles, with which the phenomenon of reflection is easily explained. With somewhat more difficulty he could also explain refraction through lenses and the separation of sunlight into colours by a prism. Owing to Newton's enormous intellectual stature, his theory dominated for roughly a century, while Huygens's theory was forgotten. With the discovery of diffraction in the nineteenth century, however, the wave theory was recovered, and during the twentieth century the debate between the two survived for a long time.
Fresnel, Maxwell and Young

At the beginning of the nineteenth century, with the double-slit experiment, Young and Fresnel gave scientific support to Huygens's theories. The experiment showed that light passing through a slit displays a characteristic interference pattern similar to that of waves produced in water, and the wavelength can be calculated from such patterns. At the end of the same century Maxwell explained light as the propagation of an electromagnetic wave by means of Maxwell's equations. These equations, amply confirmed by experiment, brought Huygens's view back into acceptance.
Einstein and photons

Photoelectric effect: light knocks electrons out of the plate.

In 1905 Einstein achieved a remarkable explanation of the photoelectric effect, a hitherto troubling experiment that the wave theory was unable to explain. He did so by postulating the existence of photons, quanta of light with particle properties. In the photoelectric effect it was observed that a beam of light striking a metal plate produced a current in the circuit: presumably the light freed electrons from the metal, causing them to flow. Yet while a weak blue light sufficed to provoke this effect, even the strongest, most intense red light failed to do so. According to the wave theory, the strength or amplitude of a light wave is proportional to its brightness, so the brighter light should have been more than sufficient to drive electrons around the circuit; strangely, it was not. Einstein concluded that the electrons were expelled from the metal by the impact of individual photons. Each photon carried an energy E related to the frequency ν of the light by the equation:
E = hν
where h is Planck's constant (whose value is 6.626 × 10⁻³⁴ J·s). Only photons above a specific threshold frequency could drive a current of electrons. Blue light, for example, consisted of photons with enough energy to free the electrons from the metal, while red light did not. A more intense light above the threshold can free more electrons, but no amount of light below it can free a single one, however bright. Einstein won the Nobel Prize in Physics in 1921 for his theory of the photoelectric effect.
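The threshold behaviour described above can be illustrated numerically. The sketch below compares the photon energy E = hν for blue and red light against an illustrative work function; the sodium value and the exact wavelengths are assumptions chosen for the example, not figures from this article.

```python
# Photon energy E = h*f for blue vs red light, compared with an
# illustrative metal work function (rough textbook value for sodium).
h = 6.626e-34          # Planck's constant, J*s
eV = 1.602e-19         # 1 electronvolt in joules

def photon_energy_ev(wavelength_m, c=3.0e8):
    """Energy of one photon of the given wavelength, in eV."""
    frequency = c / wavelength_m
    return h * frequency / eV

work_function = 2.3    # assumed work function (~sodium), eV
for name, lam in [("blue (450 nm)", 450e-9), ("red (700 nm)", 700e-9)]:
    E = photon_energy_ev(lam)
    print(f"{name}: E = {E:.2f} eV -> ejects electrons: {E > work_function}")
```

With these assumed numbers, blue photons (~2.76 eV) clear the threshold while red photons (~1.77 eV) do not, no matter how many of them arrive.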
De Broglie

In 1924 the French physicist Louis-Victor de Broglie (1892-1987) formulated a hypothesis stating that all matter exhibits both wave-like and corpuscular characteristics, behaving in one way or the other depending on the specific experiment. In postulating this property of matter De Broglie relied on Albert Einstein's slightly earlier explanation of the photoelectric effect, which suggested the quantum nature of light. For Einstein the energy carried by light waves was quantized, distributed in small packets of energy, or quanta of light, later called photons, whose energy depends on the frequency of the light through the relation E = hν, where ν is the frequency of the light wave and h is Planck's constant. Einstein was thus proposing that in certain processes the electromagnetic waves that constitute light behave as corpuscles. De Broglie asked himself why the reverse should not also hold: a material particle (a corpuscle) could show the same behaviour as a wave. He related the wavelength λ (lambda) to the momentum of the particle through the formula:

λ = h / (m v),

where λ is the wavelength of the wave associated with a particle of mass m moving with velocity v, and h is Planck's constant. The product mv is also the magnitude p of the particle's momentum. The formula shows at once that as the mass of the body or its velocity increases, the wavelength shrinks considerably. The hypothesis was confirmed three years later for electrons, with the observation of electron-diffraction analogues of Young's double-slit experiment in two independent investigations. At the University of Aberdeen, George Paget Thomson passed a beam of electrons through a thin metal film and observed the predicted patterns. At Bell Laboratories, Clinton Joseph Davisson and Lester Halbert Germer guided their beam through a crystalline lattice. De Broglie's equation can be applied to all matter: macroscopic bodies would also have an associated wave, but since their mass is so large the wavelength is so small that their wave characteristics are impossible to detect. De Broglie received the Nobel Prize in Physics in 1929 for this hypothesis; Thomson and Davisson shared the 1937 Nobel Prize for their experimental work.
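The contrast between microscopic and macroscopic wavelengths can be made concrete with λ = h/(mv). The masses and speeds below are illustrative choices, not values from the article.

```python
# De Broglie wavelength lambda = h / (m*v) for a microscopic and a
# macroscopic object; masses and speeds are illustrative assumptions.
h = 6.626e-34  # Planck's constant, J*s

def de_broglie_wavelength(mass_kg, speed_m_s):
    return h / (mass_kg * speed_m_s)

electron = de_broglie_wavelength(9.109e-31, 1.0e6)   # electron at 10^6 m/s
ball     = de_broglie_wavelength(0.145, 40.0)        # 145 g ball at 40 m/s

print(f"electron: {electron:.3e} m")  # ~7.3e-10 m, atomic scale
print(f"ball:     {ball:.3e} m")      # ~1.1e-34 m, unobservably small
```

The electron's wavelength is comparable to atomic spacings, which is why crystal lattices diffract electron beams; the ball's wavelength is some 24 orders of magnitude below an atomic nucleus.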
Wave nature of larger objects

Similar experiments have been repeated with neutrons and protons, the most famous of them performed by Estermann and Otto Stern in 1929. More recent experiments with atoms and molecules show that they too act as waves. A series of experiments emphasizing the action of gravity on wave–particle duality were carried out in the 1970s using a neutron interferometer. Neutrons, part of the atomic nucleus, make up much of its mass and therefore much of ordinary matter. Neutrons are fermions and thus, in a certain sense, the quintessence of particles. Yet in the neutron interferometer they act not only as quantum-mechanical waves; those waves were also found to be directly subject to the force of gravity. This was no surprise, since gravity was known to deflect light and to act on photons (the Pound–Rebka experiment), but it had never before been observed to act on the quantum-mechanical waves of fermions, the constituents of ordinary matter. In 1999 researchers at the University of Vienna reported the diffraction of the C60 fullerene.1 The fullerene is a massive object, with an atomic mass of 720. Its de Broglie wavelength was 2.5 picometres, while the molecular diameter is about 1 nanometre, some 400 times larger. As of 2005 this was the largest object on which quantum-mechanical wave properties had been observed directly. The interpretation of these experiments still generates controversy, since the arguments of wave–particle duality and the validity of De Broglie's equation were assumed in their formulation.
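The 2.5 pm figure quoted above can be checked against λ = h/(mv). The beam speed is not stated in this article; the ~220 m/s used below is an assumed value typical of such molecular-beam experiments.

```python
# Order-of-magnitude check of the C60 wavelength quoted above:
# atomic mass 720 u, assumed beam speed ~220 m/s (assumption, not
# stated in this article), lambda = h/(m*v).
h = 6.626e-34       # Planck's constant, J*s
u = 1.6605e-27      # atomic mass unit, kg

mass_c60 = 720 * u
speed = 220.0       # m/s, assumed
lam = h / (mass_c60 * speed)

print(f"lambda = {lam*1e12:.2f} pm")        # ~2.5 pm
print(f"diameter/lambda = {1e-9/lam:.0f}")  # molecule ~400x larger
```

The result reproduces both numbers in the text: a wavelength near 2.5 pm and a molecular diameter roughly 400 times larger than it.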
Theory and philosophy

The paradox of wave–particle duality is resolved within the theoretical framework of quantum mechanics. That framework is deep and complex, and impossible to summarize briefly. Every particle in nature, be it photon, electron, atom or anything else, can be described in terms of the solution of a differential equation, typically the Schrödinger equation but also the Dirac equation. These solutions are mathematical functions called wave functions. Wave functions can diffract and interfere with one another or with themselves, along with the other predictable wave phenomena described in the double-slit experiment. The wave functions are often interpreted as giving the probability of finding the corresponding particle at a given point in space at a given moment. For example, in an experiment with a moving particle, one can look for the particle to arrive at a particular location at a given time using a detector aimed at that spot. While quantum behaviour follows well-defined deterministic functions (such as the wave functions), the solutions to such equations are probabilistic. The probability that the detector finds the particle is calculated using the integral of the product of the wave function and its complex conjugate. While the wave function can be thought of as a spreading of the particle through space, in practice the detector will either see or not see the whole particle in question; it can never see a portion of it, such as two thirds of an electron. Herein lies the strange duality: the particle propagates through space in a wave-like, probabilistic manner but arrives at the detector as a complete, localized corpuscle. This conceptual paradox has explanations in the form of the Copenhagen interpretation, the path-integral formulation, and the many-worlds interpretation. It is important to note that all these interpretations are equivalent and yield the same predictions, even though they offer very different philosophical readings. While quantum mechanics makes precise predictions about the outcomes of such experiments, their philosophical meaning is still being sought and debated. That debate has evolved as an extension of the effort to understand wave–particle duality. What does it mean for a proton to behave as a wave and as a particle? How can an antielectron be mathematically equivalent to an electron moving backwards in time under certain circumstances, and what does this imply for our one-directional experience of time? How can a particle tunnel through a barrier while a football cannot pass through a concrete wall? The implications of these facets of quantum mechanics continue to puzzle many who take an interest in them. Some physicists intimately involved in working out the rules of quantum mechanics have seen this philosophical debate over wave–particle duality as an attempt to impose human experience on the quantum world.
Since that world is by nature completely non-intuitive, quantum theory must be learned on its own terms, independent of intuition based on experience of the macroscopic world. The scientific merit of searching so deeply for a meaning of quantum mechanics is, for them, suspect. Bell's theorem and the experiments it inspires are a good example of this search for the foundations of quantum mechanics. From a physicist's point of view, the inability of the new quantum philosophy to satisfy a testable criterion, or to find a flaw in the predictive power of current theories, reduces it to a null position, even at the risk of degenerating into pseudoscience.
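The probabilistic reading described above can be sketched numerically: for a normalized wave function, the probability of detection in a region is the integral of ψ times its complex conjugate over that region. The Gaussian packet, grid spacing, and width below are arbitrary illustrative choices.

```python
# Minimal sketch: detection probability as the integral of |psi|^2
# for a normalized one-dimensional (real) Gaussian wave packet.
import math

sigma = 1.0
dx = 0.001
xs = [i * dx for i in range(-10000, 10000)]

# psi(x) ~ exp(-x^2 / (4 sigma^2)), normalized so that |psi|^2 integrates to 1
norm = (2 * math.pi * sigma**2) ** -0.25
psi = [norm * math.exp(-x**2 / (4 * sigma**2)) for x in xs]

total = sum(p * p for p in psi) * dx                    # integral of |psi|^2
left  = sum(p * p for p, x in zip(psi, xs) if x < 0) * dx

print(f"total probability: {total:.4f}")   # ~1.0 (normalized)
print(f"P(x < 0):          {left:.4f}")    # ~0.5 by symmetry
```

The whole packet integrates to 1 (the particle is found somewhere), while a detector covering only x < 0 fires with probability 1/2; when it does fire, it registers the entire particle, never a fraction of it.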

Applications

Wave–particle duality is exploited in the electron microscope, where the small wavelength associated with the electron can be used to view objects far smaller than those observable with visible light.
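A rough, non-relativistic estimate shows why this works: an electron accelerated through a voltage V has momentum p = sqrt(2meV), so λ = h/p. The 100 kV figure below is an assumed, typical accelerating voltage for a transmission electron microscope; relativistic corrections, ignored here, shorten the result slightly.

```python
# Non-relativistic electron wavelength after acceleration through V volts:
# p = sqrt(2*m*e*V), lambda = h/p.
import math

h = 6.626e-34    # Planck's constant, J*s
m = 9.109e-31    # electron mass, kg
e = 1.602e-19    # elementary charge, C

def electron_wavelength(volts):
    return h / math.sqrt(2 * m * e * volts)

print(f"100 kV: {electron_wavelength(1e5)*1e12:.2f} pm")  # a few picometres
```

A few picometres is some five orders of magnitude below the 400-700 nm of visible light, which is what lets electron microscopes resolve structures far beyond the reach of optical ones.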
See also

Quantum
Electromagnetism
Energy
Photon
Quantum mechanics (wave mechanics)
Wave motion
Wave
Light
Mass

References

R. Nave, "Wave–Particle Duality", HyperPhysics, Georgia State University, Department of Physics and Astronomy.
Anton Zeilinger, "Diffraction and interference with fullerene C60", University of Vienna.

Notes

1. M. Arndt, O. Nairz, J. Voss-Andreae, C. Keller, G. van der Zouw and A. Zeilinger, "Wave-particle duality of C60", Nature 401, 680-682, 14 October 1999.

Wave–particle duality
From Wikipedia, the free encyclopedia


Wave–particle duality postulates that all particles exhibit both wave and particle properties. A central concept of quantum mechanics, this duality addresses the inability of classical concepts like "particle" and "wave" to fully describe the behavior of quantum-scale objects. Standard interpretations of quantum mechanics explain this paradox as a fundamental property of the Universe, while alternative interpretations explain the duality as an emergent, second-order consequence of various limitations of the observer. This treatment focuses on explaining the behavior from the perspective of the widely used Copenhagen interpretation, in which wave–particle duality is one aspect of the concept of complementarity: that a phenomenon can be viewed in one way or in another, but not both simultaneously. The idea of duality originated in a debate over the nature of light and matter that dates back to the 17th century, when competing theories of light were proposed by Christiaan Huygens and Isaac Newton: light was thought either to consist of waves (Huygens) or of particles (Newton). Through the work of Max Planck, Albert Einstein, Louis de Broglie, Arthur Compton, Niels Bohr, and many others, current scientific theory holds that all particles also have a wave nature (and vice versa).[1] This phenomenon has been verified not only for elementary particles, but also for compound particles like atoms and even molecules. In fact, according to traditional formulations of non-relativistic quantum mechanics, wave–particle duality applies to all objects, even macroscopic ones; but because of their small wavelengths, the wave properties of macroscopic objects cannot be detected.[2]
Contents

1 Brief history of wave and particle viewpoints
2 The turn of the 19th century and the paradigm shift
2.1 Particles of electricity?
2.2 Radiation quantization
2.3 The photoelectric effect illuminated
3 Developmental milestones
3.1 Huygens and Newton
3.2 Young, Fresnel, and Maxwell
3.3 Planck's formula for black-body radiation
3.4 Einstein's explanation of the photoelectric effect
3.5 De Broglie's wavelength
3.6 Heisenberg's uncertainty principle
4 Wave behavior of large objects
5 Treatment in modern quantum mechanics
6 Alternative views
6.1 Particle-only view
6.2 Wave-only view
6.3 Neither-wave-nor-particle view
6.4 Relational approach to wave–particle duality
7 Applications
8 See also
9 Notes and references
10 External links

Brief history of wave and particle viewpoints

Aristotle was one of the first to publicly hypothesize about the nature of light, proposing that light is a disturbance in the element air (that is, a wave-like phenomenon). On the other hand, Democritus, the original atomist, argued that all things in the universe, including light, are composed of indivisible sub-components (light being some form of solar atom).[3] At the beginning of the 11th century, the Arabic scientist Alhazen wrote the first comprehensive treatise on optics, describing refraction, reflection, and the operation of a pinhole lens via rays of light traveling from the point of emission to the eye. He asserted that these rays were composed of particles of light. In 1630, René Descartes popularized and accredited in the West the opposing wave description in his treatise on light, showing that the behavior of light could be re-created by modeling wave-like disturbances in his universal medium ("plenum"). Beginning in 1670 and progressing over three decades, Isaac Newton developed and championed his corpuscular hypothesis, arguing that the perfectly straight lines of reflection demonstrated light's particle nature; only particles could travel in such straight lines. He explained refraction by positing that particles of light accelerated laterally upon entering a denser medium. Around the same time, Newton's contemporaries Robert Hooke and Christiaan Huygens, and later Augustin-Jean Fresnel, mathematically refined the wave viewpoint, showing that if light traveled at different speeds in different media (such as water and air), refraction could be easily explained as the medium-dependent propagation of light waves. The resulting Huygens–Fresnel principle was extremely successful at reproducing light's behavior and, subsequently supported by Thomas Young's discovery of double-slit interference, was the beginning of the end for the particle light camp.[4]

Thomas Young's sketch of two-slit diffraction of waves, 1803.

The final blow against corpuscular theory came when James Clerk Maxwell discovered that he could combine four simple equations, which had previously been discovered, along with a slight modification, to describe self-propagating waves of oscillating electric and magnetic fields. When the propagation speed of these electromagnetic waves was calculated, the speed of light fell out. It quickly became apparent that visible light, ultraviolet light, and infrared light (phenomena thought previously to be unrelated) were all electromagnetic waves of differing frequency. The wave theory had prevailed, or at least it seemed to. While the 19th century had seen the success of the wave theory at describing light, it had also witnessed the rise of the atomic theory at describing matter. In 1789, Antoine Lavoisier securely differentiated chemistry from alchemy by introducing rigor and precision into his laboratory techniques, allowing him to deduce the conservation of mass and categorize many new chemical elements and compounds. However, the nature of these essential chemical elements remained unknown. In 1799, Joseph Louis Proust advanced chemistry towards the atom by showing that elements combined in definite proportions. This led John Dalton to resurrect Democritus' atom in 1803, when he proposed that elements were composed of indivisible sub-components, which explained why the varying oxides of metals (e.g. stannous oxide and cassiterite, SnO and SnO2 respectively) possess a 1:2 ratio of oxygen to one another. But Dalton and other chemists of the time had not considered that some elements occur in monatomic form (like helium) and others in diatomic form (like hydrogen), or that water was H2O, not the simpler and more intuitive HO; thus the atomic weights presented at the time were varied and often incorrect.

Additionally, the formation of HO by two parts of hydrogen gas and one part of oxygen gas would require an atom of oxygen to split in half (or two half-atoms of hydrogen to come together). This problem was solved by Amedeo Avogadro, who studied the reacting volumes of gases as they formed liquids and solids. By postulating that equal volumes of elemental gas contain an equal number of atoms, he was able to show that H2O was formed from two parts H2 and one part O2. By discovering diatomic gases, Avogadro completed the basic atomic theory, allowing the correct molecular formulae of most known compounds, as well as the correct weights of atoms, to be deduced and categorized in a consistent manner. The final stroke in classical atomic theory came when Dmitri Mendeleev saw an order in recurring chemical properties, and created a table presenting the elements in unprecedented order and symmetry. But there were holes in Mendeleev's table, with no element to fill them in. His critics initially cited this as a fatal flaw, but were silenced when new elements were discovered that perfectly fit into these holes. The success of the periodic table effectively converted any remaining opposition to atomic theory; even though no single atom had ever been observed in the laboratory, chemistry was now an atomic science.
The turn of the 19th century and the paradigm shift

Particles of electricity?

At the close of the 19th century, the reductionism of atomic theory began to advance into the atom itself; determining, through physics, the nature of the atom and the operation of chemical reactions. Electricity, first thought to be a fluid, was now understood to consist of particles called electrons. This was first demonstrated by J. J. Thomson in 1897 when, using a cathode ray tube, he found that an electrical charge would travel across a vacuum (which would possess infinite resistance in classical theory). Since the vacuum offered no medium for an electric fluid to travel, this discovery could only be explained via a particle carrying a negative charge and moving through the vacuum. This electron flew in the face of classical electrodynamics, which had successfully treated electricity as a fluid for many years (leading to the invention of batteries, electric motors, dynamos, and arc lamps). More importantly, the intimate relation between electric charge and electromagnetism had been well documented following the discoveries of Michael Faraday and Clerk Maxwell. Since electromagnetism was known to be a wave generated by a changing electric or magnetic field (a continuous, wave-like entity itself) an atomic/particle description of electricity and charge was a non sequitur. And classical electrodynamics was not the only classical theory rendered incomplete.
Radiation quantization

Black-body radiation, the emission of electromagnetic energy due to an object's heat, could not be explained from classical arguments alone. The equipartition theorem of classical mechanics, the basis of all classical thermodynamic theories, stated that an object's energy is partitioned equally among the object's vibrational modes. This worked well when describing thermal objects, whose vibrational modes were defined as the speeds of their constituent atoms, and the speed distribution derived from egalitarian partitioning of these vibrational modes closely matched experimental results. Speeds much higher than the average speed were suppressed by the fact that kinetic energy is quadratic: doubling the speed requires four times the energy; thus the number of atoms occupying high-energy modes (high speeds) quickly drops off, because the constant, equal partition can excite successively fewer atoms. Low-speed modes would ostensibly dominate the distribution, since low-speed modes would require ever less energy, and prima facie a zero-speed mode would require zero energy and its energy partition would contain an infinite number of atoms. But this would only occur in the absence of atomic interaction; when

collisions are allowed, the low speed modes are immediately suppressed by jostling from the higher energy atoms, exciting them to higher energy modes. An equilibrium is swiftly reached where most atoms occupy a speed proportional to the temperature of the object (thus defining temperature as the average kinetic energy of the object). But applying the same reasoning to the electromagnetic emission of such a thermal object was not so successful. It had been long known that thermal objects emit light. Hot metal glows red, and upon further heating, white (this is the underlying principle of the incandescent bulb). Since light was known to be waves of electromagnetism, physicists hoped to describe this emission via classical laws. This became known as the black body problem. Since the equipartition theorem worked so well in describing the vibrational modes of the thermal object itself, it was trivial to assume that it would perform equally well in describing the radiative emission of such objects. But a problem quickly arose when determining the vibrational modes of light. To simplify the problem (by limiting the vibrational modes) a lowest allowable wavelength was defined by placing the thermal object in a cavity. Any electromagnetic mode at equilibrium (i.e. any standing wave) could only exist if it used the walls of the cavities as nodes. Thus there were no waves/modes with a wavelength larger than twice the length (L) of the cavity.

Standing waves in a cavity

The first few allowable modes would therefore have wavelengths of 2L, L, 2L/3, L/2, etc. (each successive wavelength adding one node to the wave). However, while the wavelength could never exceed 2L, there was no such limit on decreasing the wavelength, and adding nodes to reduce the wavelength could proceed ad infinitum. Suddenly it became apparent that the short-wavelength modes completely dominated the distribution, since ever shorter wavelength modes could be crammed into the cavity. If each mode received an equal partition of energy, the short-wavelength modes would consume all the energy. This became clear when plotting the Rayleigh–Jeans law which, while correctly predicting the intensity of long-wavelength emissions, predicted infinite total energy as the intensity diverges to infinity for short wavelengths. This became known as the ultraviolet catastrophe.

The solution arrived in 1900 when Max Planck hypothesized that the frequency of light emitted by the black body depended on the frequency of the oscillator that emitted it, and the energy of these oscillators increased linearly with frequency (according to his constant h, where E = h). This was not an unsound proposal considering that macroscopic oscillators operate similarly: when studying five simple harmonic oscillators of equal amplitude but different frequency, the oscillator with the highest frequency possesses the highest energy (though this relationship is not linear like Planck's). By demanding that high-frequency light must be emitted by an oscillator of equal frequency, and further requiring that this oscillator occupy higher energy than one of a lesser frequency, Planck avoided any catastrophe; giving an equal partition to high-frequency oscillators produced successively fewer oscillators and less emitted light. And as in the MaxwellBoltzmann distribution, the low-frequency, low-energy oscillators were suppressed by the onslaught of thermal jiggling from higher energy oscillators, which necessarily increased their energy and frequency. The most revolutionary aspect of Planck's treatment of the black body is that it inherently relies on an integer number of oscillators in thermal equilibrium with the electromagnetic field. These oscillators give their entire energy to the electromagnetic field, creating a quantum of light, as often as they are excited by the electromagnetic field, absorbing a quantum of light and beginning to oscillate at the corresponding frequency. Planck had intentionally created an atomic theory of the black body, but had unintentionally generated an atomic theory of light, where the black body never generates quanta of light at a given frequency with an energy less than h. 
However, once he realized that he had quantized the electromagnetic field, he dismissed particles of light as a limitation of his approximation, not a property of reality.
[edit]The photoelectric effect illuminated

Yet while Planck had solved the ultraviolet catastrophe by using atoms and a quantized electromagnetic field, most physicists immediately agreed that Planck's "light quanta" were unavoidable flaws in his model. A more complete derivation of black-body radiation would produce a fully continuous, fully wave-like electromagnetic field with no quantization. However, in 1905 Albert Einstein took Planck's black-body model in itself and saw in it a wonderful solution to another outstanding problem of the day: the photoelectric effect. Ever since the discovery of electrons eight years previously, electrons had been the thing to study in physics laboratories worldwide. Nikola Tesla discovered in 1901 that when a metal was illuminated by high-frequency light (e.g. ultraviolet light), electrons were ejected from the metal at high energy. This work was based on the previous knowledge that light incident upon metals produces a current, but Tesla was the first to describe it as a particle phenomenon. The following year, Philipp Lenard discovered that (within the range of the experimental parameters he was using) the energy of these ejected electrons did not depend on the intensity of the incoming light, but on its frequency. So if one shines a little low-frequency light upon a metal, a few low-energy electrons are ejected. If one now shines a very intense beam of low-frequency light upon the same metal, a whole slew of electrons are ejected; however, they possess the same low energy, there are merely more of them. In order to get high-energy electrons, one must illuminate the metal with high-frequency light. The more light there is, the more electrons are ejected. Like black-body radiation, this was at odds with a theory invoking continuous transfer of energy between radiation and matter. However, it can still be explained using a fully classical description of light, as long as matter is quantum-mechanical in nature.[5] If one used Planck's energy quanta, and demanded that electromagnetic radiation at a given frequency could only transfer energy to matter in integer multiples of an energy quantum hν, then the photoelectric effect could be explained very simply. Low-frequency light only ejects low-energy electrons because each electron is excited by the absorption of a single photon. Increasing the intensity of the low-frequency light (increasing the number of photons) only increases the number of excited electrons, not their energy, because the energy of each photon remains low. Only by increasing the frequency of the light, and thus increasing the energy of the photons, can one eject electrons with higher energy. Thus, using Planck's constant h to determine the energy of the photons based upon their frequency, the energy of ejected electrons should also increase linearly with frequency, the gradient of the line being Planck's constant. These results were not confirmed until 1915, when Robert Andrews Millikan, who had previously determined the charge of the electron, produced experimental results in perfect accord with Einstein's predictions.
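Millikan's confirmation amounts to fitting a straight line to ejected-electron energies versus light frequency and reading Planck's constant off the gradient. A minimal sketch with synthetic data (the work function value is an arbitrary illustrative choice, not Millikan's measurement):

```python
h_true = 6.626e-34      # Planck constant, J·s (used here to generate the data)
e = 1.602e-19           # joules per electron-volt
phi = 2.30 * e          # assumed work function of the metal (illustrative), J

# Synthetic "measurements": maximum electron energies at several frequencies,
# mimicking the procedure of plotting E_k against f.
freqs = [8e14, 9e14, 1.0e15, 1.1e15, 1.2e15]
energies = [h_true * f - phi for f in freqs]

# Least-squares slope of the E_k-versus-f line; the gradient is Planck's constant.
n = len(freqs)
fbar = sum(freqs) / n
ebar = sum(energies) / n
slope = sum((f - fbar) * (E - ebar) for f, E in zip(freqs, energies)) / \
        sum((f - fbar) ** 2 for f in freqs)
print(f"recovered h = {slope:.3e} J·s")   # → 6.626e-34
```

With noiseless data the fit recovers h exactly; with real measurements the scatter of the points around the line gives the experimental uncertainty on h.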
While the energy of ejected electrons reflected Planck's constant, the existence of photons was not explicitly proven until the discovery of the photon antibunching effect, of which a modern experiment can be performed in undergraduate-level labs.[6] This phenomenon could only be explained via photons, and not through any semi-classical theory (which could alternatively explain the photoelectric effect). When Einstein received his Nobel Prize in 1921, it was not for his more difficult and mathematically laborious special and general relativity, but for the simple, yet totally revolutionary, suggestion of quantized light. Einstein's "light quanta" would not be called photons until 1925, but even in 1905 they represented the quintessential example of wave–particle duality. Electromagnetic radiation propagates following linear wave equations, but can only be emitted or absorbed as discrete elements, thus acting as a wave and a particle simultaneously.
[edit]Developmental milestones
[edit]Huygens and Newton

The earliest comprehensive theory of light was advanced by Christiaan Huygens, who proposed a wave theory of light, and in particular demonstrated how waves might interfere to form a wavefront, propagating in a straight line. However, the theory had difficulties in other matters, and was soon overshadowed by Isaac Newton's corpuscular theory of light. That is, Newton proposed that light consisted of small particles, with which he could easily explain the phenomenon of reflection. With considerably more difficulty, he could also explain refraction through a lens, and the splitting of sunlight into a rainbow by a prism. Newton's particle viewpoint went essentially unchallenged for over a century.[7]
[edit]Young, Fresnel, and Maxwell

In the early 19th century, the double-slit experiments by Young and Fresnel provided evidence for Huygens' wave theories. The double-slit experiments showed that when light is sent through a grid, a characteristic interference pattern is observed, very similar to the pattern resulting from the interference of water waves; the wavelength of light can be computed from such patterns. The wave view did not immediately displace the ray and particle view, but began to dominate scientific thinking about light in the mid 19th century, since it could explain polarization phenomena that the alternatives could not.[8] In the late 19th century, James Clerk Maxwell explained light as the propagation of electromagnetic waves according to the Maxwell equations. These equations were verified by experiment by Heinrich Hertz in 1887, and the wave theory became widely accepted.
[edit]Planck's formula for black-body radiation

Main article: Planck's law

In 1901, Max Planck published an analysis that succeeded in reproducing the observed spectrum of light emitted by a glowing object. To accomplish this, Planck had to make an ad hoc mathematical assumption of quantized energy of the oscillators (atoms of the black body) that emit radiation. It was Einstein who later proposed that it is the electromagnetic radiation itself that is quantized, and not the energy of radiating atoms.
[edit]Einstein's explanation of the photoelectric effect

Main article: Photoelectric effect

The photoelectric effect. Incoming photons on the left strike a metal plate (bottom), and eject electrons, depicted as flying off to the right.

In 1905, Albert Einstein provided an explanation of the photoelectric effect, a hitherto troubling experiment that the wave theory of light seemed incapable of explaining. He did so by postulating the existence of photons, quanta of light energy with particulate qualities. In the photoelectric effect, it was observed that shining a light on certain metals would lead to an electric current in a circuit. Presumably, the light was knocking electrons out of the metal, causing current to flow. However, using the case of potassium as an example, it was also observed that while a dim blue light was enough to cause a current, even the strongest, brightest red light available with the technology of the time caused no current at all. According to the classical theory of light and matter, the strength or amplitude of a light wave was in proportion to its brightness: a bright light should have been easily strong enough to create a large current. Yet, oddly, this was not so. Einstein explained this conundrum by postulating that the electrons can receive energy from the electromagnetic field only in discrete portions (quanta that were called photons): an amount of energy E that was related to the frequency f of the light by

E = hf

where h is Planck's constant (6.626 × 10⁻³⁴ J·s). Only photons of a high enough frequency (above a certain threshold value) could knock an electron free. For example, photons of blue light had sufficient energy to free an electron from the metal, but photons of red light did not. More intense light above the threshold frequency could release more electrons, but no amount of light (using technology available at the time) below the threshold frequency could release an electron. To "violate" this law would require extremely high-intensity lasers, which had not yet been invented. Intensity-dependent phenomena have now been studied in detail with such lasers.[9] Einstein was awarded the Nobel Prize in Physics in 1921 for his discovery of the law of the photoelectric effect.
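The threshold behaviour can be illustrated numerically. The potassium work function of about 2.3 eV used below is an approximate textbook value assumed for illustration:

```python
h = 6.626e-34   # Planck constant, J·s
c = 2.998e8     # speed of light, m/s
eV = 1.602e-19  # joules per electron-volt
phi_K = 2.3     # work function of potassium, eV (approximate textbook value)

def photon_energy_eV(wavelength_m):
    """Photon energy E = hf = hc/λ, converted to electron-volts."""
    return h * c / wavelength_m / eV

for name, lam in [("blue, 450 nm", 450e-9), ("red, 700 nm", 700e-9)]:
    E = photon_energy_eV(lam)
    verdict = "ejects electrons" if E > phi_K else "no emission, however bright"
    print(f"{name}: E = {E:.2f} eV -> {verdict}")
```

A blue photon carries more energy than the work function while a red photon carries less, which is why no intensity of red light produced a current.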
[edit]De Broglie's wavelength

Main article: Matter wave

In 1924, Louis-Victor de Broglie formulated the de Broglie hypothesis, claiming that all matter,[10][11] not just light, has a wave-like nature; he related wavelength (denoted as λ) and momentum (denoted as p):

λ = h/p

This is a generalization of Einstein's equation above, since the momentum of a photon is given by p = E/c and the wavelength (in a vacuum) by λ = c/f, where c is the speed of light in vacuum.

De Broglie's formula was confirmed three years later for electrons (which differ from photons in having a rest mass) with the observation of electron diffraction in two independent experiments. At the University of Aberdeen, George Paget Thomson passed a beam of electrons through a thin metal film and observed the predicted interference patterns. At Bell Labs Clinton Joseph Davisson and Lester Halbert Germer guided their beam through a crystalline grid. De Broglie was awarded the Nobel Prize for Physics in 1929 for his hypothesis. Thomson and Davisson shared the Nobel Prize for Physics in 1937 for their experimental work.
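For example, the electrons in the Davisson–Germer experiment had a kinetic energy of about 54 eV; a short sketch (rounded constants) of the corresponding de Broglie wavelength:

```python
import math

h = 6.626e-34     # Planck constant, J·s
m_e = 9.109e-31   # electron rest mass, kg
e = 1.602e-19     # elementary charge, C

def de_broglie_electron(E_eV):
    """λ = h / p with p = sqrt(2 m E), non-relativistic (fine at these energies)."""
    p = math.sqrt(2 * m_e * E_eV * e)
    return h / p

# 54 eV gives a wavelength comparable to the atomic spacing in a nickel
# crystal, which is why diffraction fringes appear at all.
lam = de_broglie_electron(54)
print(f"λ ≈ {lam * 1e9:.3f} nm")   # about 0.167 nm
```

The resulting wavelength is on the order of interatomic distances in a solid, so the crystal itself acts as the diffraction grating.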
[edit]Heisenberg's uncertainty principle

Main article: Heisenberg uncertainty principle

In his work on formulating quantum mechanics, Werner Heisenberg postulated his uncertainty principle, which states:

σx σp ≥ ħ/2

where

σ here indicates standard deviation, a measure of spread or uncertainty;
x and p are a particle's position and linear momentum respectively;
ħ is the reduced Planck's constant (Planck's constant divided by 2π).

Heisenberg originally explained this as a consequence of the process of measuring: measuring position accurately would disturb momentum and vice versa, offering an example (the "gamma-ray microscope") that depended crucially on the de Broglie hypothesis. It is now thought, however, that this only partly explains the phenomenon, and that the uncertainty also exists in the particle itself, even before the measurement is made. In fact, the modern explanation of the uncertainty principle, extending the Copenhagen interpretation first put forward by Bohr and Heisenberg, depends even more centrally on the wave nature of a particle: just as it is nonsensical to discuss the precise location of a wave on a string, particles do not have perfectly precise positions; likewise, just as it is nonsensical to discuss the wavelength of a "pulse" wave traveling down a string, particles do not have perfectly precise momenta (which correspond to the inverse of wavelength). Moreover, when position is relatively well defined, the wave is pulse-like and has a very ill-defined wavelength (and thus momentum). And conversely, when momentum (and thus wavelength) is relatively well defined, the wave looks long and sinusoidal, and therefore it has a very ill-defined position. De Broglie himself had proposed a pilot-wave construct to explain the observed wave–particle duality. In this view, each particle has a well-defined position and momentum, but is guided by a wave function derived from Schrödinger's equation. The pilot-wave theory was initially rejected because it generated non-local effects when applied to systems involving more than one particle. Non-locality, however, soon became established as an integral feature of quantum theory (see EPR paradox), and David Bohm extended de Broglie's model to explicitly include it. In the resulting representation, also called the de Broglie–Bohm theory or Bohmian mechanics,[12] the wave–particle duality is not a property of matter itself, but an appearance generated by the particle's motion subject to a guiding equation or quantum potential.
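A Gaussian wave packet makes this trade-off concrete: it saturates the bound, with σx·σp = ħ/2 exactly. The numerical sketch below checks this by discretizing the packet; the width s is an arbitrary illustrative choice:

```python
import math

hbar = 1.054e-34  # reduced Planck constant, J·s

# Gaussian wave packet psi(x) ~ exp(-x^2 / (4 s^2)); analytically it saturates
# the Heisenberg bound: sigma_x * sigma_p = hbar / 2 exactly.
s = 1e-10  # position spread, m (atomic scale, illustrative)

# Discretize on [-6s, 6s]; compute sigma_x from |psi|^2 and sigma_p from
# <p^2> = hbar^2 * integral of |psi'|^2 (since <x> = <p> = 0 here).
N, L = 4000, 12 * s
dx = L / N
xs = [-L / 2 + i * dx for i in range(N + 1)]
psi = [math.exp(-x * x / (4 * s * s)) for x in xs]
norm = sum(p * p for p in psi) * dx
x2 = sum(x * x * p * p for x, p in zip(xs, psi)) * dx / norm
dpsi = [(psi[i + 1] - psi[i - 1]) / (2 * dx) for i in range(1, N)]
p2 = hbar ** 2 * sum(d * d for d in dpsi) * dx / norm

product = math.sqrt(x2) * math.sqrt(p2)
print(f"sigma_x * sigma_p = {product:.3e}  (hbar/2 = {hbar / 2:.3e})")
```

Squeezing s makes σx smaller and σp proportionally larger, exactly the pulse-versus-sinusoid trade-off described in the text.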
[edit]Wave behavior of large objects

Since the demonstrations of wave-like properties in photons and electrons, similar experiments have been conducted with neutrons and protons. Among the most famous experiments are those of Estermann and Otto Stern in 1929.[13] Authors of similar recent experiments with atoms and molecules, described below, claim that these larger particles also act like waves. A dramatic series of experiments emphasizing the action of gravity in relation to wave–particle duality were conducted in the 1970s using the neutron interferometer.[14] Neutrons, one of the components of the atomic nucleus, provide much of the mass of a nucleus and thus of ordinary matter. In the neutron interferometer, they act as quantum-mechanical waves directly subject to the force of gravity. While the results were not surprising since gravity was known to act on everything, including light (see tests of general relativity and the Pound–Rebka falling-photon experiment), the self-interference of the quantum-mechanical wave of a massive fermion in a gravitational field had never been experimentally confirmed before.

In 1999, the diffraction of C60 fullerenes by researchers from the University of Vienna was reported.[15] Fullerenes are comparatively large and massive objects, having an atomic mass of about 720 u. The de Broglie wavelength is 2.5 pm, whereas the diameter of the molecule is about 1 nm, about 400 times larger. As of 2005, this is the largest object for which quantum-mechanical wave-like properties have been directly observed in far-field diffraction. In 2003 the Vienna group also demonstrated the wave nature of tetraphenylporphyrin,[16] a flat biodye with an extension of about 2 nm and a mass of 614 u. For this demonstration they employed a near-field Talbot–Lau interferometer.[17][18] In the same interferometer they also found interference fringes for C60F48, a fluorinated buckyball with a mass of about 1600 u, composed of 108 atoms.[16] Large molecules are already so complex that they give experimental access to some aspects of the quantum–classical interface, i.e. to certain decoherence mechanisms.[19][20] Whether objects heavier than the Planck mass (about the weight of a large bacterium) have a de Broglie wavelength is theoretically unclear and experimentally unreachable; above the Planck mass a particle's Compton wavelength would be smaller than the Planck length and its own Schwarzschild radius, a scale at which current theories of physics may break down or need to be replaced by more general ones.[21] Recently Couder, Fort et al. showed that macroscopic oil droplets on a vibrating surface can be used as a model of wave–particle duality: a localized droplet creates periodic waves around itself, and its interaction with them leads to quantum-like phenomena: interference in the double-slit experiment,[22] unpredictable tunneling[23] (depending in a complicated way on the practically hidden state of the field) and orbit quantization[24] (the particle has to 'find a resonance' with the field perturbations it creates; after one orbit, its internal phase has to return to the initial state).
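The quoted 2.5 pm wavelength follows directly from λ = h/(mv); the beam velocity of roughly 220 m/s below is an approximate figure for such fullerene experiments, assumed for illustration:

```python
h = 6.626e-34          # Planck constant, J·s
u = 1.661e-27          # atomic mass unit, kg
m_c60 = 720 * u        # mass of a C60 fullerene (60 carbon atoms of 12 u)
v = 220.0              # typical beam velocity, m/s (approximate, illustrative)

lam = h / (m_c60 * v)  # de Broglie wavelength λ = h / (m v)
diameter = 1e-9        # molecular diameter, about 1 nm
print(f"λ ≈ {lam * 1e12:.1f} pm, about {diameter / lam:.0f}× smaller than the molecule")
```

The wavelength comes out hundreds of times smaller than the molecule itself, which is why observing fullerene interference required such carefully designed gratings.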
[edit]Treatment in modern quantum mechanics

Wave–particle duality is deeply embedded into the foundations of quantum mechanics, so well that modern practitioners rarely discuss it as such. In the formalism of the theory, all the information about a particle is encoded in its wave function, a complex-valued function roughly analogous to the amplitude of a wave at each point in space. This function evolves according to a differential equation (generically called the Schrödinger equation), and this equation gives rise to wave-like phenomena such as interference and diffraction. The particle-like behavior is most evident in phenomena associated with measurement in quantum mechanics. Upon measuring the location of the particle, the wave function will randomly "collapse", or rather "decohere", to a sharply peaked function at some location, with the likelihood of any particular location equal to the squared amplitude of the wave function there. The measurement will return a well-defined position (subject to uncertainty), a property traditionally associated with particles. Although this picture is somewhat simplified (to the non-relativistic case), it is adequate to capture the essence of current thinking on the phenomena historically called "wave–particle duality". (See also: Particle in a box, Mathematical formulation of quantum mechanics.)
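The Born rule described above can be sketched numerically: treat the normalized squared amplitude of a toy wavefunction as the distribution from which a position measurement samples. The two-lump wavefunction here is purely illustrative:

```python
import math
import random

# Toy real-valued wavefunction: superposition of two Gaussian lumps (illustrative).
def psi(x):
    return math.exp(-(x - 1.0) ** 2) + math.exp(-(x + 1.0) ** 2)

# Born rule: the probability density is the squared amplitude |psi|^2, normalized.
xs = [i * 0.01 - 5.0 for i in range(1001)]
weights = [psi(x) ** 2 for x in xs]
total = sum(weights)
probs = [w / total for w in weights]

# Each position measurement "collapses" to one point, with Born-rule likelihood.
random.seed(0)
sample = random.choices(xs, weights=probs, k=10000)
left = sum(1 for x in sample if x < 0) / len(sample)
print(f"fraction of measurements landing in the left lump ≈ {left:.2f}")  # near 0.5 by symmetry
```

Individual outcomes are random, but the histogram of many outcomes reproduces the wave-like density, which is exactly how single-electron diffraction spots build up an interference pattern.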

A wave-particle with a well-defined wavelength λ has definite momentum p, wave vector k and energy E; the uncertainty in momentum Δp or energy ΔE is small. Because of the periodicity of the wave, the probability distribution does not specify a single place where the particle is: all of the amplitude (antinode) locations are equally likely regions to find the particle at a time t and position x, so the uncertainties in position Δx and time Δt are both large. This fits with the uncertainty principle.

In the inverse of the previous situation, if λ is unknown, so are p, k and E, but the wave is sharper in x and t; this time Δx or Δt is small while Δp or ΔE is large.

Illustration of wave–particle duality in one dimension for one particle: the probability of finding the particle in space is distributed as a (complex-valued) waveform through space, mathematically described by the particle's wavefunction ψ. The colour opacity of the particles corresponds to the probability density of finding the particle at the points on the x-axis (the waves correspond to ψ, the particle locations to ψ*ψ). The wave-particle model is consistent with de Broglie's hypothesis and Heisenberg's uncertainty principle. Note that uncertainty in wavelength λ is not the same as uncertainty in position x.

[edit]Alternative views

[edit]Particle-only view

The pilot wave model, originally developed by Louis de Broglie and further developed by David Bohm into the hidden-variable theory, proposes that there is no duality; rather, particles are guided, in a deterministic fashion, by a pilot wave (or its "quantum potential"), which directs them to areas of constructive interference in preference to areas of destructive interference. This idea is held by a significant minority within the physics community.[25] At least one physicist considers the wave-particle duality a misnomer. As L. Ballentine, Quantum Mechanics: A Modern Development, p. 4, explains: "When first discovered, particle diffraction was a source of great puzzlement. Are "particles" really "waves"? In the early experiments, the diffraction patterns were detected holistically by means of a photographic plate, which could not detect individual particles. As a result, the notion grew that particle and wave properties were mutually incompatible, or complementary, in the sense that different measurement apparatuses would be required to observe them. That idea, however, was only an unfortunate generalization from a technological limitation. Today it is possible to detect the arrival of individual electrons, and to see the diffraction pattern emerge as a statistical pattern made up of many small spots (Tonomura et al., 1989). Evidently, quantum particles are indeed particles, but whose behaviour is very different from what classical physics would have us expect." Afshar's experiment[26] (2007) has demonstrated that it is possible to simultaneously observe both wave and particle properties of photons. Biddulph (2010)[27] has explained this by applying techniques from deterministic chaos to non-chaotic systems, in particular a computable version of Palmer's Universal Invariant Set proposition[28] (2009), which allows the apparent weirdness of quantum phenomena to be explained as artefacts of the quantum apparatus, not a fundamental property of nature. Waves are shown to be the only means of describing motion, since smooth motion on a continuum is impossible. If a particle visits every point on its trajectory then the motion is an algorithm for each point. Turing[29] has shown that almost all numbers are non-computable, which means that there is no possible algorithm, so the set of points on a trajectory is sparse. This implies that motion is either jerky or wave-like. By removing the need to load the particle with the properties of space and time, a fully deterministic, local and causal description of quantum phenomena is possible by use of a simple dynamical operator on a Universal Invariant Set.

[edit]Wave-only view

At least one scientist proposes that the duality can be replaced by a "wave-only" view. Carver Mead's Collective Electrodynamics: Quantum Foundations of Electromagnetism (2000) analyzes the behavior of electrons and photons purely in terms of electron wave functions, and attributes the apparent particle-like behavior to quantization effects and eigenstates. According to reviewer David Haddon:[30] "Mead has cut the Gordian knot of quantum complementarity. He claims that atoms, with their neutrons, protons, and electrons, are not particles at all but pure waves of matter." Mead cites as the gross evidence of the exclusively wave nature of both light and matter the discovery between 1933 and 1996 of ten examples of pure wave phenomena, including the ubiquitous laser of CD players, the self-propagating electrical currents of superconductors, and the Bose–Einstein condensate of atoms. Albert Einstein, who, in his search for a Unified Field Theory, did not accept wave–particle duality, wrote:[31] "This double nature of radiation (and of material corpuscles)...has been interpreted by quantum-mechanics in an ingenious and amazingly successful fashion. This interpretation...appears to me as only a temporary way out..." And theoretical physicist Mendel Sachs, who claimed to have completed Einstein's unified field theory, writes:[32] "Instead, one has a single, holistic continuum, wherein what were formerly called discrete, separable particles of matter are instead the infinite number of distinguishable, though correlated, manifestations of this continuum, that in principle is the universe. Hence, wave–particle dualism, which is foundational for the quantum theory, is replaced by wave (continuous field) monism." The many-worlds interpretation (MWI) is sometimes presented as a waves-only theory, including by its originator, Hugh Everett, who referred to MWI as "the wave interpretation".[33]

The Three Wave Hypothesis of R. Horodecki relates the particle to the wave.[34][35] The hypothesis implies that a massive particle is an intrinsically spatially as well as temporally extended wave phenomenon, governed by a nonlinear law. According to M. I. Sanduk this hypothesis is related to a hypothetical bevel-gear model.[36] Both concepts of particle and wave may then be attributed to an observation problem of the gear.[37]
[edit]Neither-wave-nor-particle view

It has been argued that there are never exact particles or waves, but only some compromise or intermediate between them. One consideration is that zero-dimensional mathematical points cannot be observed. Another is that the formal representation of such points, the Dirac delta function, is unphysical, because it cannot be normalized. Parallel arguments apply to pure wave states. "Such position states are idealised wavefunctions [..] Whereas the momentum states are infinitely spread out, the position states are infinitely concentrated. Neither is normalisable [..]"[38]

[edit]Relational approach to wave–particle duality

Relational quantum mechanics has been developed as an approach that regards the detection event as establishing a relationship between the quantized field and the detector. The inherent ambiguity associated with applying Heisenberg's uncertainty principle, and thus wave–particle duality, is subsequently avoided.[39]

[edit]Applications

Although it is difficult to draw a line separating wave–particle duality from the rest of quantum mechanics, it is nevertheless possible to list some applications of this basic idea.

Wave–particle duality is exploited in electron microscopy, where the small wavelengths associated with the electron can be used to view objects much smaller than what is visible using visible light.

Similarly, neutron diffraction uses neutrons with a wavelength of about 0.1 nm, the typical spacing of atoms in a solid, to determine the structure of solids.
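The advantage over visible light can be quantified; a sketch for a hypothetical 100 kV instrument, using rounded constants and the standard first-order relativistic correction to the electron momentum:

```python
import math

h = 6.626e-34     # Planck constant, J·s
m_e = 9.109e-31   # electron rest mass, kg
e = 1.602e-19     # elementary charge, C
c = 2.998e8       # speed of light, m/s

def tem_wavelength(V):
    """Electron wavelength at accelerating voltage V, with the first-order
    relativistic correction p = sqrt(2 m eV (1 + eV / 2 m c^2))."""
    energy = e * V
    p = math.sqrt(2 * m_e * energy * (1 + energy / (2 * m_e * c ** 2)))
    return h / p

lam = tem_wavelength(100e3)   # a typical 100 kV instrument (illustrative)
print(f"electron λ ≈ {lam * 1e12:.1f} pm")
print(f"green light λ ≈ 550 nm, roughly 10^5 times longer")
```

A few picometres against hundreds of nanometres for visible light explains the enormous resolution gain, even though lens aberrations keep real microscopes well above the wavelength limit.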

[edit]See also

Arago spot
Afshar experiment
Basic concepts of quantum mechanics
Electron wave-packet interference
Hanbury Brown and Twiss effect
Photon polarization
Scattering theory
Wavelet
Wheeler's delayed choice experiment

[edit]Notes and references


1. ^ Walter Greiner (2001). Quantum Mechanics: An Introduction. Springer. ISBN 3540674586.
2. ^ R. Eisberg and R. Resnick (1985). Quantum Physics of Atoms, Molecules, Solids, Nuclei, and Particles (2nd ed.). John Wiley & Sons. pp. 59–60. ISBN 047187373X. "For both large and small wavelengths, both matter and radiation have both particle and wave aspects.... But the wave aspects of their motion become more difficult to observe as their wavelengths become shorter.... For ordinary macroscopic particles the mass is so large that the momentum is always sufficiently large to make the de Broglie wavelength small enough to be beyond the range of experimental detection, and classical mechanics reigns supreme."
3. ^ Nathaniel Page Stites, M.A./M.S. "Light I: Particle or Wave?", Visionlearning Vol. PHY-1 (3), 2005. http://www.visionlearning.com/library/module_viewer.php?mid=132
4. ^ Thomas Young: The Double Slit Experiment
5. ^ Lamb, Willis E.; Scully, Marlan O. (1968). "The photoelectric effect without photons".
6. ^ http://www.hep.princeton.edu/~mcdonald/examples/QM/thorn_ajp_72_1210_04.pdf
7. ^ "light", The Columbia Encyclopedia, Sixth Edition. 2001–05.
8. ^ Buchwald, Jed (1989). The Rise of the Wave Theory of Light: Optical Theory and Experiment in the Early Nineteenth Century. Chicago: University of Chicago Press. ISBN 0226078868. OCLC 59210058.
9. ^ Zhang, Q (1996). "Intensity dependence of the photoelectric effect induced by a circularly polarized laser beam". Physics Letters A 216 (1–5): 125. Bibcode 1996PhLA..216..125Z. doi:10.1016/0375-9601(96)00259-9.

10. ^ Donald H. Menzel, "Fundamental Formulas of Physics", volume 1, page 153; gives the de Broglie wavelengths for composite particles such as protons and neutrons.
11. ^ Brian Greene, The Elegant Universe, page 104: "all matter has a wave-like character".
12. ^ Bohmian Mechanics, Stanford Encyclopedia of Philosophy.
13. ^ Estermann, I.; Stern, O. (1930). "Beugung von Molekularstrahlen". Zeitschrift für Physik 61 (1–2): 95–125. Bibcode 1930ZPhy...61...95E. doi:10.1007/BF01340293.
14. ^ R. Colella, A. W. Overhauser and S. A. Werner, "Observation of Gravitationally Induced Quantum Interference", Phys. Rev. Lett. 34, 1472–1474 (1975).
15. ^ Arndt, Markus; O. Nairz, J. Voss-Andreae, C. Keller, G. van der Zouw, A. Zeilinger (14 October 1999). "Wave–particle duality of C60". Nature 401 (6754): 680–682. Bibcode 1999Natur.401..680A. doi:10.1038/44348. PMID 18494170.
16. ^ a b Hackermüller, Lucia; Stefan Uttenthaler, Klaus Hornberger, Elisabeth Reiger, Björn Brezger, Anton Zeilinger and Markus Arndt (2003). "The wave nature of biomolecules and fluorofullerenes". Phys. Rev. Lett. 91 (9): 090408. arXiv:quant-ph/0309016. Bibcode 2003PhRvL..91i0408H. doi:10.1103/PhysRevLett.91.090408. PMID 14525169.
17. ^ Clauser, John F.; S. Li (1994). "Talbot–von Lau interferometry with cold slow potassium atoms". Phys. Rev. A 49 (4): R2213–17. Bibcode 1994PhRvA..49.2213C. doi:10.1103/PhysRevA.49.R2213.
18. ^ Brezger, Björn; Lucia Hackermüller, Stefan Uttenthaler, Julia Petschinka, Markus Arndt and Anton Zeilinger (2002). "Matter-wave interferometer for large molecules". Phys. Rev. Lett. 88 (10): 100404. arXiv:quant-ph/0202158. Bibcode 2002PhRvL..88j0404B. doi:10.1103/PhysRevLett.88.100404. PMID 11909334.
19. ^ Hornberger, Klaus; Stefan Uttenthaler, Björn Brezger, Lucia Hackermüller, Markus Arndt and Anton Zeilinger (2003). "Observation of Collisional Decoherence in Interferometry". Phys. Rev. Lett. 90 (16): 160401. arXiv:quant-ph/0303093. Bibcode 2003PhRvL..90p0401H. doi:10.1103/PhysRevLett.90.160401. PMID 12731960.
20. ^ Hackermüller, Lucia; Klaus Hornberger, Björn Brezger, Anton Zeilinger and Markus Arndt (2004). "Decoherence of matter waves by thermal emission of radiation". Nature 427 (6976): 711–714. arXiv:quant-ph/0402146. Bibcode 2004Natur.427..711H. doi:10.1038/nature02276. PMID 14973478.
21. ^ Peter Gabriel Bergmann, The Riddle of Gravitation, Courier Dover Publications, 1993. ISBN 0486273784.
22. ^ Y. Couder, E. Fort, "Single-Particle Diffraction and Interference at a Macroscopic Scale", PRL 97, 154101 (2006).
23. ^ A. Eddi, E. Fort, F. Moisy, Y. Couder, "Unpredictable Tunneling of a Classical Wave-Particle Association", PRL 102, 240401 (2009).
24. ^ E. Fort, A. Eddi, A. Boudaoud, J. Moukhtar, Y. Couder, "Path-memory induced quantization of classical orbits", PNAS October 12, 2010, vol. 107, no. 41, 17515–17520.
25. ^ (Buchanan, pp. 29–31)
26. ^ Afshar, S. S. et al.: "Paradox in Wave Particle Duality". Found. Phys. 37, 295 (2007). arXiv:quant-ph/0702188.
27. ^ Biddulph, A. F. The Quantum Illusion.
28. ^ Palmer, T. N.: "The Invariant Set Postulate". Proc. Roy. Soc. Lond. A 465: 3187–3207, 2009. arXiv:0812.1148.
29. ^ Turing, A.: Proc. Lond. Math. Soc. (2) 42, 230 (1937).
30. ^ David Haddon. "Recovering Rational Science". Touchstone. Retrieved 2007-09-12.
31. ^ Paul Arthur Schilpp, ed., Albert Einstein: Philosopher-Scientist, Open Court (1949), ISBN 0-87548-131-7, p. 51.
32. ^ Mendel Sachs, Quantum Mechanics and Gravity, Springer (2004), ISBN 3-540-00800-4, p. 11.
33. ^ See section VI(e) of Everett's thesis: The Theory of the Universal Wave Function, in Bryce Seligman DeWitt, R. Neill Graham, eds., The Many-Worlds Interpretation of Quantum Mechanics, Princeton Series in Physics, Princeton University Press (1973), ISBN 0-691-08131-X, pp. 31–40.
34. ^ Horodecki, R. (1981). "De Broglie wave and its dual wave". Phys. Lett. A 87 (3): 95–97. Bibcode 1981PhLA...87...95H. doi:10.1016/0375-9601(81)90571-5.
35. ^ Horodecki, R. (1983). "Superluminal singular dual wave". Lett. Nuovo Cimento 38: 509–511.
36. ^ Sanduk, M. I. (2007). "Does the Three Wave Hypothesis Imply Hidden Structure?". Apeiron 14 (2): 113–125. Bibcode 2007Apei...14..113S.
37. ^ Sanduk, M. I. (2009). "Three Wave Hypothesis, Gear Model and the Rest Mass". arXiv:0904.0790. Bibcode 2009arXiv0904.0790S.
38. ^ (R. Penrose, The Road to Reality, p. 521, §21.11)
39. ^ http://www.quantum-relativity.org/Quantum-Relativity.pdf. See Q. Zheng and T. Kobayashi, "Quantum Optics as a Relativistic Theory of Light", Physics Essays 9 (1996) 447. Annual Report, Department of Physics, School of Science, University of Tokyo (1992) 240.

[edit]External links

H. Nikolic. "Quantum mechanics: Myths and facts". arXiv:quant-ph/0609163.
Young & Geller. "College Physics".
B. Crowell. "Light as a Particle" (Web page). Retrieved December 10, 2006.
E. H. Carlson, Wave–Particle Duality: Light, on Project PHYSNET.
R. Nave. "Wave–Particle Duality" (Web page). HyperPhysics. Georgia State University, Department of Physics and Astronomy. Retrieved December 12, 2005.
Markus Arndt (2006). "Interferometry and decoherence experiments with large molecules" (Web page). University of Vienna. Retrieved May 6, 2006.

Wave-particle duality (Welle-Teilchen-Dualismus)

Wave-particle duality refers to a classical explanatory approach to quantum mechanics which holds that objects of the quantum world can in some cases be described only as waves, in others only as particles. With the interpretation of statistical probabilities within the Copenhagen interpretation (1927), the term acquired a somewhat different meaning: every kind of radiation has both wave and particle character, but depending on the experiment performed, only one or the other becomes apparent.
Contents

1 Historical beginnings
2 Einstein and photons (light quanta)
3 De Broglie and the wave character of particles
4 Resolution of wave-particle duality in quantum mechanics
4.1 Quantum mechanics and statistical physics
4.2 Macroscopic view

Historical beginnings

(Figure caption: Refraction occurs toward the normal because the light particles have a lower propagation speed in the optically denser medium.)

To the question of whether light consists of particles or waves, different answers have been given over the years:

Huygens (1629-1695) is regarded as the founder of wave optics, although he could not prove his assumptions experimentally. His Huygens principle is still applied unchanged today.

Newton, also in the 17th century, developed geometrical optics under the assumption that light consists of particles (corpuscular theory). In the dispute with Huygens over whether his wave theory (wave optics) or the corpuscular theory was correct, Newton prevailed thanks to his greater authority.

In 1802, Young showed experimentally with the double-slit experiment that light can be extinguished by interference. This was interpreted as clear evidence of its wave character. Polarizability, together with the prediction and detection of the Poisson spot and the formulation of Maxwell's equations at the end of the 19th century, ensured that the wave nature of light became generally accepted.

The discovery and investigation of the photoelectric effect in the same period showed that this effect certainly cannot be explained by light waves. Einstein's explanation of it in 1905 rested on the assumption of light particles and, after Planck's discovery of his quantum of action in 1900, was the second starting point of quantum mechanics.

Einstein and photons (light quanta)


In 1905, to explain the photoelectric effect, Albert Einstein in turn postulated that light should consist of light quanta (photons). In doing so he drew on Planck's 1900 work on black-body radiation, in which Planck had first assumed a quantization of the energy values of the harmonic oscillator, initially on purely mathematical grounds. A photon represents a single, i.e. discrete, portion of energy E (quantization), so that light can absorb or emit energy only in integer multiples of this amount. From Planck's investigations, the energy E of a photon came out as

E = h·f = h·c/λ

where h is Planck's constant, f the frequency, c the speed of light, and λ the wavelength of the photon. This relation also holds for mechanical waves, such as lattice vibrations in a solid.
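As a quick numerical illustration (added here, not part of the original article), the photon-energy relation can be evaluated directly; the 550 nm wavelength below is just an example value for green light.

```python
# Photon energy E = h*f = h*c/lambda.
h = 6.62607015e-34   # Planck constant, J*s
c = 2.99792458e8     # speed of light, m/s

def photon_energy(wavelength_m):
    """Energy in joules of a photon with the given wavelength (in metres)."""
    return h * c / wavelength_m

E = photon_energy(550e-9)           # green light, ~550 nm
print(E)                            # ~3.6e-19 J
print(E / 1.602176634e-19)          # the same energy in eV, ~2.25 eV
```

The tiny size of one such energy portion is why the granularity of light goes unnoticed in everyday optics.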

De Broglie and the wave character of particles


In 1924 Louis de Broglie postulated that particles with mass also possess a wave character. For a particle with momentum p he gave a wavelength of

λ = h/p

With de Broglie's formula, diffraction behavior of particles can be predicted; this was confirmed experimentally in 1927 by the diffraction of an electron beam at a nickel crystal by Davisson and Germer, and finally by Claus Jönsson's electron double-slit experiment in 1961. The wave character of matter has by now been demonstrated even for far larger particles, for example complex molecules such as fullerenes.
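A minimal sketch (an illustration added here, not from the article) of de Broglie's formula in the regime of the Davisson-Germer experiment; the non-relativistic momentum is assumed, which is adequate at these low voltages.

```python
import math

h = 6.62607015e-34      # Planck constant, J*s
m_e = 9.1093837015e-31  # electron rest mass, kg
e = 1.602176634e-19     # elementary charge, C

def de_broglie_wavelength(voltage):
    """Non-relativistic de Broglie wavelength (m) of an electron accelerated
    from rest through `voltage` volts: lambda = h / sqrt(2 * m_e * e * U)."""
    p = math.sqrt(2.0 * m_e * e * voltage)
    return h / p

# At 54 V (the Davisson-Germer setting) the wavelength is ~1.67e-10 m,
# comparable to the lattice spacing of nickel -- hence the observed diffraction.
print(de_broglie_wavelength(54.0))
```

Quadrupling the voltage halves the wavelength, since λ scales as 1/√U.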

Resolution of wave-particle duality in quantum mechanics


In quantum mechanics, every particle is described by a wavefunction. The wavefunction of a particle is complex-valued and therefore not itself a measurable quantity. Only its absolute square can be interpreted as the particle's probability of presence (more precisely: as the volume density of the probability of presence) and determined in experiment. The time evolution of the particle's wavefunction, and thus the change of its probability of presence, is described by the Schrödinger equation.
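The statement that only |ψ|² is observable can be sketched numerically (an illustration added here, using an arbitrarily chosen one-dimensional Gaussian wave packet): the probability density must integrate to 1.

```python
import math

sigma = 1.0  # width parameter of the packet (arbitrary units)

def psi(x):
    """Normalized 1D Gaussian wavefunction (real-valued for simplicity);
    |psi(x)|^2 is the position probability density."""
    norm = (1.0 / (2.0 * math.pi * sigma**2)) ** 0.25
    return norm * math.exp(-x**2 / (4.0 * sigma**2))

# Riemann sum of |psi(x)|^2 over a grid wide enough to capture the packet:
# the total probability of finding the particle somewhere must be 1.
dx = 0.01
total = sum(abs(psi(-10.0 + i * dx))**2 * dx for i in range(int(20.0 / dx)))
print(total)  # ~1.0
```

Any overall complex phase of ψ drops out of |ψ|², which is why the phase itself is not directly measurable.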

Quantum mechanics and statistical physics


In the microscopic domain, wave-particle duality serves as a heuristic explanation for some physical phenomena. According to de Broglie, the wavelength of a particle depends on its velocity and hence also on its temperature. At low temperatures, the de Broglie wavelengths of atoms can become larger than the atomic diameter and overlap, which partly explains the effects of superfluidity in helium-3 and helium-4. For a complete and quantitative treatment of these topics, however, quantum mechanics must be used.
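This temperature dependence can be made concrete with the thermal de Broglie wavelength (an illustration added here; the helium-4 atomic mass and the ~2.17 K lambda-point temperature are standard values):

```python
import math

h = 6.62607015e-34     # Planck constant, J*s
k_B = 1.380649e-23     # Boltzmann constant, J/K
m_he4 = 6.6446573e-27  # mass of a helium-4 atom, kg

def thermal_wavelength(mass, temperature):
    """Thermal de Broglie wavelength: lambda = h / sqrt(2*pi*m*k_B*T)."""
    return h / math.sqrt(2.0 * math.pi * mass * k_B * temperature)

# Near the lambda point of helium-4 (~2.17 K) the thermal wavelength reaches
# ~6e-10 m, larger than the ~1e-10 m atomic diameter, so neighbouring
# atomic wave packets overlap.
print(thermal_wavelength(m_he4, 2.17))
print(thermal_wavelength(m_he4, 300.0))  # much smaller at room temperature
```

Since λ scales as 1/√T, cooling by two orders of magnitude stretches the wavelength roughly tenfold.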

Macroscopic view


The wave character of particles does not show up in macroscopic objects, for two fundamental reasons: Even at slow speeds, macroscopic objects have, because of their large mass, a wavelength considerably smaller than the dimensions of the object itself. In that case the object as a whole can no longer be treated as a single quantum-mechanical object; its constituents must instead be described separately.
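The scale argument can be put in numbers (an illustration added here; the 1 kg mass and the speeds are arbitrary example values):

```python
h = 6.62607015e-34  # Planck constant, J*s

def de_broglie(mass_kg, speed_m_s):
    """De Broglie wavelength lambda = h / (m*v) for a non-relativistic object."""
    return h / (mass_kg * speed_m_s)

# A 1 kg object moving at 1 m/s: lambda ~ 6.6e-34 m, more than twenty
# orders of magnitude below atomic dimensions -- no wave effects observable.
print(de_broglie(1.0, 1.0))

# An electron at 10^6 m/s: lambda ~ 7e-10 m, comparable to atomic
# dimensions -- diffraction is readily observable.
print(de_broglie(9.1093837015e-31, 1.0e6))
```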

In macroscopic objects, thermodynamically irreversible processes run constantly, and photons (thermal radiation) are exchanged with the surroundings. Both lead to decoherence of the system, meaning that a state which may initially be capable of interference very quickly turns into one that is not, and which then behaves like a classical particle rather than like a wave.

See also: thermal wavelength
