
Friday, April 14, 2017

Flat Earth Theory: Does it have a scientific leg to stand on?

By: Alexandria Addesso

Although the Greek philosopher Aristotle explained in his writings in the 300s BCE that the Earth was spherical and not flat, most of the world did not come to agree on this rationale until the 1700s. Yet lately a segment of people who hold the belief that the world is flat has been growing in popularity. Given that scientific findings are constantly in flux, are there any grounds for this non-spherical belief?

The most common flat-Earth theory states that the Earth is a disc, with the Arctic Circle at its center and Antarctica around the rim in the form of a 150-foot-tall ice wall. This theory was popularized by Orlando Ferguson, a real estate developer, in 1893. While the Earth is held to be a disc, the sun and moon are believed to be spherical: the theory explains the Earth's day and night cycle by positing that the sun and moon are 32 miles (51 kilometers) wide and move in circles 3,000 miles (4,828 km) above the plane of the Earth, illuminating different portions of it in a 24-hour cycle. Yet this theory comes with no experimental evidence to back it up.



On the Flat Earth Society's website there is a page listing simple experiments, done by seven different people, that purportedly support the theory. Most have to do with the absence of a "bulge" when looking out over 30 miles of ocean with a telescope.

“IF the earth is a globe, and is 24,900 English statute miles in circumference, the surface of all standing water must have a certain degree of convexity--every part must be an arc of a circle,” said Tom Bishop, a flat Earth believer cited on the Flat Earth Society website. “From the summit of any such arc there will exist a curvature or declination of 8 inches in the first statute mile. In the second mile the fall will be 32 inches; in the third mile, 72 inches, or 6 feet, as shown in this chart. Ergo; looking at the opposite beach 30 miles away there should be a bulge of water over 600 feet tall blocking my view. There isn't.”
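Bishop's figures follow a common rule of thumb: the drop below a tangent line is roughly 8 inches times the square of the distance in statute miles. The short Python sketch below reproduces his numbers; note that the rule describes the drop from a sight line that grazes the surface at the observer, and it ignores observer height and atmospheric refraction, which is precisely where the "missing bulge" argument goes wrong in practice.

```python
# Rule of thumb quoted above: drop ≈ 8 inches × (distance in miles)².
# This approximates how far the surface falls below a tangent sight line;
# it does NOT account for observer height or atmospheric refraction.

def drop_inches(miles: float) -> float:
    """Approximate drop below a tangent line, in inches."""
    return 8.0 * miles ** 2

for d in (1, 2, 3, 30):
    inches = drop_inches(d)
    print(f"{d:>2} mi: {inches:7.0f} in = {inches / 12:5.0f} ft")

# 1 mi -> 8 in, 2 mi -> 32 in, 3 mi -> 72 in (6 ft), and
# 30 mi -> 7200 in (600 ft), matching the figures in the quote.
```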



PBS NewsHour's website also published an article several years ago about seven DIY experiments that can be done to prove the Earth is indeed spherical. One of the experiments directly challenges Bishop's findings by suggesting that the experimenter watch the sunset while lying on a beach on the Pacific coast. If you watch the sun's rays completely disappear while lying on your back, you can see them again by simply hopping up and standing on your feet, which demonstrates the Earth's curvature.

Everything is to be questioned, even those who question what is accepted.

Friday, April 7, 2017

Does the Universe have a Rest Frame?

An experiment aims to resolve a divergence between special relativity and the standard model of cosmology



Physics is sometimes closer to philosophy when it comes to understanding the universe. Donald Chang from Hong Kong University of Science and Technology, China, attempts to elucidate whether the universe has a resting frame. The results have recently been published in EPJ Plus.

To answer this tricky question, he has developed an experiment to precisely evaluate particle mass. It is designed to test the special theory of relativity, which assumes the absence of a rest frame; otherwise it would be possible to determine which inertial frame is stationary and which is moving. This assumption, however, appears to diverge from the standard model of cosmology, which holds that what we see as a vacuum is not empty space: the energy of our universe is assumed to come from quantum fluctuations in the vacuum.

In a famous experiment conducted by Michelson and Morley in the late 19th century, the propagation of light was shown to be independent of the movement of the laboratory system. Einstein, in his Special Theory of Relativity, inferred that the physical laws governing the propagation of light are equivalent in all inertial frames; this was later extended to all laws of physics, not just optics.



In this study, the author sets out to precisely measure the masses of two charged particles moving in opposite directions. Conventional thinking assumes that the inertial frame applies equally to both particles; if that is the case, no detectable mass difference between them should arise. If the contrary is true, however, and there is a rest frame in the universe, the author expects to see a mass difference that depends on the orientation of the laboratory frame. This proposed experiment, partially inspired by the Michelson-Morley experiment, can be conducted using existing experimental techniques. For simplicity, an electron can be used as the charged particle.

Story Source: Materials provided by Springer.
Journal Reference: Donald C. Chang, "Is there a resting frame in the universe? A proposed experimental test based on a precise measurement of particle mass," EPJ Plus.

YOUR INPUT IS MUCH APPRECIATED! LEAVE YOUR COMMENT BELOW.

Wednesday, March 29, 2017

How Small Can Superconductors Be?

Topographic image of a lead nanocrystal used in the study. Scale bar: 10 nm. Credit: Vlaic et al. Nature Communications

For the first time, physicists have experimentally validated a 1959 conjecture that places limits on how small superconductors can be. Understanding superconductivity (or the lack thereof) on the nanoscale is expected to be important for designing future quantum computers, among other applications.

In 1959, physicist P.W. Anderson conjectured that superconductivity can exist only in objects that are large enough to meet certain criteria. Namely, the object's superconducting gap energy must be larger than its electronic energy level spacing—and this spacing increases as size decreases. The cutoff point (where the two values are equal) corresponds to a volume of about 100 nm3. Until now it has not been possible to experimentally test the Anderson limit due to the challenges in observing superconducting effects at this scale.
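In symbols, Anderson's criterion is compact. A rough sketch in standard notation (the notation is ours, not the paper's): Δ is the superconducting gap, δ the mean spacing between electronic energy levels, N(0) the density of states at the Fermi level, and V the crystal volume. Superconductivity should survive only while the gap exceeds the spacing, and since δ grows as V shrinks, equality sets the cutoff volume:

```latex
\Delta \gtrsim \delta , \qquad \delta \simeq \frac{1}{N(0)\,V}
\quad\Longrightarrow\quad
V_c \simeq \frac{1}{N(0)\,\Delta} \sim 100\ \mathrm{nm^{3}}\ \text{(for lead)}
```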

In the new study published in Nature Communications, Sergio Vlaic and coauthors at the University Paris Sciences et Lettres and the French National Centre for Scientific Research (CNRS) designed a nanosystem that allowed them to experimentally investigate the Anderson limit for the first time.

The Anderson limit arises because, at very small scales, the mechanisms underlying superconductivity essentially stop working. In general, superconductivity occurs when electrons bind together to form Cooper pairs. Cooper pairs have a slightly lower energy than individual electrons, and this difference in energy is the superconducting gap energy. The Cooper pairs' lower energy inhibits electron collisions that normally create resistance. If the superconducting gap energy gets too small and vanishes—which can occur, for example, when the temperature increases—then the electron collisions resume and the object stops being a superconductor.

The Anderson limit shows that small size is another way that an object may stop being a superconductor. However, unlike the effects of increasing the temperature, this is not because smaller objects have a smaller superconducting gap energy. Instead, it arises because smaller crystals have fewer electrons, and therefore fewer electron energy levels, than larger crystals do. Since the total possible electron energy of an element stays the same, regardless of size, smaller crystals have larger spacings between their electron energy levels than larger crystals do.



According to Anderson, this large electronic energy level spacing should pose a problem, and he expected superconductivity to disappear when the spacing becomes larger than the superconducting gap energy. The reason, generally speaking, is that one consequence of increased spacing is a decrease in potential energy, which interferes with the competition between kinetic and potential energy that is necessary for superconductivity to occur.

To investigate what happens to the superconductivity of objects around the Anderson limit, the scientists in the new study prepared large quantities of isolated lead nanocrystals ranging in volume from 20 to 800 nm3.

Although they could not directly measure the superconductivity of such tiny objects, the researchers could measure something called the parity effect, which results from superconductivity. When an electron is added to a superconductor, the additional energy is partly affected by whether there is an even or odd number of electrons (the parity), which is due to the electrons forming Cooper pairs. If the electrons don't form Cooper pairs, there is no parity effect, indicating no superconductivity.

Although the parity effect has previously been observed in large superconductors, this study is the first time that it has been observed in small nanocrystals approaching the Anderson limit. In accordance with Anderson's predictions from more than 50 years ago, the researchers observed the parity effect for larger nanocrystals, but not for the smallest nanocrystals below approximately 100 nm3.



The results not only validate the Anderson conjecture, but also extend to a more general area: the Richardson-Gaudin models. These models are equivalent to the conventional theory of superconductivity, the Bardeen-Cooper-Schrieffer (BCS) theory, for very small objects.

"Our experimental demonstration of the Anderson conjecture is also a demonstration of the validity of the Richardson-Gaudin models," coauthor Hervé Aubin at the ‘University Paris Sciences et Lettres’ and CNRS told Phys.org. "The Richardson-Gaudin models are an important piece of theoretical works because they can be solved exactly and apply to a wide range of systems; not only to superconducting nanocrystals but also to atomic nuclei and cold fermionic atomic gas, where protons and neutrons, which are fermions like electrons, can also form Cooper pairs."

On the more practical side, the researchers expect the results to have applications in future quantum computers.

"One of the most interesting applications of superconducting islands is their use as Cooper pair boxes employed in quantum bits, the elemental unit of a hypothetical quantum computer," Aubin said. "So far, Cooper pair boxes, used in qubits, are much larger than the Anderson limit. Upon reducing the size of the Cooper pair box, quantum computer engineers will eventually have to cope with superconductivity at the Anderson limit."
Source: Lisa Zyga, Phys.org. Journal: Nature Communications.

YOUR INPUT IS MUCH APPRECIATED! LEAVE YOUR COMMENT BELOW.

Tuesday, March 21, 2017

NASA to launch Cold Atom Lab in Space

Free falling: NASA is putting ultracold atoms in space

A laboratory for cooling an atomic gas to just a billionth of a degree above absolute zero will soon be sent up to the International Space Station (ISS) by physicists working at NASA's Jet Propulsion Laboratory. The goal of the Cold Atom Lab (CAL) mission is to create long-lived Bose–Einstein condensates (BECs) that could lead to better sensors and atomic clocks for use on spacecraft. The BECs could even provide important insights into the nature of dark energy, according to the researchers.

First created in 1995, a BEC is made by trapping and cooling an atomic gas to an extremely low temperature so the atoms fall into the same low-energy quantum state. Instead of behaving like a collection of individual atoms, a BEC is essentially a large quantum object. This makes it very sensitive to disturbances such as stray magnetic fields and accelerations, and therefore BECs can be used to create extremely good sensors.

Falling down
Here on Earth, gravity puts an upper limit on the lifetime of a BEC: the atoms fall, and after a fraction of a second the BEC has dropped out of view of the experiment. In the microgravity environment of the ISS, however, NASA's Robert Thompson and colleagues reckon that their BECs should be observable for 5–10 s. As well as allowing physicists to make more precise measurements of the quantum properties of BECs, the longer lifetime should also make the BECs better sensors. With further development, the team believes that BECs in space could endure for hundreds of seconds.
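The Earth-bound limit is simple kinematics: a released cloud accelerates downward, so it leaves a small observation region in a fraction of a second. A minimal sketch, assuming an illustrative ~1 mm field of view (that number is ours, not NASA's):

```python
import math

g = 9.81      # m/s^2, gravitational acceleration
view = 1e-3   # m, assumed ~1 mm field of view (illustrative)

# Free fall from rest: d = (1/2) g t^2  =>  t = sqrt(2 d / g)
t = math.sqrt(2 * view / g)
print(f"Time to fall out of a {view * 1e3:.0f} mm region: {t * 1e3:.0f} ms")
# ~14 ms, a fraction of a second -- versus the 5-10 s expected on the ISS.
```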

Five scientific teams will do experiments using Cold Atom Lab, including one led by Eric Cornell of the University of Colorado – who shared the 2001 Nobel Prize for Physics for creating the first BECs.



As well as creating BECs, CAL will also cool fermionic atoms to create degenerate Fermi gases. These systems can be made to mimic the behaviour of electrons in solids and could provide important insights into phenomena such as superconductivity. Physicists will also study ultracold mixtures of bosonic and fermionic atoms. Other planned experiments include atom interferometry and very precise measurements of gravity itself.

Pervasive forces
"Studying these hyper-cold atoms could reshape our understanding of matter and the fundamental nature of gravity," says Thompson. "The experiments we'll do with the Cold Atom Lab will give us insight into gravity and dark energy – some of the most pervasive forces in the universe."



CAL will be contained within a package about the size of an "ice box". This will contain a vacuum chamber, lasers and electronics. It will also include an electromagnetic "knife", which will be used to cool the atoms. The lab is currently in the final stages of assembly and will be launched in August on SpaceX's CRS-12 resupply mission.
Author
Hamish Johnston is editor of physicsworld.com

YOUR INPUT IS MUCH APPRECIATED! LEAVE YOUR COMMENT BELOW.

Friday, March 10, 2017

Scientists reach Back in Time to Discover some of the most Power-packed Galaxies

In the heart of an active galaxy, matter falling toward a supermassive black hole generates jets of particles traveling near the speed of light.Credit: NASA's Goddard Space Flight Center Scientific Visualization Studio

When the universe was young, a supermassive black hole heaved out a jet of particle-infused energy that raced through space at nearly the speed of light. Billions of years later, scientists have identified this black hole and four others similar to it that range in age from 1.4 billion to 1.9 billion years old.



When the universe was young, a supermassive black hole -- bloated to the bursting point with stupendous power -- heaved out a jet of particle-infused energy that raced through the vastness of space at nearly the speed of light.

Billions of years later, a trio of Clemson University scientists, led by College of Science astrophysicist Marco Ajello, has identified this black hole and four others similar to it that range in age from 1.4 billion to 1.9 billion years old. These objects emit copious gamma rays, light of the highest energy, that are billions of times more energetic than light that is visible to the human eye.

The previously known earliest gamma-ray blazars -- a type of galaxy whose intense emission is powered by extremely powerful relativistic jets launched by monstrous black holes -- were more than 2 billion years old. Currently, the universe is estimated to be approximately 14 billion years old.



"The discovery of these supermassive black holes, which launch jets that emit more energy in one second than our sun will produce in its entire lifetime, was the culmination of a yearlong research project," said Ajello, who has spent much of his career studying the evolution of distant galaxies. "Our next step is to increase our understanding of the mechanisms involved in the formation, development and activities of these amazing objects, which are the most powerful accelerators in the universe. We can't even come close to replicating such massive outputs of energy in our laboratories. The complexities we're attempting to unravel seem almost as mysterious as the black holes themselves."

Ajello conducted his research in conjunction with Clemson post-doc Vaidehi Paliya and Ph.D. candidate Lea Marcotulli. The trio worked closely with the Fermi-Large Area Telescope collaboration, which is an international team of scientists that includes Roopesh Ojha, an astronomer at NASA's Goddard Space Flight Center in Greenbelt, Maryland; and Dario Gasparrini of the Italian Space Agency.

The Clemson team's breakthroughs were made possible by recently upgraded software on NASA's Fermi Gamma-ray Space Telescope. The refurbished software significantly boosted the orbiting telescope's sensitivity to a level that made these latest discoveries possible.



"People are calling it the cheapest refurbishment in history," Ajello said. "Normally, for the Hubble Space Telescope, NASA had to send someone up to space to physically make these kinds of improvements. But in this case, they were able to do it remotely from an Earth-bound location. And of equal importance, the improvements were retroactive, which meant that the previous six years of data were also entirely reprocessed. This helped provide us with the information we needed to complete the first step of our research and also to strive onward in the learning process."

Using Fermi data, Ajello and Paliya began with a catalog of 1.4 million quasars, which are galaxies that harbor active supermassive black holes at their centers. Over the course of a year, they narrowed their search to 1,100 objects. Of these, five were finally determined to be newly discovered gamma-ray blazars that were the farthest away, and youngest, ever identified.

"After using our filters and other devices, we were left with about 1,100 sources. And then we did the diagnostics for all of these and were able to narrow them down to 25 to 30 sources," Paliya said. "But we still had to confirm that what we had detected was scientifically authentic. So we performed a number of other simulations and were able to derive properties such as black hole mass and jet power. Ultimately, we confirmed that these five sources were guaranteed to be gamma-ray blazars, with the farthest one being about 1.4 billion years old from the beginning of time."



Marcotulli, who joined Ajello's group as a Ph.D. student in 2016, has been studying the blazars' mechanisms by using images and data delivered from another orbiting NASA telescope, the Nuclear Spectroscopic Telescope Array (NuSTAR). At first, Marcotulli's role was to understand the emission mechanism of gamma-ray blazars closer to us. Now she is turning her attention toward the most distant objects in a quest to understand what makes them so powerful.

"We're trying to understand the full spectrum of the energy distribution of these objects by using physical models," Marcotulli said. "We are currently able to model what's happening far more accurately than previously devised, and eventually we'll be able to better understand what processes are occurring in the jets and which particles are radiating all the energy that we see. Are they electrons? or protons? How are they interacting with surrounding photons? All these parameters are not fully understood right now. But every day we are deepening our understanding."



All galaxies have black holes at their centers, some actively feeding on the matter surrounding them, others lying relatively dormant. Our own galaxy has at its center a super-sized black hole that is currently dormant. Ajello said that only one of every 10 black holes in today's universe is active. But when the universe was much younger, it was closer to a 50-50 ratio.

The supermassive black holes at the center of the five newly discovered blazar galaxies are among the largest types of black holes ever observed, on the order of hundreds of thousands to billions of times the mass of our own sun. And their accompanying accretion disks -- rotating swirls of matter that orbit the black holes -- emit more than two trillion times the energy output of our sun.

One of the most surprising elements of Ajello's research is how quickly -- by cosmic measures -- these supersized black holes must have grown in only 1.4 billion years. In terms of our current knowledge of how black holes grow, 1.4 billion years is barely enough time for a black hole to reach the mass of the ones discovered by Ajello's team.

"How did these incomprehensibly enormous and energy-laden black holes form so quickly?" Ajello said. "Is it because one black hole ate a lot all the time for a very long time? Or maybe because it bumped into other black holes and merged into one? To be honest, we have no observations supporting either argument. There are mechanisms at work that we have yet to unravel. Puzzles that we have yet to solve. When we do eventually solve them, we will learn amazing things about how the universe was born, how it grew into what it has become, and what the distant future might hold as the universe continues to progress toward old age."
Source: Materials provided by Clemson University

YOUR INPUT IS MUCH APPRECIATED! LEAVE YOUR COMMENT BELOW.

Friday, March 3, 2017

Scientists reveal new Super-Fast form of Computer that 'Grows as it Computes'

DNA double helix. Credit: public domain

Researchers from The University of Manchester have shown it is possible to build a new super-fast form of computer that "grows as it computes".

Professor Ross D. King and his team have demonstrated for the first time the feasibility of engineering a nondeterministic universal Turing machine (NUTM), and their research is to be published in the prestigious Journal of the Royal Society Interface.

The theoretical properties of such a computing machine, including its exponential boost in speed over electronic and quantum computers, have been well understood for many years – but the Manchester breakthrough demonstrates that it is actually possible to physically create a NUTM using DNA molecules.

"Imagine a computer is searching a maze and comes to a choice point, one path leading left, the other right," explained Professor King, from Manchester's School of Computer Science. "Electronic computers need to choose which path to follow first.



"But our new computer doesn't need to choose, for it can replicate itself and follow both paths at the same time, thus finding the answer faster.

"This 'magical' property is possible because the computer's processors are made of DNA rather than silicon chips. All electronic computers have a fixed number of chips.
"Our computer's ability to grow as it computes makes it faster than any other form of computer, and enables the solution of many computational problems previously considered impossible.
"Quantum computers are an exciting other form of computer, and they can also follow both paths in a maze, but only if the maze has certain symmetries, which greatly limits their use.

"As DNA molecules are very small a desktop computer could potentially utilize more processors than all the electronic computers in the world combined - and therefore outperform the world's current fastest supercomputer, while consuming a tiny fraction of its energy."



The University of Manchester is famous for its connection with Alan Turing, the founder of computer science, and for creating the first stored-program electronic computer.

"This new research builds on both these pioneering foundations," added Professor King.
Alan Turing's greatest achievement was inventing the concept of a universal Turing machine (UTM) - a computer that can be programmed to compute anything any other computer can compute. Electronic computers are a form of UTM, but no quantum UTM has yet been built.

DNA computing is the performing of computations using biological molecules rather than traditional silicon chips. In DNA computing, information is represented using the four-character genetic alphabet - A [adenine], G [guanine], C [cytosine], and T [thymine] - rather than the binary alphabet, which is a series of 1s and 0s used by traditional computers.
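Since each base takes one of four values, a single base can carry two binary bits. A hedged, illustrative mapping in Python (the actual encoding used in the paper may differ):

```python
# Illustrative 2-bits-per-base mapping; the paper's real scheme may differ.
BITS_TO_BASE = {"00": "A", "01": "C", "10": "G", "11": "T"}
BASE_TO_BITS = {base: bits for bits, base in BITS_TO_BASE.items()}

def to_dna(bits: str) -> str:
    """Pack a binary string (even length) into a DNA string."""
    return "".join(BITS_TO_BASE[bits[i:i + 2]] for i in range(0, len(bits), 2))

def to_bits(dna: str) -> str:
    """Unpack a DNA string back into binary."""
    return "".join(BASE_TO_BITS[base] for base in dna)

message = "0110100011"
encoded = to_dna(message)                    # 'CGGAT'
print(encoded, to_bits(encoded) == message)  # CGGAT True
```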

Provided by: University of Manchester

YOUR INPUT IS MUCH APPRECIATED! LEAVE YOUR COMMENT BELOW.

Tuesday, February 21, 2017

The Future is here: Terahertz chips a New Way of Seeing through Matter

Princeton University researchers have drastically shrunk the equipment for producing terahertz waves -- important electromagnetic pulses lasting one millionth of a millionth of a second -- from a tabletop setup with lasers and mirrors to a pair of microchips.

Electromagnetic pulses lasting one millionth of a millionth of a second may hold the key to advances in medical imaging, communications and drug development. But the pulses, called terahertz waves, have long required elaborate and expensive equipment to use.

Now, researchers at Princeton University have drastically shrunk much of that equipment: moving from a tabletop setup with lasers and mirrors to a pair of microchips small enough to fit on a fingertip.

In two articles recently published in the IEEE Journal of Solid-State Circuits, the researchers describe one microchip that can generate terahertz waves, and a second chip that can capture and read intricate details of these waves.


In two recently published articles, researchers Kaushik Sengupta (left), an assistant professor of electrical engineering, and Xue Wu (right), a Princeton graduate student in electrical engineering, describe one microchip that can generate terahertz waves and a second chip that can capture and read intricate details of these waves.

"The system is realized in the same silicon chip technology that powers all modern electronic devices from smartphones to tablets, and therefore costs only a few dollars to make on a large scale" said lead researcher Kaushik Sengupta, a Princeton assistant professor of electrical engineering.

Terahertz waves are part of the electromagnetic spectrum—the broad class of waves that includes radio, X-rays and visible light—and sit between the microwave and infrared light wavebands. The waves have some unique characteristics that make them interesting to science. For one, they pass through most non-conducting material, so they could be used to peer through clothing or boxes for security purposes, and because they have less energy than X-rays, they don't damage human tissue or DNA.

Terahertz waves also interact in distinct ways with different chemicals, so they can be used to characterize specific substances. Known as spectroscopy, the ability to use light waves to analyze material is one of the most promising—and the most challenging—applications of terahertz technology, Sengupta said.

To do it, scientists shine a broad range of terahertz waves on a target and then observe how the waves change after interacting with it. The human eye performs a similar type of spectroscopy with visible light: we see a leaf as green because light in the green frequency range bounces off the chlorophyll-laden leaf.

The challenge has been that generating a broad range of terahertz waves and interpreting their interaction with a target requires a complex array of equipment such as bulky terahertz generators or ultrafast lasers. The equipment's size and expense make the technology impractical for most applications.

Researchers have been working for years to simplify these systems. In September, Sengupta's team reported a way to reduce the size of the terahertz generator and the apparatus that interprets the returning waves to a millimeter-sized chip. The solution lies in re-imagining how an antenna functions. When terahertz waves interact with a metal structure inside the chip, they create a complex distribution of electromagnetic fields that are unique to the incident signal. Typically, these subtle fields are ignored, but the researchers realized that they could read the patterns as a sort of signature to identify the waves. The entire process can be accomplished with tiny devices inside the microchip that read terahertz waves.



"Instead of directly reading the waves, we are interpreting the patterns created by the waves," Sengupta said. "It is somewhat like looking for a pattern of raindrops by the ripples they make in a pond."

Daniel Mittleman, a professor of engineering at Brown University, said the development was "a very innovative piece of work, and it potentially has a lot of impact." Mittleman, who is the vice chair of the International Society for Infrared Millimeter and Terahertz Waves, said scientists still have work to do before the terahertz band can begin to be used in everyday devices, but the developments are promising.
"It is a very big puzzle with many pieces, and this is just one, but it is a very important one," said Mittleman, who is familiar with the work but had no role in it.

On the terahertz-generation end, much of the challenge is creating a wide range of wavelengths within the terahertz band, particularly in a microchip. The researchers realized they could overcome the problem by generating multiple wavelengths on the chip. They then used precise timing to combine these wavelengths and create very sharp terahertz pulses.
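Adding several phase-locked tones at equally spaced frequencies is the classic way to synthesize sharp pulses: in phase, the tones interfere constructively once per repetition period and nearly cancel in between. A numerical sketch with numpy (the frequencies are arbitrary placeholder values, not the chip's actual ones):

```python
import numpy as np

f0, df, N = 1.0, 0.1, 10   # base frequency, spacing, tone count (arbitrary units)
t = np.linspace(0, 20, 4000)

# Sum of N phase-locked tones: a train of sharp pulses with period 1/df.
signal = sum(np.cos(2 * np.pi * (f0 + k * df) * t) for k in range(N))

peak = np.max(np.abs(signal))   # coherent peak grows like N
print(f"peak amplitude = {peak:.1f} (~{N} when all tones add in phase)")
```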

In an article published Dec. 14 in the IEEE Journal of Solid-State Circuits, the researchers explained how they created a chip to generate the terahertz waves. The next step, the researchers said, is to extend the work farther along the terahertz band. "Right now we are working with the lower part of the terahertz band," said Xue Wu, a Princeton doctoral student in electrical engineering and an author on both papers.

"What can you do with a billion transistors operating at terahertz frequencies?" Sengupta asked. "Only by re-imagining these complex electromagnetic interactions from fundamental principles can we invent game-changing new technology."
Provided by: Princeton University

YOUR INPUT IS MUCH APPRECIATED! LEAVE YOUR COMMENT BELOW.

Wednesday, February 15, 2017

The Universe's largest supernova: a spinning, star-eating black hole

By: Alexandria Addesso

According to the National Aeronautics and Space Administration (NASA), a supernova is defined as one of the largest explosions that take place in space, in particular the explosion of a star. A supernova usually manifests when there is a change in the core, or center, of a star.

The brightest supernova ever recorded, categorized as a super-luminous supernova, was the explosion of an extremely massive star at the end of its life. Named ASASSN-15lh, it was first observed in 2015 by the All-Sky Automated Survey for Supernovae (ASAS-SN). It was twice as bright as the previous record holder, and at its peak was 20 times brighter than the total light output of the entire Milky Way.



"We observed the source for 10 months following the event and have concluded that the explanation is unlikely to lie with an extraordinarily bright supernova,” said Giorgos Leloudas the leader of the team that observed the event at the Weizmann Institute of Science in Israel. “Our results indicate that the event was probably caused by a rapidly spinning supermassive black hole as it destroyed a low-mass star."
After a series of further observations, it was discovered that the massive explosion took place about 4 billion light-years from Earth in a distant galaxy. It is believed that the star that the black hole consumed in order to produce such a massive explosion was its solar system's biggest star, comparable to our own sun. Since then rays have been observed traveling from the black hole towards us at the speed of light. These rays are forming a disc of gas around the black hole as it converts gravitational energy into electromagnetic radiation, producing a bright source of light visible on multiple wavelengths.



"Incredibly, this source is still producing X-rays and may remain bright enough for Swift to observe into next year," said David Burrows, the lead scientist of the team utilizing NASA's Swift satellite to monitor the massive black hole as well as a professor of astronomy at Penn State University. "It behaves unlike anything we've seen before."

Could such an explosion happen in our solar system with our own sun? This is a question scientists and inquiring minds are still trying to figure out.

YOUR INPUT IS MUCH APPRECIATED! LEAVE YOUR COMMENT BELOW.

Thursday, February 2, 2017

First step towards photonic quantum network

Illustration of a photon gun. A quantum dot (the yellow symbol) emits one photon (red wave packet) at a time. The quantum dot is embedded in a photonic crystal structure, which is obtained by etching holes (black circles) in a semiconductor.

Advanced photonic nanostructures are well on their way to revolutionizing quantum technology for quantum networks based on light. Researchers from the Niels Bohr Institute have now developed the first building blocks needed to construct complex quantum photonic circuits for quantum networks. This rapid development in quantum networks is highlighted in an article in the journal Nature.

Quantum technology based on light (photons) is called quantum photonics, while electronics is based on electrons. Photons (light particles) and electrons behave differently at the quantum level. A quantum entity is the smallest unit in the microscopic world; for example, photons are the fundamental constituents of light and electrons of electric current. Electrons are so-called fermions and can easily be isolated to conduct current one electron at a time. In contrast, photons are bosons, which prefer to bunch together. But since information for quantum communication based on photonics is encoded in a single photon, it is necessary to emit and send them one at a time.

Increased information capacity
Information based on photons has great advantages: photons interact only very weakly with the environment, unlike electrons, so they do not lose much energy along the way and can therefore be sent over long distances. Photons are thus very well suited for carrying and distributing information, and a quantum network based on photons would be able to encode much more information than is possible with current computer technology; moreover, the information could not be intercepted en route.

Many research groups around the world are working intensively in this research field, which is developing rapidly and in fact the first commercial quantum photonics products are starting to be manufactured.


Directional emission of photons. The figure shows calculations of the photon emission in the new directional single-photon source. If the spin of the quantum dot's electron points up, the photon is emitted in one direction; if it points down, in the other.

Control of the photons
A prerequisite for quantum networks is the ability to create a stream of single photons on demand and the researchers at the Niels Bohr Institute succeeded in doing exactly that.

"We have developed a photonic chip, which acts as a photon gun. The photonic chip consists of an extremely small crystal that is 10 microns wide and is 160 nanometers thick. Embedded in the middle of the chip is a light source, which is a so-called quantum dot. Illuminating the quantum dot with laser light excites an electron, which can then jump from one orbit to another and thereby emit a single photon at a time. Photons are usually emitted in all directions, but the photonic chip is designed so that all the photons are sent out through a photonic waveguide," explains Peter Lodahl, professor and head of the Quantum Photonics research group at the Niels Bohr Institute, University of Copenhagen.

In a long, laborious process, the research group further developed and tested the photonic chip until it achieved extreme efficiency. Peter Lodahl explains that it was particularly surprising that they could get the photon emission to occur in a way that was not previously thought possible. Normally, photons are emitted in both directions in the photonic waveguide, but in their custom-made photonic chip the researchers could break this symmetry and get the quantum dot to differentiate between emitting a photon right or left, that is, to emit directional photons. This means full control over the photons, and the researchers are beginning to explore how to construct complete quantum network systems based on the new discovery.

"The photons can be sent over long distances via optical fibers, where they whiz through the fibers with very little loss. You could potentially build a network where the photons connect small quantum systems, which are then linked together into a quantum network - a quantum internet," explains Peter Lodahl.

He adds that while the first basic functionalities are already a reality, the great challenge is now to expand them to large, complex quantum networks.

Journal reference: Nature
Provided by: Niels Bohr Institute

YOUR INPUT IS MUCH APPRECIATED! LEAVE YOUR COMMENT BELOW.

Saturday, January 28, 2017

New Laser based on unusual physics phenomenon could improve telecommunications, and computing applications

This is a schematic of the BIC laser: a high frequency laser beam (blue) powers the membrane to emit a laser beam at telecommunication frequency (red). Credit: Kanté group, UC San Diego

Researchers at the University of California San Diego have demonstrated the world's first laser based on an unconventional wave physics phenomenon called bound states in the continuum. The technology could revolutionize the development of surface lasers, making them more compact and energy-efficient for communications and computing applications. The new BIC lasers could also be developed as high-power lasers for industrial and defense applications.

"Lasers are ubiquitous in the present-day world, from simple everyday laser pointers to complex laser interferometers used to detect gravitational waves. Our current research will impact many areas of laser applications," said Ashok Kodigala, an electrical engineering Ph.D. student at UC San Diego and first author of the study.



"Because they are unconventional, BIC lasers offer unique and unprecedented properties that haven't yet been realized with existing laser technologies," said Boubacar Kanté, electrical engineering professor at the UC San Diego Jacobs School of Engineering who led the research.

For example, BIC lasers can be readily tuned to emit beams of different wavelengths, a useful feature for medical lasers made to precisely target cancer cells without damaging normal tissue. BIC lasers can also be made to emit beams with specially engineered shapes (spiral, donut or bell curve) -- called vector beams -- which could enable increasingly powerful computers and optical communication systems that can carry up to 10 times more information than existing ones.

"Light sources are key components of optical data communications technology in cell phones, computers and astronomy, for example. In this work, we present a new kind of light source that is more efficient than what's available today in terms of power consumption and speed," said Babak Bahari, an electrical engineering Ph.D. student in Kanté's lab and a co-author of the study.

Bound states in the continuum (BICs) are phenomena that have been predicted to exist since 1929. BICs are waves that remain perfectly confined, or bound, in an open system. Conventional waves in an open system escape, but BICs defy this norm -- they stay localized and do not escape despite having open pathways to do so.

In a previous study, Kanté and his team demonstrated, at microwave frequencies, that BICs could be used to efficiently trap and store light to enable strong light-matter interaction. Now, they're harnessing BICs to demonstrate new types of lasers. The team published the work Jan. 12 in Nature.



Making the BIC laser
The BIC laser in this work is constructed from a thin semiconductor membrane made of indium, gallium, arsenic and phosphorus. The membrane is structured as an array of nano-sized cylinders suspended in air. The cylinders are interconnected by a network of supporting bridges, which provide mechanical stability to the device.

By powering the membrane with a high frequency laser beam, researchers induced the BIC system to emit its own lower frequency laser beam (at telecommunication frequency).
"Right now, this is a proof of concept demonstration that we can indeed achieve lasing action with BICs," Kanté said.

"And what's remarkable is that we can get surface lasing to occur with arrays as small as 8 × 8 particles," he said. In comparison, the surface lasers that are widely used in data communications and high-precision sensing, called VCSELs (vertical-cavity surface-emitting lasers), need much larger (100 times) arrays -- and thus more power -- to achieve lasing.

"The popular VCSEL may one day be replaced by what we're calling the 'BICSEL' -- bound state in the continuum surface-emitting laser, which could lead to smaller devices that consume less power," Kanté said. The team has filed a patent for the new type of light source.

The array can also be scaled up in size to create high power lasers for industrial and defense applications, he noted. "A fundamental challenge in high power lasers is heating and with the predicted efficiencies of our BIC lasers, a new era of laser technologies may become possible," Kanté said.

The team's next step is to make BIC lasers that are electrically powered, rather than optically powered by another laser. "An electrically pumped laser is easily portable outside the lab and can run off a conventional battery source," Kanté said.
Story Source:
Materials provided by University of California - San Diego. Original written by Liezel Labios.

YOUR INPUT IS MUCH APPRECIATED! LEAVE YOUR COMMENT BELOW.

Tuesday, January 24, 2017

Cryogenic test probes Einstein's equivalence principle, general relativity, and space-time 'foam'

Illustration of the experimental set-up in which scientists attempted to detect any change in the length of a cryogenic silicon resonator. They detected no change, in support of the equivalence principle. Credit: Wiens et al. ©2016 American Physical Society.

Physicists have performed a test designed to investigate the effects of the expansion of the universe, hoping to answer questions such as: Does the expansion of the universe affect laboratory experiments? Might this expansion change the lengths of solid objects and the time measured by atomic clocks differently, in violation of Einstein's equivalence principle? And does space-time have a foam-like structure that slightly changes the speed of photons over time, an idea that could shed light on the connection between general relativity and quantum gravity?



In their study published in Physical Review Letters, E. Wiens, A.Yu. Nevsky, and S. Schiller at Heinrich Heine Universität Düsseldorf in Germany used a cryogenic resonator to make some of the most precise measurements yet of the length stability of a solid object. Overall, the results provide further confirmation of Einstein's equivalence principle, the foundation on which the theory of general relativity is based. And in agreement with previous experiments, the researchers found no evidence of space-time foam.

"It is not easy to imagine ways of testing for consequences of the expansion of the universe that occur in the laboratory (as opposed to studying distant galaxies)," Schiller told Phys.org. "Our approach is one way to perform such a test. That we have not observed any effect is consistent with the prediction of general relativity."

Over the course of five months, the researchers made daily measurements of the resonator's length by measuring the frequency of an electromagnetic wave trapped within it. In order to suppress all thermal motion, the researchers operated the resonator at cryogenic temperature (1.5 degrees above absolute zero). In addition, external disturbances, such as tilt, irradiation by laser light, and some other effects that might destabilize the device were kept as small as possible.

To measure the resonator's frequency, the researchers used an atomic clock. Any change in frequency would indicate that the change in length of the resonator differs from the change in time measured by the atomic clock.

The experiment detected virtually no change in frequency, or "zero drift"; more precisely, the mean fractional drift was measured to be about 10⁻²⁰ per second, corresponding to a decrease in length that the researchers describe as equivalent to depositing no more than one layer of molecules onto the mirrors of the resonator over a period of 3,000 years. This drift is the smallest value measured so far for any resonator.
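Those quoted numbers are easy to sanity-check. A fractional drift of 10⁻²⁰ per second accumulated over 3,000 years gives a fractional length change of order 10⁻⁹; turning that into "one molecular layer" requires assuming a cavity length, which is our illustrative guess below, not a figure from the paper:

```python
drift_per_s = 1e-20                  # fractional length drift per second (quoted above)
seconds = 3000 * 365.25 * 24 * 3600  # 3,000 years in seconds
frac = drift_per_s * seconds
print(f"fractional change over 3000 yr: {frac:.1e}")  # ~9.5e-10

L = 0.3         # m, assumed cavity length (our illustrative guess)
layer = 0.3e-9  # m, rough thickness of one molecular monolayer
print(f"length change ≈ {frac * L / layer:.1f} monolayer(s) for a {L} m cavity")
```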



One of the most important implications of the null result is that it provides further support for the equivalence principle. Formulated by Einstein in the early 1900s, the equivalence principle is the idea that gravity and acceleration, such as the acceleration a person would feel in an upward-accelerating elevator in space, are equivalent.

This principle leads to several related concepts, one of which is local position invariance: the statement that the non-gravitational laws of physics (for example, electromagnetism) are the same everywhere. In the current experiment, any amount of resonance drift would have violated local position invariance. Along similar lines, any amount of resonance drift would also have violated general relativity, since general relativity prohibits changes to the length of solid objects caused by the expansion of the universe.
Finally, the experiment also attempted to detect the hypothetical existence of space-time foam. One of the effects of space-time foam would be that repeated measurements of a length would produce fluctuating results. The constant measurement results reported here therefore indicate that such fluctuations, if they exist at all, must be very small.



In the future, the researchers hope that the extremely precise measurement technique using the cryogenic resonator could be used for other applications.

"One of the greatest outcomes of this work is that we have developed an approach to make and operate an optical resonator that has extremely little drift," Schiller said. "This could have applications to the field of atomic clocks and precision measurements—for example, for the radar tracking of spacecraft in deep space."
Journal reference: Physical Review Letters

YOUR INPUT IS MUCH APPRECIATED! LEAVE YOUR COMMENT BELOW.

Monday, January 16, 2017

We can see the future of Quantum Systems

Scientists at the University of Sydney have demonstrated the ability to "see" the future of quantum systems, and used that knowledge to preempt their demise, in a major achievement that could help bring the strange and powerful world of quantum technology closer to reality.

The applications of quantum-enabled technologies are compelling, and already demonstrating significant impacts - especially in the realm of sensing and metrology. And the potential to build exceptionally powerful quantum computers using quantum bits, or qubits, is driving investment from the world's largest companies.

However, a significant obstacle to building reliable quantum technologies has been the randomization of quantum systems by their environments, known as decoherence, which effectively destroys the useful quantum character.

The physicists have taken a technical quantum leap in addressing this, using techniques from big data to predict how quantum systems will change and then preventing the system's breakdown before it occurs.



The research is published today in Nature Communications.
"Much the way the individual components in mobile phones will eventually fail, so too do quantum systems," said the paper's senior author Professor Michael J. Biercuk.
"But in quantum Technology the lifetime is generally measured in fractions of a second, rather than years."

Professor Biercuk, from the University of Sydney's School of Physics and a chief investigator at the Australian Research Council's Centre for Engineered Quantum Systems, said his group had demonstrated that it was possible to suppress decoherence in a preventive manner. The key was to develop a technique to predict how the system would disintegrate.

Professor Biercuk highlighted the challenges of making predictions in a quantum world: "Humans routinely employ predictive techniques in our daily experience; for instance, when we play tennis we predict where the ball will end up based on observations of the airborne ball," he said.

This works because the rules that govern how the ball will move, like gravity, are regular and known. But what if the rules changed randomly while the ball was on its way to you? In that case it's next to impossible to predict the future behavior of that ball.
"And yet this situation is exactly what we had to deal with because the disintegration of quantum systems is random. Moreover, in the quantum realm observation erases ‘quantumness’, so our team needed to be able to guess how and when the system would randomly break.

"We effectively needed to swing at the randomly moving tennis ball while blindfolded."

The team turned to machine learning for help in keeping their quantum systems - qubits realized in trapped atoms - from breaking.



What might look like random behavior actually contained enough information for a computer program to guess how the system would change in the future. It could then predict the future without direct observation, which would otherwise erase the system's useful characteristics.

The predictions were remarkably accurate, allowing the team to use their guesses preemptively to compensate for the anticipated changes.

Doing this in real time allowed the team to prevent the disintegration of the quantum character, extending the useful lifetime of the qubits.
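The control idea can be sketched without any quantum machinery: treat the qubit's slowly drifting noise parameter as a time series, fit a predictor to past observations, and apply the negative of the predicted drift before it happens. The Python toy below uses a simple linear extrapolation as a stand-in for the team's machine-learning predictor (it is our illustration, not their algorithm):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "decoherence" driver: a slowly drifting detuning (random walk).
drift = np.cumsum(0.01 * rng.standard_normal(500))

def predict_next(history, order=3):
    """Least-squares linear extrapolation from the last few samples --
    a stand-in for the actual machine-learning predictor."""
    t = np.arange(len(history))[-order:]
    slope, intercept = np.polyfit(t, history[-order:], 1)
    return slope * len(history) + intercept

raw, corrected = [], []
for k in range(10, len(drift)):
    prediction = predict_next(drift[:k])
    raw.append(abs(drift[k]))                     # uncompensated detuning
    corrected.append(abs(drift[k] - prediction))  # after feed-forward correction

print(f"mean |detuning| raw:       {np.mean(raw):.3f}")
print(f"mean |detuning| corrected: {np.mean(corrected):.3f}")
```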
"We know that building real quantum technologies will require major advances in our ability to control and stabilize qubits - to make them useful in applications," Professor Biercuk said.

"Our techniques apply to any qubit, built in any technology, including the special superconducting circuits being used by major corporations."
"We're excited to be developing new capabilities that turn quantum systems from novelties into useful technologies. The quantum future is looking better all the time," Professor Biercuk said.

YOUR INPUT IS MUCH APPRECIATED! LEAVE YOUR COMMENT BELOW.

Sunday, January 8, 2017

Giant atoms could help unveil 'dark matter' and other cosmic secrets

The universe is an astonishingly secretive place. Mysterious substances known as dark matter and dark energy account for some 95% of it, and despite huge effort to find out what they are, we simply don't know.

We know dark matter exists because of the gravitational pull of galaxy clusters – the matter we can see in a cluster just isn't enough to hold it together by gravity. So there must be some extra material there, made up of unknown particles that simply aren't visible to us. Several candidate particles have already been proposed.

Scientists are trying to work out what these unknown particles are by looking at how they affect the ordinary matter we see around us. But so far this has proven difficult, so we know dark matter interacts only weakly with normal matter at best. Now my colleague Benjamin Varcoe and I have come up with a new way to probe dark matter that may just prove successful: by using atoms that have been stretched to be 4,000 times larger than usual.



Advantageous atoms

We have come a long way from the Greeks' vision of atoms as the indivisible components of all matter. The first evidence-based argument for the existence of atoms was presented in the early 1800s by John Dalton. But it wasn't until the beginning of the 20th century that JJ Thomson and Ernest Rutherford discovered that atoms consist of electrons and a nucleus. Soon after, Erwin Schrödinger described the atom mathematically using what is today called quantum theory.

Modern experiments have been able to trap and manipulate individual atoms with outstanding precision. This knowledge has been used to create new technologies, like lasers and atomic clocks, and future computers may use single atoms as their primary components.

Individual atoms are hard to study and control because they are very sensitive to external perturbations. This sensitivity is usually an inconvenience, but our study suggests that it makes some atoms ideal as probes for the detection of particles that don't interact strongly with regular matter – such as dark matter.

Our model is based on the fact that weakly interacting particles must bounce off the nucleus of the atom they collide with and exchange a small amount of energy with it – similar to the collision between two pool balls. The energy exchange will produce a sudden displacement of the nucleus that will eventually be felt by the electron. This means the entire energy of the atom changes, which can be analyzed to obtain information about the properties of the colliding particle.
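The pool-ball picture can be made quantitative with textbook elastic-collision kinematics (our illustration, not a formula from the study). For a light particle of mass m and kinetic energy E striking a nucleus of mass M, the maximum energy it can hand over is

```latex
\Delta E_{\max} = \frac{4\,mM}{(m+M)^{2}}\,E \;\approx\; \frac{4m}{M}\,E
\qquad (m \ll M),
```

which is why so little energy is transferred when a light particle strikes a much heavier atom, the marble-and-bowling-ball situation described below.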


The Large Underground Xenon experiment installed 4,850 ft underground inside a 70,000-gallon water tank shield. Credit: Gigaparsec at English Wikipedia, CC BY-SA

However, the amount of transferred energy is very small, so a special kind of atom is necessary to make the interaction relevant. We worked out that the so-called “Rydberg atom” would do the trick. These are atoms with long distances between the electron and the nucleus, meaning they possess high potential energy. Potential energy is a form of stored energy. For example, a ball on a high shelf has potential energy because this could be converted to kinetic energy if it falls off the shelf.

In the lab, it is possible to trap atoms and prepare them in a Rydberg state – making them as big as 4,000 times their original size. This is done by illuminating the atoms with laser light at a very specific frequency.
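The "4,000 times" figure is consistent with textbook hydrogen-like Rydberg scaling (standard formulas, not numbers taken from the study): the orbital radius grows as the square of the principal quantum number n, so

```latex
r_n \simeq a_0\,n^{2} \quad\Rightarrow\quad n \simeq \sqrt{4000} \approx 63,
\qquad E_n = -\frac{13.6\ \mathrm{eV}}{n^{2}} \approx -3.4\ \mathrm{meV}.
```

The weak binding at such high n is also part of why these atoms are so fragile and so easily perturbed, as discussed below.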

This prepared atom is likely much heavier than the dark matter particles. So rather than a pool ball striking another, a more appropriate description would be a marble hitting a bowling ball. It seems strange that big atoms are more perturbed by collisions than small ones – one might expect the opposite (smaller things are usually more affected when a collision occurs).

The explanation is related to two features of Rydberg atoms: they are highly unstable because of their elevated energy, so minor perturbations disturb them more; and, due to their large cross-sectional area, the probability of the atoms interacting with passing particles is increased, so they suffer more collisions.



Spotting the tiniest of particles

Current experiments typically look for dark matter particles by trying to detect their scattering off atomic nuclei or electrons on Earth. They do this by looking for the light or free electrons that are generated in big tanks of liquid noble gases by energy transfer between a dark matter particle and the atoms of the liquid.

But, according to the laws of quantum mechanics, there needs to be a certain minimum energy transfer for the light to be produced. An analogy would be a particle colliding with a guitar string: it will produce a note that we can hear, but if the particle is too small the string will not vibrate at all.

So the problem with these methods is that the dark matter particles have to be big enough if we are to detect them in this way. However, our calculations show that Rydberg atoms will be disturbed in a significant way even by low-mass particles – meaning they can be applied to search for dark matter candidates that other experiments miss. One such particle is the axion, a hypothetical particle that is a strong candidate for dark matter.

Experiments would require the atoms to be treated with extreme care, but they would not need to be performed in a deep underground facility like other experiments, as Rydberg atoms are expected to be less susceptible to cosmic rays than they are to dark matter.

We are working to further improve the sensitivity of the system, aiming to extend the range of particles that it may be able to perceive.

Beyond dark matter, we are also aiming to one day apply the technique to the detection of gravitational waves, the ripples in the fabric of space predicted by Einstein a century ago. These perturbations of the space-time continuum were recently discovered, but we believe that by using atoms we may be able to detect gravitational waves at frequencies different from the ones already observed.

Source: Diego A. Quiñones, The Conversation

YOUR INPUT IS MUCH APPRECIATED! LEAVE YOUR COMMENT BELOW.

Sunday, January 1, 2017

Exoplanets

By: Alexandria Addesso

From a young age we are taught the names and placements of the planets within our own solar system, but we are rarely taught about planets outside of it. Exoplanets are planets that orbit stars outside of our solar system. For thousands of years such planets were unfathomable because the technology needed to see them did not exist. And within the scientific community, seeing is believing.

“People ask what will happen if Mars One fails. There will be Mars Two, Mars Three, there will be Gliese 581 One, Proxima Centauri b One etc,” said playwright Mehmet Murat Ildan.
“If a project opens the path for other projects, it means that it has already triumphed!”



Exoplanets were not actually discovered until the 1990s. In 1992, astronomers found planets orbiting PSR 1257+12, a pulsar – the rapidly spinning corpse of a star that died as a supernova. Although these were not planets around an ordinary star, the discovery gave scientists momentum to do more research in the field. By 1995 the first exoplanet around a sun-like star was confirmed: a roughly Jupiter-mass planet orbiting 20 times closer to its star than the Earth is to the sun. The planet was named 51 Pegasi b.

With the help of the Kepler spacecraft, launched in March 2009, many other exoplanets have been discovered. According to the National Aeronautics and Space Administration (NASA), there are 3,431 confirmed exoplanets, an additional 4,696 candidate exoplanets yet to be confirmed, and a total of 2,562 planetary systems. Scientists are also researching which exoplanets may be habitable for human beings. Such planets usually lie at roughly the same distance from their star as Earth does from our sun.
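That "similar distance" rule of thumb can be made slightly more quantitative. Here is a minimal sketch, assuming only that a habitable planet should receive roughly Earth-like stellar flux, so that orbital distance scales as the square root of the star's luminosity (the example stars below are illustrative, not from the article):

```
import math

# For Earth-like stellar flux, orbital distance scales as sqrt(L/L_sun),
# measured in astronomical units (1 AU = the Earth-sun distance).
def habitable_distance_au(luminosity_solar):
    return math.sqrt(luminosity_solar)

for name, L in [("sun-like star", 1.0),
                ("dim red dwarf", 0.02),
                ("brighter F-type star", 2.5)]:
    print(f"{name} (L = {L} L_sun): ~{habitable_distance_au(L):.2f} AU")
```

Real habitable-zone estimates also fold in atmospheric and greenhouse effects, but the square-root scaling explains why planets around dim stars must huddle much closer in than Earth does.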



“This is the big question we all have: are we alone in the universe? And exoplanets confirm the suspicions that planets are not rare,” said astrophysicist, cosmologist, author, and science communicator Neil deGrasse Tyson.

YOUR INPUT IS MUCH APPRECIATED! LEAVE YOUR COMMENT BELOW.

Thursday, 29 December 2016

Nano-Starship Being Developed to Travel at 20 Percent of the Speed of Light

Space exploration has made major strides in the past century, bringing our understanding of far-distant places closer to home. Recently, at the Korea Institute of Science and Technology, the National Aeronautics and Space Administration (NASA), along with renowned scientist Stephen Hawking, introduced a plan to build a nano-starship that can travel at 20 percent (one fifth) of the speed of light. The nano-starship, named ‘StarChip’ for its small size – comparable to that of a postage stamp – will use lasers to explore interstellar space and be propelled toward our nearest star system, Alpha Centauri.



“The limit that confronts us now is the great void between us and the stars. But now we can transcend it. With light beams, light sails, and the lightest spacecraft ever built, we can launch a mission to Alpha Centauri within a generation,” said Hawking at the original project unveiling. “Today, we commit to this next great leap into the cosmos because we are human, and our nature is to fly.”

If the mission succeeds, StarChip is projected to take about 20 years to reach Alpha Centauri. That travel time is an obstacle for many of the project researchers, because the amount of radiation the spacecraft would encounter over two decades in space is more than enough to destroy it.
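The 20-year figure is easy to sanity-check (our own arithmetic, ignoring the time spent accelerating): Alpha Centauri lies about 4.37 light-years away, and distance in light-years divided by speed as a fraction of light speed gives the travel time in years.

```
# Travel-time check: time (years) = distance (light-years) / (v/c).
distance_ly = 4.37   # distance to Alpha Centauri in light-years
speed_c = 0.20       # cruise speed as a fraction of the speed of light

travel_time_years = distance_ly / speed_c
print(f"travel time ≈ {travel_time_years:.1f} years")   # ≈ 21.9 years
```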



A possible solution that would sustain the ship without doubling or tripling the length of the voyage is chip self-healing. By utilizing a 'gate-all-around' nanowire transistor, an electric current could heat the chip contained within StarChip and thereby heal any damage incurred through exposure to radiation.

“On-chip healing has been around for many, many years,” said Jin-Woo Han, one of the NASA team members.

YOUR INPUT IS MUCH APPRECIATED! LEAVE YOUR COMMENT BELOW.


Friday, 16 December 2016

Parallel Worlds

By: Alexandria Addesso

Parallel worlds and alternate universes have long been a beloved theme of Twilight Zone episodes and of science fiction movies and literature. Like much other science fiction subject matter, parallel worlds actually have some scientific ideas behind them, although many physicists remain very skeptical.

"The idea of parallel universes in quantum mechanics has been around since 1957," said physicist at Griffith University in Brisbane, Australia, Howard Wiseman, who was also part of the team of physicists that came up with the ‘Many-Worlds Interpretation’ (MWI).

"In the well-known ‘Many-Worlds Interpretation’, each universe branches into a bunch of new universes every time a quantum measurement is made. All possibilities are therefore realized – in some universes the dinosaur-killing asteroid missed Earth. In others, Australia was colonized by the Portuguese."



Yet despite what the name of the MWI seems to suggest, these other worlds and universes have no proven effect on our own, leaving skeptical physicists all the more doubtful. While many parallel-world theories get ridiculed or put on the back burner by mainstream scientists, string theory offers a way to make sense of multiple dimensions mathematically. Within its theoretical framework, the point-like particles of particle physics are replaced by one-dimensional objects called strings, and the theory describes how these strings propagate through space and interact with each other.

Superstring theory goes a step further and attempts to explain all of the particles and fundamental forces of nature in one theory by modelling them as vibrations of tiny supersymmetric strings. For mathematical consistency, both theories require extra dimensions of space-time: bosonic string theory requires space-time to be 26-dimensional, while superstring theory requires 10 dimensions.

"You almost can't avoid having some version of the multiverse in your studies if you push deeply enough in the mathematical descriptions of the physical universe," said physicist Brian Greene who authored the book The Hidden Reality: Parallel Universes and the Deep Laws of the Cosmos.



"There are many of us thinking of one version of parallel universe theory or another. If it's all a lot of nonsense, then it's a lot of wasted effort going into this far-out idea. But if this idea is correct, it is a fantastic upheaval in our understanding."

YOUR INPUT IS MUCH APPRECIATED! LEAVE YOUR COMMENT BELOW.

Sunday, 4 December 2016

New Physics Discovery Could Make Time Travel Impossible

‘Even if it turns out that time travel is impossible, it is important that we understand why it is impossible.’ – Stephen Hawking



With apologies to time-traveling doctors throughout fiction, it seems that visiting the past might actually be impossible — and not just because we haven't invented the ‘flux capacitor’ yet.

To back up a little: the reason time travel seems as if it should be possible is that, in many cases, the laws of physics favor symmetry in space-time, right down to the shape of atomic nuclei.

But a newly discovered, pear-shaped atomic nucleus might be literally pointing to the future — and hinting at why the future always lies ahead of us, instead of in all directions.



Think of it this way: In space, west is always west, but it's relatively easy to change your orientation so that west can be behind or in front of you.

A longstanding question is why we can't do the same with time. Are our own biological shortcomings to blame, or is the inexorable forward march an embedded feature of the universe?

The lopsidedness of this nucleus suggests that it might be the latter.

Many of the laws of physics are what scientists call "time reversible," meaning that it doesn't matter whether they play out going forward or backward. This tracks with the standard idea in physics that space-time is symmetrical.

But in reality, time doesn't feel symmetrical. It moves along the direction of a metaphorical "time arrow" that points to the future and to entropy.



Entropy is lopsided — nature seems to move only from order toward disorder, not in the other direction. A universe where time mirrored ours, though, would see the opposite effect, moving inexorably from disorder to order, and perhaps even back toward the Big Bang.
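A toy simulation makes the one-way character of entropy vivid (our own illustration of the statistical arrow of time, unrelated to the nuclear-physics result itself): particles that start bunched in one corner of a box and wander randomly will spread out, and the coarse-grained entropy of their positions climbs without ever returning to its ordered starting value.

```
import numpy as np

rng = np.random.default_rng(1)
n_particles, n_bins, box = 10_000, 10, 1.0

# Low-entropy start: every particle in the leftmost tenth of the box.
x = rng.uniform(0, box / n_bins, size=n_particles)

def coarse_entropy(positions):
    """Shannon entropy of the coarse-grained (binned) position distribution."""
    counts, _ = np.histogram(positions, bins=n_bins, range=(0, box))
    p = counts[counts > 0] / len(positions)
    return -np.sum(p * np.log(p))

for step in range(501):
    if step % 100 == 0:
        print(f"step {step:3d}: entropy = {coarse_entropy(x):.3f}")
    # Each particle takes a small random step; the walls clip the walk.
    x = np.clip(x + rng.normal(0, 0.01, size=n_particles), 0, box)
```

The entropy starts at zero (perfect order) and drifts toward its maximum of ln 10 ≈ 2.3; running the random walk longer never undoes the spreading, which is the statistical content of the "arrow of time."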

This lopsided little nucleus is pointing toward something, and it seems to be the direction of time itself. One of the discoverers of the nucleus, Marcus Scheck, told the BBC that the atom's shape indicates that "we will always travel from past to present."
Source: Sarah Kramer, Tech Insider, CERN

YOUR INPUT IS MUCH APPRECIATED! LEAVE YOUR COMMENT BELOW.

Saturday, 19 November 2016

Tangled Up in Space-time

Hundreds of researchers in a collaborative project called “It from Qubit” say space and time may spring up from the quantum entanglement of tiny bits of information



“All the world’s a stage…,” Shakespeare wrote, and physicists tend to think that way, too. Space seems like a backdrop to the action of forces and fields that inhabit it, but space itself is not made of anything—or is it? Lately scientists have begun to question this conventional thinking and speculate that space—and its extension according to general relativity, spacetime—is actually composed of tiny chunks of information. These chunks might interact to create spacetime and give rise to its properties, such as the concept that curvature in spacetime causes gravity. If so, the idea might not just explain spacetime but might help physicists achieve a long-sought goal: a quantum theory of gravity that can merge general relativity and quantum mechanics, the two grand theories of the universe that tend not to get along. Lately the excitement of this possibility has engrossed hundreds of physicists who have been meeting every three months or so under the banner of a project dubbed “It from Qubit.”

The “it” in this case is spacetime, and the qubit (pronounced “cue-bit,” from “quantum bit”) represents the smallest possible amount of information—a computer “bit” on a quantum scale. The idea suggests the universe is built up from some underlying code, and that by cracking this code, physicists will finally have a way to understand the quantum nature of large-scale events in the cosmos. The most recent It from Qubit (IfQ) meeting was held in July at the Perimeter Institute for Theoretical Physics in Ontario, where organizers were expecting about 90 registrants. Instead, they got so many applications they had to expand to take 200 and simultaneously run five satellite sessions at other universities where scientists could participate remotely. “I think this is one of the most, if not the most, promising avenues of research toward pursuing quantum gravity,” says Netta Engelhardt, a postdoctoral researcher at Princeton University who is not officially involved in It from Qubit but who has attended some of its meetings. “It’s just taking off.”
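Since the qubit is the star of the show, here is a minimal sketch of what one looks like in code (our own illustration of the textbook definition, not anything from the collaboration): a qubit is just a normalized pair of complex amplitudes, and measuring it yields an ordinary classical bit with probabilities given by the Born rule.

```
import numpy as np

rng = np.random.default_rng(42)

# A qubit: two complex amplitudes a|0> + b|1>, normalized so |a|^2 + |b|^2 = 1.
amps = rng.normal(size=2) + 1j * rng.normal(size=2)
qubit = amps / np.linalg.norm(amps)

# Born rule: the probabilities of reading out 0 or 1.
p0, p1 = np.abs(qubit) ** 2
print(f"P(0) = {p0:.3f}, P(1) = {p1:.3f}")

# Measurement collapses the superposition to a definite classical bit.
bit = rng.choice([0, 1], p=[p0, p1])
print("measured bit:", bit)
```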



Because the project involves both the science of quantum computers and the study of spacetime and general relativity, it brings together two groups of researchers who do not usually collaborate: quantum information scientists on one hand and high-energy physicists and string theorists on the other. “It marries together two traditionally different fields: how information is stored in quantum things and how information is stored in space and time,” says Vijay Balasubramanian, a physicist at the University of Pennsylvania who is an IfQ principal investigator.

About a year ago the Simons Foundation, a private organization that supports science and mathematics research, awarded a grant to found the It from Qubit collaboration and finance physicists to study and hold meetings on the subject. Since then excitement has grown and successive meetings have drawn in more and more researchers, some official members of the collaboration funded by Simons and many others simply interested in the topic.



“The project is addressing very important questions, but very difficult questions,” says IfQ collaborator Beni Yoshida, a postdoctoral researcher at Perimeter. “Collaboration is necessary—it’s not like a single person can solve this problem.” Even scientists outside of the project have taken notice. “If the link with quantum information theory proves as successful as some anticipate, it could very well spark the next revolution in our understanding of space and time,” says string theorist Brian Greene of Columbia University, who is not involved in IfQ. “That’s a big deal and hugely exciting.”
Source: Clara Moskowitz

YOUR INPUT IS MUCH APPRECIATED! LEAVE YOUR COMMENT BELOW.

Thursday, 3 November 2016

Imagining Humans on Mars

Billionaire Elon Musk hopes to build self-sustaining human colonies on Mars. How have science fiction writers imagined the possible role of Mars in humanity’s future?



Last month in Guadalajara, Mexico, Elon Musk, a former graduate student in physics and founder of the space exploration company SpaceX, made headlines with a daring plan: to put human beings on Mars as early as 2024. Musk’s vision is not of a small handful of astronauts or astrotourists taking a short walk on Mars’s surface. Rather, he sees Mars as the future home of a self-sustaining human colony.

At times Musk’s presentation to the International Astronautical Congress (IAC) felt like the opening scene of a science fiction movie—a comparison that he would probably not dislike. Indeed, SpaceX’s work has been littered with references to science fiction. The company’s Falcon 9 rocket is a nod to the Millennium Falcon from Star Wars. And at the IAC, Musk suggested that the first SpaceX ship to Mars might be named Heart of Gold, after the ship in Douglas Adams’s Hitchhiker’s Guide to the Galaxy.

Those works, however, imagine travel between stars and planets far from the Milky Way. How has science fiction envisioned space exploration closer to home, especially as advances in spaceflight have made human travel to Mars seem almost within our reach?



Mars as utopia and refuge
In the late 19th century, astronomical observations of Mars led to intense speculation about whether its surface might harbor life. American astronomer Percival Lowell even built an observatory in Arizona to get a closer look at what he believed were artificially constructed Martian canals. Although most astronomers agreed that there was little evidence for life on Mars, the idea of a Martian race quickly took hold in fiction.

Many of the first science fiction novels about Mars described travelers to the Red Planet who encountered not merely life forms, but utopian civilizations. In A Plunge into Space (1890), Irishman Robert Cromie envisions a Martian society in which air travel is common and society has evolved beyond the need for politicians. In their novel Unveiling a Parallel (1893), Iowa feminists Alice Jones and Ella Merchant send their protagonist to two egalitarian societies on Mars—one in which women and men are equally promiscuous and violent and another in which equality of the sexes has resulted in a scientifically and philosophically advanced utopia.

A spherical steel spacecraft transports people to a Red Planet utopia in A Plunge into Space (1890). War-waging Martians attack Earth in The War of the Worlds (1898).



The most famous English-language science fiction novel of the late 19th century, H. G. Wells’s The War of the Worlds (1898), imagines a significantly less peaceful encounter between humans and Martians. The book’s unnamed English narrator first reads about possible activity on Mars in Nature. Weeks later, he finds himself fleeing for his life as tentacled Martians kill or imprison his fellow humans. Only a humble bacterial infection saves humanity from total defeat. At the end of the novel, the narrator muses that humans, too, might travel beyond their planet one day—but they will have to contend with the surviving Martians if they do:

If the Martians can reach Venus, there is no reason to suppose that the thing is impossible for men, and when the slow cooling of the sun makes this earth uninhabitable, as at last it must do, it may be that the thread of life that has begun here will have streamed out. . . . It may be, on the other hand, that the destruction of the Martians is only a reprieve. To them, and not to us, perhaps, is the future ordained.

Wells’s vision of humans traveling to other planets to escape a crisis on Earth became a staple of mid-20th-century science fiction. Once again, Mars proved a popular destination. In several novels, including Red Planet (1949), avowedly libertarian author Robert Heinlein imagines political discontent as a motivation for Martian settlement—and for eventual rebellion from Earth authorities.



Ray Bradbury’s celebrated short story collection The Martian Chronicles (1950) describes humanity fleeing a coming nuclear war and encountering telepathic Martians. The collection also deals with life on war-torn Earth; perhaps the most famous story in The Martian Chronicles is “There Will Come Soft Rains,” which tells of a mechanized California house carrying on its work after its occupants die in a nuclear blast. Nuclear disaster also prompts the creation of Martian colonies in Philip K. Dick’s novel Do Androids Dream of Electric Sheep? (1968). However, the colonists—and the androids they build—find Mars so desolate that many sneak back to Earth illegally.

The scientific challenges of Mars
Advances in space science during and after the Cold War have seemed to bring a human landing on Mars closer and closer to reality. In the 1960s NASA’s Mariner program successfully executed a series of Mars flybys that garnered the agency increasingly clear images of the planet. In 1976 NASA successfully landed Viking 1 and Viking 2 on Mars’s surface. Further missions to Mars have only become more ambitious. The celebrated Mars rovers, including Spirit, Opportunity, and Curiosity, have mapped Mars’s surface, sampled its soil and rocks, and helped scientists assess whether the planet was habitable in the past.

Viking 1 and Curiosity have provided scientists and sci-fi writers with striking views of the real Red Planet.

As scientists learn more about Mars’s geology and atmosphere, novelists have increasingly focused on the scientific and technological requirements for putting humans on the planet. Some of the most celebrated contemporary authors of Martian fiction have backgrounds in science or engineering. The most popular example is The Martian (2011), the debut novel by computer engineer Andy Weir, which chronicles the struggles and adventures of an astronaut accidentally left for dead on Mars’s surface.



The Martian focuses on living on Mars as it is. In contrast, Kim Stanley Robinson’s acclaimed Mars Trilogy imagines a centuries-long scientific effort to terraform the planet to support human life. Red Mars (1993), the first book in the trilogy, focuses on the scientific challenges of remaking an entire planet, from the geological to the psychological. Scientific problems are not the only ones Robinson’s characters face: Conflict soon arises over whether Mars should be terraformed at all or whether it should be preserved in its natural desert state.



Science fiction has the unique ability to pose “what-if” questions about scientific and technological developments and to reflect on how those developments might affect human society. Each wave of Mars fiction has evolved alongside new knowledge of the Red Planet and has also raised questions related to the social concerns and political controversies of the day. Nineteenth-century Martian novels also served as commentaries on Victorian culture. In 20th-century fiction, colonies on Mars function as both a bastion of hope for humanity’s future and a sign that something has gone terribly wrong on Earth. Dick and Bradbury were implicitly condemning the casual use of nuclear weapons when they imagined humans fleeing to another planet. Robinson’s work explores the possible effect humans might have on Mars while reminding the reader that humans are already altering Earth.

Musk’s plan for colonizing Mars within the next century is reminiscent of many of the novels about humans living on the Red Planet. In a recent interview, Robinson suggested that Musk modernize his sci-fi-inspired vision: “Musk’s science fiction story needs some updating, some real imagination using current findings from biology and ecology.” Perhaps as SpaceX hones its Mars settlement plan Musk should reread the Mars Trilogy along with The Martian, which depicted a human living on Mars with 21st-century technology.



If Musk’s ambitious plan succeeds, new takes on Mars in science fiction novels and movies will likely arise—along with new questions about what the achievement will mean for humanity.
Source: Melinda Baldwin, Books editor at Physics Today.

YOUR INPUT IS MUCH APPRECIATED! LEAVE YOUR COMMENT BELOW.

Thursday, 13 October 2016

Beam me up, Scotty!

Quantum teleportation of a particle of light across six kilometers


Distance record set for teleporting a photon over a fiber network.



What if you could behave like the crew on the Starship Enterprise and teleport yourself home or anywhere else in the world? As a human, you're probably not going to realize this any time soon; if you're a photon, you might want to keep reading.

Through a collaboration between the University of Calgary, The City of Calgary and researchers in the United States, a group of physicists led by Wolfgang Tittel, professor in the Department of Physics and Astronomy at the University of Calgary, has successfully demonstrated teleportation of a photon (an elementary particle of light) over a straight-line distance of six kilometers using The City of Calgary's fibre optic cable infrastructure. The project began with an Urban Alliance seed grant in 2014.



This accomplishment, which set a new record for distance of transferring a quantum state by teleportation, has landed the researchers a spot in the journal Nature Photonics. The finding was published back-to-back with a similar demonstration by a group of Chinese researchers.

"Such a network will enable secure communication without having to worry about eavesdropping, and allow distant quantum computers to connect," says Tittel.
Experiment draws on 'spooky action at a distance'

The experiment is based on the entanglement property of quantum mechanics, also known as "spooky action at a distance" -- a property so mysterious that not even Einstein could come to terms with it.

"Being entangled means that the two photons that form an entangled pair have properties that are linked regardless of how far the two are separated," explains Tittel. "When one of the photons was sent over to City Hall, it remained entangled with the photon that stayed at the University of Calgary."

Next, the photon whose state was to be teleported to the university was generated at a third location in Calgary and then also travelled to City Hall, where it met the photon that was part of the entangled pair.

"What happened is the instantaneous and disembodied transfer of the photon's quantum state onto the remaining photon of the entangled pair, which is the one that remained six kilometers away at the university," says Tittel.



City's accessible dark fibre makes research possible
The research would not have been possible without access to the proper technology. One of the critical pieces of infrastructure supporting quantum networking is accessible dark fibre. Dark fibre, so named because no other light or data traffic runs through it, is a dedicated optical cable with no electronics or network equipment along the line, so it doesn't interfere with quantum technology.

The City of Calgary is building and provisioning dark fibre to enable next-generation municipal services today and for the future.

"By opening The City's dark fibre infrastructure to the private and public sector, non-profit companies, and academia, we help enable the development of projects like quantum encryption and create opportunities for further research, innovation and economic growth in Calgary," said Tyler Andruschak, project manager with Innovation and Collaboration at The City of Calgary.

"The university receives secure access to a small portion of our fibre optic infrastructure and The City may benefit in the future by leveraging the secure encryption keys generated out of the lab's research to protect our critical infrastructure," said Andruschak. In order to deliver next-generation services to Calgarians, The City has been increasing its fibre optic footprint, connecting all City buildings, facilities and assets.

Timed to within one millionth of one millionth of a second
As if teleporting a photon wasn't challenging enough, Tittel and his team encountered a number of other roadblocks along the way.

Due to changes in the outdoor temperature, the transmission time of photons from their creation point to City Hall varied over the course of a day -- the time it took the researchers to gather sufficient data to support their claim. This variation meant that the two photons would not arrive at City Hall at the same time.



"The challenge was to keep the photons' arrival time synchronized to within 10 pico-seconds," says Tittel. "That is one trillionth, or one millionth of one millionth of a second."

Second, parts of the lab had to be moved to two locations in the city. As Tittel explains, this was particularly tricky for the measurement station at City Hall, which included state-of-the-art superconducting single-photon detectors developed by the National Institute of Standards and Technology and NASA's Jet Propulsion Laboratory.

"Since these detectors only work at temperatures less than one degree above absolute zero the equipment also included a compact cryostat," said Tittel.

Milestone towards a global quantum Internet
This demonstration is arguably one of the most striking manifestations of a puzzling prediction of quantum mechanics, but it also opens the path to building a future quantum internet, the long-term goal of the Tittel group.

Source: University of Calgary. Original written by Drew Scherban.

YOUR INPUT IS MUCH APPRECIATED! LEAVE YOUR COMMENT BELOW.

 