
Friday, April 14, 2017

Flat Earth Theory: Does it have a scientific leg to stand on?

By: Alexandria Addesso

Although the Greek philosopher Aristotle explained in his writings in the 300s BCE that the Earth was spherical and not flat, most of the world did not come to agree on this rationale until the 1700s. Yet lately a growing segment of people holds the belief that the world is flat. Given that scientific findings are constantly in flux, are there any grounds for this non-spherical belief?

The most common flat-Earth theory states that the Earth is a disc, with the Arctic Circle at its center and Antarctica around the rim in the form of a 150-foot-tall ice wall. The model was popularized by Orlando Ferguson, a real estate developer, in 1893. While the Earth is held to be a disc, the sun and moon are believed to be spheres 32 miles (51 kilometers) wide that move in circles 3,000 miles (4,828 km) above the plane of the Earth, illuminating different portions of the Earth in a 24-hour cycle and thus explaining day and night. Yet this theory comes with no experimental evidence to back it up.



On the Flat Earth Society’s website there is a page listing simple experiments, performed by seven different people, that purport to support the theory. Most have to do with the lack of a “bulge” when looking out over 30 miles of ocean with a telescope.

“IF the earth is a globe, and is 24,900 English statute miles in circumference, the surface of all standing water must have a certain degree of convexity--every part must be an arc of a circle,” said Tom Bishop, a flat Earth believer cited on the Flat Earth Society website. “From the summit of any such arc there will exist a curvature or declination of 8 inches in the first statute mile. In the second mile the fall will be 32 inches; in the third mile, 72 inches, or 6 feet, as shown in this chart. Ergo; looking at the opposite beach 30 miles away there should be a bulge of water over 600 feet tall blocking my view. There isn't.”
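Bishop's figures follow the standard "8 inches times distance squared" approximation for how far a sphere of Earth's size drops below a tangent line. Here is a quick check of that arithmetic (ours, not the Society's), along with the factor the argument leaves out:

```python
import math

# Back-of-envelope check of the "8 inches per mile squared" rule quoted
# above, compared with the exact geometry for a sphere of Earth's radius.
EARTH_RADIUS_MI = 3959.0   # mean Earth radius, statute miles
INCHES_PER_MILE = 63360

def rule_of_thumb_drop(d_miles: float) -> float:
    """Bishop's rule: drop below the tangent line, in inches."""
    return 8.0 * d_miles ** 2

def geometric_drop(d_miles: float) -> float:
    """Exact drop for a sphere of Earth's radius, in inches."""
    theta = d_miles / EARTH_RADIUS_MI
    return EARTH_RADIUS_MI * (1.0 - math.cos(theta)) * INCHES_PER_MILE

for d in (1, 2, 3, 30):
    print(f"{d:>2} mi: rule {rule_of_thumb_drop(d):7.0f} in, "
          f"exact {geometric_drop(d):7.0f} in")
# 30 mi -> ~7200 in (600 ft), matching the quote. What the argument
# omits: observer height and atmospheric refraction raise the visible
# horizon, so a telescope even a few feet above the water sees targets
# that a naive "600-foot bulge" calculation says should be hidden.
```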



PBS NewsHour’s website also published an article several years ago describing seven DIY experiments that can be done to show the Earth is indeed spherical. One of them directly challenges Bishop’s reasoning: watch the sunset while lying on your back on a Pacific beach. Once the sun’s rays have completely disappeared, simply hop up and stand on your feet, and you will see them again, evidence of the Earth’s curvature.

Everything is to be questioned, even those questioning what is accepted.

Friday, April 7, 2017

Does the Universe have a Rest Frame?

Experiment aims at resolving divergence between special relativity and standard model of cosmology



Physics is sometimes closer to philosophy when it comes to understanding the universe. Donald Chang from Hong Kong University of Science and Technology, China, attempts to elucidate whether the universe has a resting frame. The results have recently been published in EPJ Plus.

To answer this tricky question, he has developed an experiment to measure particle mass precisely. It is designed to test the special theory of relativity, which assumes the absence of a rest frame -- otherwise it would be possible to determine which inertial frame is stationary and which is moving. That assumption, however, appears to diverge from the standard model of cosmology, which holds that what we see as a vacuum is not empty space: the energy of our universe is taken to come from quantum fluctuations in the vacuum.

In a famous experiment conducted by Michelson and Morley in the late 19th century, the propagation of light was shown to be independent of the motion of the laboratory frame. Einstein, in his Special Theory of Relativity, inferred that the physical laws governing the propagation of light are equivalent in all inertial frames -- a principle later extended to all physical laws, not just those of optics.



In this study, the author sets out to precisely measure the masses of two charged particles moving in opposite directions. Conventional thinking assumes that the same inertial frame applies equally to both particles, in which case no detectable mass difference should arise. If the contrary is true, however, and there is a rest frame in the universe, the author expects to see a mass difference that depends on the orientation of the laboratory frame. The proposed experiment, partially inspired by the Michelson-Morley experiment, can be conducted using existing experimental techniques. For simplicity, an electron can be used as the charged particle.
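To get a feel for the size of such an effect, here is a toy calculation (ours, not Chang's actual model): suppose, purely hypothetically, that the rest frame coincides with the cosmic microwave background frame, through which the solar system moves at roughly 370 km/s, and compare the relativistic masses of two counter-propagating electrons.

```python
import math

C = 299_792_458.0          # speed of light, m/s
M_E = 9.109_383_7015e-31   # electron rest mass, kg

def gamma(v: float) -> float:
    """Lorentz factor."""
    return 1.0 / math.sqrt(1.0 - (v / C) ** 2)

def add_velocities(v: float, u: float) -> float:
    """Relativistic velocity addition."""
    return (v + u) / (1.0 + v * u / C**2)

# Hypothetical rest-frame drift: the CMB dipole speed of the solar
# system (~370 km/s) stands in for a universal rest frame.
u = 370e3
v_lab = 0.5 * C   # electron speed in the laboratory (illustrative)

# Speeds of the two counter-propagating electrons in the putative rest frame
v_with = add_velocities(v_lab, u)
v_against = add_velocities(v_lab, -u)

m_with = gamma(v_with) * M_E
m_against = gamma(v_against) * M_E
print(f"fractional mass difference: {(m_with - m_against) / M_E:.3e}")
# ~1.4e-3 of the electron rest mass under these toy assumptions --
# the actual predicted signal depends on Chang's specific model, but
# this shows the kind of orientation-dependent asymmetry being sought.
```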

Story Source: Materials provided by Springer.
Journal Reference: Donald C. Chang, "Is there a resting frame in the universe? A proposed experimental test based on a precise measurement of particle mass," The European Physical Journal Plus.

YOUR INPUT IS MUCH APPRECIATED! LEAVE YOUR COMMENT BELOW.

Wednesday, March 29, 2017

How Small Can Superconductors Be?

Topographic image of a lead nanocrystal used in the study. Scale bar: 10 nm. Credit: Vlaic et al. Nature Communications

For the first time, physicists have experimentally validated a 1959 conjecture that places limits on how small superconductors can be. Understanding superconductivity (or the lack thereof) on the nanoscale is expected to be important for designing future quantum computers, among other applications.

In 1959, physicist P.W. Anderson conjectured that superconductivity can exist only in objects that are large enough to meet certain criteria. Namely, the object's superconducting gap energy must be larger than its electronic energy level spacing—and this spacing increases as size decreases. The cutoff point (where the two values are equal) corresponds to a volume of about 100 nm3. Until now it has not been possible to experimentally test the Anderson limit due to the challenges in observing superconducting effects at this scale.
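For a sense of scale, here is a rough free-electron estimate (ours, not from the paper) of where that cutoff lands for lead:

```python
# Superconductivity should vanish when the mean electronic level
# spacing delta = 1/(nu * V) exceeds the superconducting gap Delta,
# where nu is the density of states at the Fermi level.

DELTA_PB_EV = 1.35e-3   # superconducting gap of bulk lead, ~1.35 meV
NU_PB = 20.0            # assumed free-electron DOS for Pb, states/(eV * nm^3)

v_star = 1.0 / (NU_PB * DELTA_PB_EV)   # volume where delta == Delta
print(f"Anderson-limit volume for Pb: ~{v_star:.0f} nm^3")
# ~40 nm^3 from this crude estimate -- the same order of magnitude as
# the ~100 nm^3 cutoff quoted above; factors of 2 in the level-spacing
# convention easily account for the difference.
```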

In the new study published in Nature Communications, Sergio Vlaic and coauthors at the University Paris Sciences et Lettres and the French National Centre for Scientific Research (CNRS) designed a nanosystem that allowed them to experimentally investigate the Anderson limit for the first time.

The Anderson limit arises because, at very small scales, the mechanisms underlying superconductivity essentially stop working. In general, superconductivity occurs when electrons bind together to form Cooper pairs. Cooper pairs have a slightly lower energy than individual electrons, and this difference in energy is the superconducting gap energy. The Cooper pairs' lower energy inhibits electron collisions that normally create resistance. If the superconducting gap energy gets too small and vanishes—which can occur, for example, when the temperature increases—then the electron collisions resume and the object stops being a superconductor.

The Anderson limit shows that small size is another way that an object may stop being a superconductor. However, unlike the effect of increasing the temperature, this is not because smaller objects have a smaller superconducting gap energy. Instead, it arises because smaller crystals have fewer electrons, and therefore fewer electron energy levels, than larger crystals do. Since the overall range of electron energies is set by the element and does not depend on size, smaller crystals must spread fewer levels across the same range, giving larger spacings between their electron energy levels.



According to Anderson, this large electronic energy level spacing should pose a problem, and he expected superconductivity to disappear once the spacing becomes larger than the superconducting gap energy. The reason, generally speaking, is that one consequence of increased spacing is a decrease in potential energy, which upsets the competition between kinetic and potential energy that superconductivity requires.

To investigate what happens to the superconductivity of objects around the Anderson limit, the scientists in the new study prepared large quantities of isolated lead nanocrystals ranging in volume from 20 to 800 nm3.

Although they could not directly measure the superconductivity of such tiny objects, the researchers could measure something called the parity effect, which results from superconductivity. When an electron is added to a superconductor, the energy cost depends in part on whether the superconductor holds an even or odd number of electrons (the parity), a consequence of the electrons forming Cooper pairs. If the electrons don't form Cooper pairs, there is no parity effect, indicating no superconductivity.

Although the parity effect has previously been observed in large superconductors, this study is the first time that it has been observed in small nanocrystals approaching the Anderson limit. In accordance with Anderson's predictions from more than 50 years ago, the researchers observed the parity effect for larger nanocrystals, but not for the smallest nanocrystals below approximately 100 nm3.



The results not only validate the Anderson conjecture, but also bear on a more general class of theories, the Richardson-Gaudin models. For very small objects, these models are equivalent to the conventional theory of superconductivity, the Bardeen-Cooper-Schrieffer (BCS) theory.

"Our experimental demonstration of the Anderson conjecture is also a demonstration of the validity of the Richardson-Gaudin models," coauthor Hervé Aubin at the ‘University Paris Sciences et Lettres’ and CNRS told Phys.org. "The Richardson-Gaudin models are an important piece of theoretical works because they can be solved exactly and apply to a wide range of systems; not only to superconducting nanocrystals but also to atomic nuclei and cold fermionic atomic gas, where protons and neutrons, which are fermions like electrons, can also form Cooper pairs."

On the more practical side, the researchers expect the results to have applications in future quantum computers.

"One of the most interesting applications of superconducting islands is their use as Cooper pair boxes employed in quantum bits, the elemental unit of a hypothetical quantum computer," Aubin said. "So far, Cooper pair boxes, used in qubits, are much larger than the Anderson limit. Upon reducing the size of the Cooper pair box, quantum computer engineers will eventually have to cope with superconductivity at the Anderson limit."
Source: Lisa Zyga, Phys.org. Journal: Nature Communications

YOUR INPUT IS MUCH APPRECIATED! LEAVE YOUR COMMENT BELOW.

Tuesday, March 21, 2017

NASA to launch Cold Atom Lab in Space

Free falling: NASA is putting ultracold atoms in space

A laboratory for cooling an atomic gas to just a billionth of a degree above absolute zero will soon be sent up to the International Space Station (ISS) by physicists working at NASA's Jet Propulsion Laboratory. The goal of the Cold Atom Lab (CAL) mission is to create long-lived Bose–Einstein condensates (BECs) that could lead to better sensors and atomic clocks for use on spacecraft. The BECs could even provide important insights into the nature of dark energy, according to the researchers.

First created in 1995, a BEC is made by trapping and cooling an atomic gas to an extremely low temperature so the atoms fall into the same low-energy quantum state. Instead of behaving like a collection of individual atoms, a BEC is essentially a large quantum object. This makes it very sensitive to disturbances such as stray magnetic fields and accelerations, and therefore BECs can be used to create extremely good sensors.

Falling down
Here on Earth, gravity puts an upper limit on the lifetime of a BEC – the atoms fall down and after a fraction of a second the BEC has dropped out of view of the experiment. In the microgravity environment of the ISS, however, NASA's Robert Thompson and colleagues reckon that their BECs should be observable for 5–10 s. As well as allowing physicists to make more precise measurements of the quantum properties of BECs, the longer lifetime should also make the BECs better sensors. With further development, the team believes that BECs in space could endure for hundreds of seconds.
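That "fraction of a second" is just free-fall kinematics. A minimal sketch, assuming (illustratively) the cloud must stay within about a centimetre of the imaging region:

```python
import math

G = 9.81  # gravitational acceleration at Earth's surface, m/s^2

def fall_time(drop_m: float) -> float:
    """Time for a released atom cloud to fall a distance drop_m."""
    return math.sqrt(2.0 * drop_m / G)

print(f"observation window on Earth: {fall_time(0.01) * 1e3:.0f} ms")
# ~45 ms to fall 1 cm -- the "fraction of a second" quoted above.
# In microgravity the cloud doesn't fall at all, so the window is set
# instead by the slow expansion of the ultracold gas, hence the 5-10 s
# estimate for the ISS.
```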

Five scientific teams will do experiments using Cold Atom Lab, including one led by Eric Cornell of the University of Colorado – who shared the 2001 Nobel Prize for Physics for creating the first BECs.



As well as creating BECs, CAL will also cool fermionic atoms to create degenerate Fermi gases. These systems can be made to mimic the behaviour of electrons in solids and could provide important insights into phenomena such as superconductivity. Physicists will also study ultracold mixtures of bosonic and fermionic atoms. Other planned experiments include atom interferometry and very precise measurements of gravity itself.

Pervasive forces
"Studying these hyper-cold atoms could reshape our understanding of matter and the fundamental nature of gravity," says Thompson. "The experiments we'll do with the Cold Atom Lab will give us insight into gravity and dark energy – some of the most pervasive forces in the universe."



CAL will be contained within a package about the size of an "ice box". This will contain a vacuum chamber, lasers and electronics. It will also include an electromagnetic "knife", which will be used to cool the atoms. The lab is currently in the final stages of assembly and will be launched in August on SpaceX's CRS-12 resupply mission.
Author
Hamish Johnston is editor of physicsworld.com

YOUR INPUT IS MUCH APPRECIATED! LEAVE YOUR COMMENT BELOW.

Friday, March 10, 2017

Scientists reach Back in Time to Discover some of the most Power-packed Galaxies

In the heart of an active galaxy, matter falling toward a supermassive black hole generates jets of particles traveling near the speed of light. Credit: NASA's Goddard Space Flight Center Scientific Visualization Studio

When the universe was young, a supermassive black hole heaved out a jet of particle-infused energy that raced through space at nearly the speed of light. Billions of years later, scientists have identified this black hole and four others similar to it that range in age from 1.4 billion to 1.9 billion years old.



When the universe was young, a supermassive black hole -- bloated to the bursting point with stupendous power -- heaved out a jet of particle-infused energy that raced through the vastness of space at nearly the speed of light.

Billions of years later, a trio of Clemson University scientists, led by College of Science astrophysicist Marco Ajello, has identified this black hole and four others similar to it that range in age from 1.4 billion to 1.9 billion years old. These objects emit copious gamma rays, the highest-energy form of light, billions of times more energetic than the light visible to the human eye.
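The "billions of times" claim is simple photon arithmetic. A one-line check, assuming (our illustrative number) a typical Fermi-detected gamma-ray energy of 10 GeV:

```python
# Compare a Fermi-band gamma-ray photon with a visible-light photon.
EV_VISIBLE = 2.0    # green photon, ~2 eV
EV_GAMMA = 10e9     # 10 GeV, an assumed typical blazar gamma-ray energy

print(f"energy ratio: {EV_GAMMA / EV_VISIBLE:.0e}")
# 5e9 -- indeed billions of times more energetic than visible light.
```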

The previously known earliest gamma-ray blazars -- a type of galaxy whose intense emission is powered by extremely powerful relativistic jets launched by monstrous black holes -- were more than 2 billion years old. Currently, the universe is estimated to be approximately 14 billion years old.



"The discovery of these supermassive black holes, which launch jets that emit more energy in one second than our sun will produce in its entire lifetime, was the culmination of a yearlong research project," said Ajello, who has spent much of his career studying the evolution of distant galaxies. "Our next step is to increase our understanding of the mechanisms involved in the formation, development and activities of these amazing objects, which are the most powerful accelerators in the universe. We can't even come close to replicating such massive outputs of energy in our laboratories. The complexities we're attempting to unravel seem almost as mysterious as the black holes themselves."

Ajello conducted his research in conjunction with Clemson postdoc Vaidehi Paliya and Ph.D. candidate Lea Marcotulli. The trio worked closely with the Fermi Large Area Telescope collaboration, an international team of scientists that includes Roopesh Ojha, an astronomer at NASA's Goddard Space Flight Center in Greenbelt, Maryland, and Dario Gasparrini of the Italian Space Agency.

The Clemson team's breakthroughs were made possible by recently juiced-up software on NASA's Fermi Gamma-ray Space Telescope, which significantly boosted the orbiting telescope's sensitivity to a level that made these latest discoveries possible.



"People are calling it the cheapest refurbishment in history," Ajello said. "Normally, for the Hubble Space Telescope, NASA had to send someone up to space to physically make these kinds of improvements. But in this case, they were able to do it remotely from an Earth-bound location. And of equal importance, the improvements were retroactive, which meant that the previous six years of data were also entirely reprocessed. This helped provide us with the information we needed to complete the first step of our research and also to strive onward in the learning process."

Using Fermi data, Ajello and Paliya began with a catalog of 1.4 million quasars, which are galaxies that harbor active supermassive black holes at their centers. Over the course of a year, they narrowed their search to 1,100 objects. Of these, five were finally determined to be newly discovered gamma-ray blazars that were the farthest away -- and youngest -- ever identified.

"After using our filters and other devices, we were left with about 1,100 sources. And then we did the diagnostics for all of these and were able to narrow them down to 25 to 30 sources," Paliya said. "But we still had to confirm that what we had detected was scientifically authentic. So we performed a number of other simulations and were able to derive properties such as black hole mass and jet power. Ultimately, we confirmed that these five sources were guaranteed to be gamma-ray blazars, with the farthest one being about 1.4 billion years old from the beginning of time."



Marcotulli, who joined Ajello's group as a Ph.D. student in 2016, has been studying the blazars' mechanisms using images and data delivered from another orbiting NASA telescope, the Nuclear Spectroscopic Telescope Array (NuSTAR). At first, Marcotulli's role was to understand the emission mechanism of gamma-ray blazars closer to us. Now she is turning her attention toward the most distant objects in a quest to understand what makes them so powerful.

"We're trying to understand the full spectrum of the energy distribution of these objects by using physical models," Marcotulli said. "We are currently able to model what's happening far more accurately than previously devised, and eventually we'll be able to better understand what processes are occurring in the jets and which particles are radiating all the energy that we see. Are they electrons? or protons? How are they interacting with surrounding photons? All these parameters are not fully understood right now. But every day we are deepening our understanding."



All galaxies have black holes at their centers -- some actively feeding on the matter surrounding them, others lying relatively dormant. Our own galaxy has at its center a super-sized black hole that is currently dormant. Ajello said that only one of every 10 black holes in today's universe is active. But when the universe was much younger, it was closer to a 50-50 ratio.

The supermassive black holes at the centers of the five newly discovered blazar galaxies are among the most massive black holes ever observed, on the order of hundreds of thousands to billions of times the mass of our own sun. And their accompanying accretion disks -- rotating swirls of matter that orbit the black holes -- emit more than two trillion times the energy output of our sun.

One of the most surprising elements of Ajello's research is how quickly -- by cosmic measures -- these supersized black holes must have grown in only 1.4 billion years. In terms of our current knowledge of how black holes grow, 1.4 billion years is barely enough time for a black hole to reach the mass of the ones discovered by Ajello's team.
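A rough way to see why it is "barely enough time" (our back-of-envelope, not the team's calculation): a black hole feeding at the Eddington limit grows exponentially, M(t) = M0 * exp(t / tau), with the Salpeter timescale tau of roughly 45 million years for 10% radiative efficiency.

```python
import math

TAU_SALPETER_MYR = 45.0  # e-folding time for Eddington-limited growth

def growth_time_myr(m_seed: float, m_final: float) -> float:
    """Time (Myr) to grow from m_seed to m_final at the Eddington limit."""
    return TAU_SALPETER_MYR * math.log(m_final / m_seed)

# Assumed numbers: a 100-solar-mass seed growing to a billion solar masses.
t = growth_time_myr(1e2, 1e9)
print(f"time to grow: {t:.0f} Myr (~{t / 1e3:.2f} Gyr)")
# ~725 Myr of continuous Eddington-limited feeding -- possible within
# 1.4 Gyr, but only just, which is why these early monsters are a puzzle.
```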

"How did these incomprehensibly enormous and energy-laden black holes form so quickly?" Ajello said. "Is it because one black hole ate a lot all the time for a very long time? Or maybe because it bumped into other black holes and merged into one? To be honest, we have no observations supporting either argument. There are mechanisms at work that we have yet to unravel. Puzzles that we have yet to solve. When we do eventually solve them, we will learn amazing things about how the universe was born, how it grew into what it has become, and what the distant future might hold as the universe continues to progress toward old age."
Source: Materials provided by Clemson University

YOUR INPUT IS MUCH APPRECIATED! LEAVE YOUR COMMENT BELOW.

Friday, March 3, 2017

Scientists reveal new Super-Fast form of Computer that 'Grows as it Computes'

DNA double helix. Credit: public domain

Researchers from The University of Manchester have shown it is possible to build a new super-fast form of computer that "grows as it computes".

Professor Ross D King and his team have demonstrated for the first time the feasibility of engineering a nondeterministic universal Turing machine (NUTM), and their research is to be published in the prestigious Journal of the Royal Society Interface.

The theoretical properties of such a computing machine, including its exponential boost in speed over electronic and quantum computers, have been well understood for many years – but the Manchester breakthrough demonstrates that it is actually possible to physically create a NUTM using DNA molecules.

"Imagine a computer is searching a maze and comes to a choice point, one path leading left, the other right," explained Professor King, from Manchester's School of Computer Science. "Electronic computers need to choose which path to follow first.



"But our new computer doesn't need to choose, for it can replicate itself and follow both paths at the same time, thus finding the answer faster.

"This 'magical' property is possible because the computer's processors are made of DNA rather than silicon chips. All electronic computers have a fixed number of chips.
"Our computer's ability to grow as it computes makes it faster than any other form of computer, and enables the solution of many computational problems previously considered impossible.
"Quantum computers are an exciting other form of computer, and they can also follow both paths in a maze, but only if the maze has certain symmetries, which greatly limits their use.

"As DNA molecules are very small a desktop computer could potentially utilize more processors than all the electronic computers in the world combined - and therefore outperform the world's current fastest supercomputer, while consuming a tiny fraction of its energy."



The University of Manchester is famous for its connection with Alan Turing - the founder of computer science - and for creating the first stored-program electronic computer.

"This new research builds on both these pioneering foundations," added Professor King.
Alan Turing's greatest achievement was inventing the concept of a universal Turing machine (UTM) - a computer that can be programmed to compute anything any other computer can compute. Electronic computers are a form of UTM, but no quantum UTM has yet been built.

DNA computing is the performing of computations using biological molecules rather than traditional silicon chips. In DNA computing, information is represented using the four-character genetic alphabet - A [adenine], G [guanine], C [cytosine], and T [thymine] - rather than the binary alphabet, which is a series of 1s and 0s used by traditional computers.
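As a concrete, purely illustrative example of that mapping (ours, not the paper's encoding scheme), two bits can be packed into each nucleotide:

```python
# Illustrative binary-to-DNA encoding: two bits per base.
BITS_TO_BASE = {"00": "A", "01": "C", "10": "G", "11": "T"}
BASE_TO_BITS = {base: bits for bits, base in BITS_TO_BASE.items()}

def encode(bits: str) -> str:
    """Encode a bit string (even length) as a DNA strand."""
    assert len(bits) % 2 == 0, "pad to an even number of bits"
    return "".join(BITS_TO_BASE[bits[i:i + 2]] for i in range(0, len(bits), 2))

def decode(dna: str) -> str:
    """Recover the bit string from a DNA strand."""
    return "".join(BASE_TO_BITS[base] for base in dna)

word = "0110110001"  # hypothetical 10-bit payload
strand = encode(word)
print(strand)                  # CGTAC
print(decode(strand) == word)  # True
```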

Provided by: University of Manchester

YOUR INPUT IS MUCH APPRECIATED! LEAVE YOUR COMMENT BELOW.

Tuesday, February 21, 2017

The Future is here: Terahertz chips a New Way of Seeing through Matter

Princeton University researchers have drastically shrunk the equipment for producing terahertz -- important electromagnetic pulses lasting one millionth of a millionth of a second -- from a tabletop setup with lasers and mirrors to a pair of microchips.

Electromagnetic pulses lasting one millionth of a millionth of a second may hold the key to advances in medical imaging, communications and drug development. But the pulses, called terahertz waves, have long required elaborate and expensive equipment to use.

Now, researchers at Princeton University have drastically shrunk much of that equipment: moving from a tabletop setup with lasers and mirrors to a pair of microchips small enough to fit on a fingertip.

In two articles recently published in the IEEE Journal of Solid-State Circuits, the researchers describe one microchip that can generate terahertz waves, and a second chip that can capture and read intricate details of these waves.


In two recently published articles, researchers Kaushik Sengupta (left), an assistant professor of electrical engineering, and Xue Wu (right), a Princeton graduate student in electrical engineering, describe one microchip that can generate terahertz waves and a second chip that can capture and read intricate details of these waves.

"The system is realized in the same silicon chip technology that powers all modern electronic devices from smartphones to tablets, and therefore costs only a few dollars to make on a large scale" said lead researcher Kaushik Sengupta, a Princeton assistant professor of electrical engineering.

Terahertz waves are part of the electromagnetic spectrum—the broad class of waves that includes radio, X-rays and visible light—and sit between the microwave and infrared light wavebands. The waves have some unique characteristics that make them interesting to science. For one, they pass through most non-conducting material, so they could be used to peer through clothing or boxes for security purposes, and because they have less energy than X-rays, they don't damage human tissue or DNA.

Terahertz waves also interact in distinct ways with different chemicals, so they can be used to characterize specific substances. Known as spectroscopy, the ability to use light waves to analyze material is one of the most promising—and the most challenging—applications of terahertz technology, Sengupta said.

To do it, scientists shine a broad range of terahertz waves on a target, then observe how the waves change after interacting with it. The human eye performs a similar type of spectroscopy with visible light: we see a leaf as green because light in the green part of the spectrum bounces off the chlorophyll-laden leaf.

The challenge has been that generating a broad range of terahertz waves and interpreting their interaction with a target requires a complex array of equipment such as bulky terahertz generators or ultrafast lasers. The equipment's size and expense make the technology impractical for most applications.

Researchers have been working for years to simplify these systems. In September, Sengupta's team reported a way to reduce the size of the terahertz generator and the apparatus that interprets the returning waves to a millimeter-sized chip. The solution lies in re-imagining how an antenna functions. When terahertz waves interact with a metal structure inside the chip, they create a complex distribution of electromagnetic fields that are unique to the incident signal. Typically, these subtle fields are ignored, but the researchers realized that they could read the patterns as a sort of signature to identify the waves. The entire process can be accomplished with tiny devices inside the microchip that read terahertz waves.
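In spirit (our simplification, not the published design), that read-out reduces to matching a measured pattern of sensor readings against calibrated signatures:

```python
import math

# Hypothetical calibration: frequency label -> normalized sensor readings
# from field-probing devices inside the chip.
CALIBRATED = {
    "0.3 THz": [0.9, 0.1, 0.4, 0.2],
    "0.5 THz": [0.2, 0.8, 0.3, 0.5],
    "0.7 THz": [0.1, 0.3, 0.9, 0.6],
}

def identify(reading: list[float]) -> str:
    """Return the calibrated signature closest to the measured pattern."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(CALIBRATED, key=lambda label: dist(CALIBRATED[label], reading))

print(identify([0.15, 0.75, 0.35, 0.45]))  # -> "0.5 THz"
# The signal is never read directly; it is recognized by the field
# pattern it imprints -- the "ripples in a pond" idea in the quote below.
```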



"Instead of directly reading the waves, we are interpreting the patterns created by the waves," Sengupta said. "It is somewhat like looking for a pattern of raindrops by the ripples they make in a pond."

Daniel Mittleman, a professor of engineering at Brown University, said the development was "a very innovative piece of work, and it potentially has a lot of impact." Mittleman, who is the vice chair of the International Society for Infrared Millimeter and Terahertz Waves, said scientists still have work to do before the terahertz band can begin to be used in everyday devices, but the developments are promising.

"It is a very big puzzle with many pieces, and this is just one, but it is a very important one," said Mittleman, who is familiar with the work but had no role in it.

On the terahertz-generation end, much of the challenge is creating a wide range of wavelengths within the terahertz band, particularly in a microchip. The researchers realized they could overcome the problem by generating multiple wavelengths on the chip. They then used precise timing to combine these wavelengths and create very sharp terahertz pulses.
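The underlying trick is Fourier synthesis: tones at equally spaced frequencies, phase-aligned, interfere to form sharp pulses. A small numerical sketch with illustrative numbers (not the chip's actual tone count or frequencies):

```python
import numpy as np

f0, df, n_tones = 0.1e12, 0.01e12, 16   # base 0.1 THz, 10 GHz spacing
t = np.linspace(0, 200e-12, 4000)       # 200 ps observation window

# Sum of phase-aligned cosines at equally spaced frequencies.
signal = sum(np.cos(2 * np.pi * (f0 + k * df) * t) for k in range(n_tones))

peak = np.max(np.abs(signal))
print(f"peak/average amplitude: {peak / np.mean(np.abs(signal)):.1f}")
# The tones cancel almost everywhere and add coherently at regular
# instants, compressing the energy into pulses roughly
# 1 / (n_tones * df) ~ 6 ps wide: the more tones combined, the sharper
# the pulse and the broader the terahertz coverage.
```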

In an article published Dec. 14 in the IEEE Journal of Solid-State Circuits, the researchers explained how they created a chip to generate the terahertz waves. The next step, the researchers said, is to extend the work further along the terahertz band. "Right now we are working with the lower part of the terahertz band," said Xue Wu, a Princeton doctoral student in electrical engineering and an author on both papers.

"What can you do with a billion transistors operating at terahertz frequencies?" Sengupta asked. "Only by re-imagining these complex electromagnetic interactions from fundamental principles can we invent game-changing new technology."
Provided by: Princeton University

YOUR INPUT IS MUCH APPRECIATED! LEAVE YOUR COMMENT BELOW.

Wednesday, February 15, 2017

The Universe's largest supernova: a spinning, star-eating black hole

By: Alexandria Addesso

According to the National Aeronautics and Space Administration (NASA), a supernova is defined as one of the largest explosions that take place in space, in particular the explosion of a star. A supernova usually manifests when there is a change in the core, or center, of a star.

The brightest supernova ever recorded, categorized as a super-luminous supernova, was the explosion of an extremely massive star at the end of its life. Named ASASSN-15lh, it was first observed in 2015 by the All-Sky Automated Survey for Supernovae (ASAS-SN). It was twice as bright as the previous record holder, and at its peak was 20 times brighter than the total light output of the entire Milky Way.
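As a sanity check on that comparison (our numbers: the reported peak output of about 2.2 × 10^45 erg/s, and an assumed total Milky Way luminosity of 2.5 × 10^10 suns):

```python
# Arithmetic check of "20 times brighter than the Milky Way".
L_SUN = 3.8e26            # solar luminosity, W
L_MILKY_WAY = 2.5e10      # assumed Milky Way output, solar luminosities
L_ASASSN15LH = 2.2e38     # reported peak ~2.2e45 erg/s, converted to W

ratio = L_ASASSN15LH / (L_MILKY_WAY * L_SUN)
print(f"peak / Milky Way: ~{ratio:.0f}x")
# ~23x -- consistent with the "20 times brighter" figure above.
```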



"We observed the source for 10 months following the event and have concluded that the explanation is unlikely to lie with an extraordinarily bright supernova,” said Giorgos Leloudas the leader of the team that observed the event at the Weizmann Institute of Science in Israel. “Our results indicate that the event was probably caused by a rapidly spinning supermassive black hole as it destroyed a low-mass star."
After a series of further observations, it was discovered that the massive explosion took place about 4 billion light-years from Earth in a distant galaxy. It is believed that the star that the black hole consumed in order to produce such a massive explosion was its solar system's biggest star, comparable to our own sun. Since then rays have been observed traveling from the black hole towards us at the speed of light. These rays are forming a disc of gas around the black hole as it converts gravitational energy into electromagnetic radiation, producing a bright source of light visible on multiple wavelengths.



"Incredibly, this source is still producing X-rays and may remain bright enough for Swift to observe into next year," said David Burrows, the lead scientist of the team utilizing NASA's Swift satellite to monitor the massive black hole as well as a professor of astronomy at Penn State University. "It behaves unlike anything we've seen before."

Could such an explosion happen in our solar system with our own sun? This is a question scientists and inquiring minds are still trying to figure out.

YOUR INPUT IS MUCH APPRECIATED! LEAVE YOUR COMMENT BELOW.

 