Matt Williams is a space journalist and science communicator for Universe Today and Interesting Engineering. He's also a science fiction author, podcaster (Stories from Space), and Taekwon-Do instructor who lives on Vancouver Island with his wife and family.
As a gas giant (or ice giant), Neptune has no solid surface. In fact, the blue-green disc we have all seen in photographs over the years is a bit of an illusion. What we see are actually the tops of some very deep gas clouds, which in turn give way to water and other melted ices lying over an approximately Earth-sized core made of silicate rock and a nickel-iron mix. If a person were to attempt to stand on Neptune, they would sink through the gaseous layers.
As they descended, they would experience increased temperatures and pressures until they finally touched down on the solid core itself. That being said, Neptune does have a surface of sorts (as with the other gas and ice giants), which astronomers define as the point in the atmosphere where the pressure reaches one bar. And this surface is one of the most active and dynamic places in the entire Solar System.
Every year, the NASA Innovative Advanced Concepts (NIAC) program puts out the call to the general public, hoping to find better or entirely new aerospace architectures, systems, or mission ideas. As part of the Space Technology Mission Directorate, this program has been in operation since 1998, serving as a high-level entry point to entrepreneurs, innovators and researchers who want to contribute to human space exploration.
This year, thirteen concepts were chosen for Phase I of the NIAC program, including reprogrammed microorganisms for Mars, a two-dimensional spacecraft that could de-orbit space debris, an analog rover for extreme environments, a robot that turns asteroids into spacecraft, and a next-generation exoplanet hunter. Each proposal was awarded $100,000 for a nine-month period to assess the feasibility of its concept.
For generations, human beings have fantasized about the possibility of finding extra-terrestrial life. And with our ongoing research efforts to discover new and exciting extrasolar planets (aka. exoplanets) in distant star systems, the possibility of actually visiting one of these worlds has received a real shot in the arm. Unfortunately, given the astronomical distances involved, not to mention the cost of mounting an expedition, doing so presents numerous significant challenges.
However, Russian billionaire Yuri Milner and the Breakthrough Foundation – an international organization committed to exploration and scientific research – are determined to mount an interstellar mission to Alpha Centauri, our closest stellar neighbor, in the coming years. With the backing of such big-name sponsors as Mark Zuckerberg and Stephen Hawking, his latest initiative (named “Project Starshot”) aims to send a tiny spacecraft to the Alpha Centauri system to search for planets and signs of life.
Of the more than 600,000 known asteroids in our Solar System, almost 10,000 are known as Near-Earth Objects (NEOs). These are asteroids or comets whose orbits bring them close to Earth’s, and which could potentially collide with us at some point in the future. As such, monitoring these objects is a vital part of NASA’s ongoing efforts in space. One such mission is NASA’s Near-Earth Object Wide-field Survey Explorer (NEOWISE), which has been active since December 2013.
And now, after two years of study, the information gathered by the mission is being released to the public. This includes, most recently, NEOWISE’s second year of survey data, which accounts for 72 previously unknown objects that orbit near our planet. Of these, eight were classified as potentially hazardous asteroids (PHAs), based on their size and how closely their orbits approach Earth.
By definition, pollution refers to any matter that is “out of place”. In other words, it is what happens when toxins, contaminants, and other harmful products are introduced into an environment, disrupting its normal patterns and functions. When it comes to our atmosphere, pollution refers to the introduction of chemicals, particulates, and biological matter that can be harmful to humans, plants and animals, and cause damage to the natural environment.
Whereas some causes of pollution are entirely natural – being the result of sudden changes in temperature, seasonal changes, or regular cycles – others are the result of human impact (i.e. anthropogenic, or man-made). More and more, the effects of air pollution on our planet, especially those that result from human activity, are of great concern to developers, planners and environmental organizations, given the long-term effect they can have.
A neutron star is perhaps one of the most awe-inspiring and mysterious things in the Universe. Composed almost entirely of neutrons and carrying no net electrical charge, these objects are the final phase in the life-cycle of a giant star, born of the fiery explosions known as supernovae. They are also the densest known objects in the Universe; so dense, in fact, that if one gains enough additional mass, it will collapse into a black hole.
For some time, astronomers have been confounded by this process, never knowing where or when a neutron star might make this final transformation. But thanks to a recent study by a team of researchers from Goethe University in Frankfurt, Germany, it may now be possible to determine the absolute maximum mass a neutron star can have before it collapses, giving birth to a new black hole.
In their drive to achieve the goal of reusable rockets, SpaceX has spent the past few years running their Falcon 9 rocket through the most rigorous of tests. And while they have achieved a soft landing on solid ground once before, SpaceX had been unable to safely land their rockets at sea, despite several attempts. This is an important step in the development process, as it would mean that the Falcon 9 can be landed under even the most difficult of conditions.
But earlier today, SpaceX finally reached that milestone when their CRS-8 mission, which launched from Cape Canaveral at 4:43 pm (ET), made it back to Earth in one piece. After sending its Dragon capsule payload to rendezvous with the International Space Station, the first-stage rocket successfully made a soft landing on a drone ship in the Atlantic Ocean. This achievement brings SpaceX one step closer to the goal Musk founded the company upon: cost-effective, commercial spaceflight.
The early 20th century was a very auspicious time for the sciences. In addition to Ernest Rutherford and Niels Bohr giving birth to the Standard Model of particle physics, it was also a period of breakthroughs in the field of quantum mechanics. Thanks to ongoing studies on the behavior of electrons, scientists began to propose theories whereby these elementary particles behaved in ways that defied classical, Newtonian physics.
One such example is the Electron Cloud Model proposed by Erwin Schrödinger. Thanks to this model, electrons were no longer depicted as particles moving around a central nucleus in a fixed orbit. Instead, Schrödinger proposed a model whereby scientists could only make educated guesses as to the positions of electrons. Hence, their locations could only be described as being part of a ‘cloud’ around the nucleus where the electrons are likely to be found.
Atomic Physics To The 20th Century:
The earliest known examples of atomic theory come from ancient Greece and India, where philosophers such as Democritus postulated that all matter was composed of tiny, indivisible and indestructible units. The term “atom” was coined in ancient Greece and gave rise to the school of thought known as “atomism”. However, this theory was more of a philosophical concept than a scientific one.
It was not until the 19th century that the theory of atoms became articulated as a scientific matter, with the first evidence-based experiments being conducted. For example, in the early 1800s, English scientist John Dalton used the concept of the atom to explain why chemical elements reacted in certain observable and predictable ways. Through a series of experiments involving gases, Dalton went on to develop what is known as Dalton’s Atomic Theory.
This theory expanded on the laws of conservation of mass and definite proportions and came down to five premises: elements, in their purest state, consist of particles called atoms; atoms of a specific element are all the same, down to the very last atom; atoms of different elements can be told apart by their atomic weights; atoms of elements unite to form chemical compounds; and atoms can neither be created nor destroyed in a chemical reaction, only ever regrouped.
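To get a sense of what “definite proportions” means in practice, consider water (an example chosen here for illustration, using modern atomic weights): every molecule combines two hydrogen atoms with one oxygen atom, so the mass ratio of oxygen to hydrogen in any sample of pure water is fixed:

\frac{m_\text{O}}{m_\text{H}} = \frac{16}{2 \times 1} = 8

In other words, roughly eight grams of oxygen for every gram of hydrogen, no matter where the water comes from; this is exactly the kind of regularity Dalton’s premises were meant to explain.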
Discovery Of The Electron:
By the late 19th century, scientists also began to theorize that the atom was made up of more than one fundamental unit. However, most scientists ventured that this unit would be the size of the smallest known atom – hydrogen. By the end of the 19th century, this would change drastically, thanks to research conducted by scientists like Sir Joseph John Thomson.
Through a series of experiments using cathode ray tubes (known as Crookes tubes), Thomson observed that cathode rays could be deflected by electric and magnetic fields. He concluded that rather than being composed of light, they were made up of negatively charged particles that were 1,000 times smaller and 1,800 times lighter than hydrogen.
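Schematically (and simplifying the historical details), the logic of such a deflection experiment can be written down in a few lines. If the electric field E and magnetic field B are adjusted so that their deflections cancel, the particle speed v follows immediately; the small deflection angle \theta picked up while crossing plates of length L with only the electric field switched on then yields the charge-to-mass ratio:

qE = qvB \;\Rightarrow\; v = \frac{E}{B}, \qquad \theta \approx \frac{qEL}{mv^2} \;\Rightarrow\; \frac{q}{m} \approx \frac{\theta E}{B^2 L}

The unexpectedly large value Thomson obtained for this ratio is what pointed to a particle far lighter than even a hydrogen atom.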
This effectively disproved the notion that the hydrogen atom was the smallest unit of matter, and Thomson went further to suggest that atoms were divisible. To explain the overall charge of the atom, which consisted of both positive and negative charges, Thomson proposed a model whereby the negatively charged “corpuscles” were distributed in a uniform sea of positive charge – known as the Plum Pudding Model.
These corpuscles would later be named “electrons”, based on the theoretical particle predicted by Anglo-Irish physicist George Johnstone Stoney in 1874. And from this, the Plum Pudding Model was born, so named because it closely resembled the English dessert that consists of plum cake and raisins. The concept was introduced to the world in the March 1904 edition of the UK’s Philosophical Magazine, to wide acclaim.
Development Of The Standard Model:
Subsequent experiments revealed a number of scientific problems with the Plum Pudding Model. For starters, there was the problem of demonstrating that the atom possessed a uniform positive background charge, which came to be known as the “Thomson Problem”. Five years later, the model would be disproved by Hans Geiger and Ernest Marsden, who conducted a series of experiments using alpha particles and gold foil – aka. the “gold foil experiment.”
In this experiment, Geiger and Marsden measured the scattering pattern of the alpha particles with a fluorescent screen. If Thomson’s model were correct, the alpha particles would pass through the atomic structure of the foil unimpeded. However, they noted instead that while most shot straight through, some of them were scattered in various directions, with some going back in the direction of the source.
Geiger and Marsden concluded that the particles had encountered an electrostatic force far greater than Thomson’s model allowed for. Since alpha particles are just helium nuclei (which are positively charged), this implied that the positive charge in the atom was not widely dispersed, but concentrated in a tiny volume. In addition, the fact that the undeflected particles passed through unimpeded meant that these tiny concentrations of positive charge were separated by vast gulfs of empty space.
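A rough, back-of-the-envelope estimate (not taken from the original papers) shows just how tiny that volume must be. For a head-on collision, the alpha particle stops where its kinetic energy E_k has been entirely converted into electrostatic potential energy, giving a distance of closest approach of

d_\text{min} = \frac{1}{4\pi\varepsilon_0}\,\frac{2Ze^2}{E_k} \approx \frac{2 \times 79 \times 1.44\ \text{MeV fm}}{5\ \text{MeV}} \approx 45\ \text{fm}

for a typical ~5 MeV alpha particle striking gold (Z = 79). That is a few thousand times smaller than the atom itself, which is roughly 10^{-10} m across.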
By 1911, physicist Ernest Rutherford interpreted the Geiger-Marsden experiments and rejected Thomson’s model of the atom. Instead, he proposed a model in which the atom consisted of mostly empty space, with all its positive charge concentrated in a very tiny volume at its center, surrounded by a cloud of electrons. This came to be known as the Rutherford Model of the atom.
Subsequent work by Antonius Van den Broek and Niels Bohr refined the model further. While Van den Broek suggested that the atomic number of an element corresponds to its nuclear charge, the latter proposed a Solar-System-like model of the atom, in which the nucleus carries a positive charge equal to the atomic number and is surrounded by an equal number of electrons in orbital shells (aka. the Bohr Model).
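For hydrogen, the quantized shells of the Bohr Model correspond to a simple formula for the allowed electron energies (quoted here for illustration):

E_n = -\frac{13.6\ \text{eV}}{n^2}, \qquad n = 1, 2, 3, \ldots

where n labels the shell. Jumps between these levels reproduce the wavelengths of hydrogen’s spectral lines, which was the model’s great early success.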
The Electron Cloud Model:
During the 1920s, Austrian physicist Erwin Schrödinger became fascinated by the theories of Max Planck, Albert Einstein, Niels Bohr, Arnold Sommerfeld, and other physicists. During this time, he also became involved in the fields of atomic theory and spectra, researching at the University of Zurich and then the Friedrich Wilhelm University in Berlin (where he succeeded Planck in 1927).
In 1926, Schrödinger tackled the issue of wave functions and electrons in a series of papers. In addition to describing what would come to be known as the Schrödinger equation – a partial differential equation that describes how the quantum state of a quantum system changes with time – he also used mathematical equations to describe the likelihood of finding an electron in a certain position.
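In its general time-dependent form, the equation can be written (in modern notation) as:

i\hbar\,\frac{\partial}{\partial t}\,\Psi(\mathbf{r},t) = \hat{H}\,\Psi(\mathbf{r},t)

where \Psi is the wave function of the system, \hbar is the reduced Planck constant, and \hat{H} is the Hamiltonian operator, which encodes the system’s total energy.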
This became the basis of what would come to be known as the Electron Cloud (or quantum mechanical) Model, as well as the Schrödinger equation. Based on quantum theory, which states that all matter has properties associated with a wave function, the Electron Cloud Model differs from the Bohr Model in that it does not define the exact path of an electron.
Instead, it predicts the likely location of the electron based on a probability function. This probability function describes a cloud-like region where the electron is likely to be found, hence the name. Where the cloud is most dense, the probability of finding the electron is greatest; where the cloud is less dense, the electron is less likely to be found.
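In modern terms, that probability is given by the square of the wave function’s magnitude (the Born rule): the chance of finding the electron in a small volume dV around a point \mathbf{r} is

|\Psi(\mathbf{r})|^2\,dV

so the “cloud” is, in effect, a map of |\Psi|^2.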
These dense regions are known as “electron orbitals”, since they are the most likely locations where an orbiting electron will be found. Extending this “cloud” model to three-dimensional space, we see a barbell or flower-shaped atom (as in the image at the top). Here, the branching regions are the ones where we are most likely to find the electrons.
Thanks to Schrödinger’s work, scientists began to understand that in the realm of quantum mechanics, it was impossible to know the exact position and momentum of an electron at the same time. Regardless of what the observer knows initially about a particle, they can only predict its succeeding location or momentum in terms of probabilities.
At no time will they be able to ascertain both precisely. In fact, the more they know about the momentum of a particle, the less they will know about its location, and vice versa. This is what is known today as the “Uncertainty Principle”.
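Heisenberg later gave this trade-off a precise quantitative form, shown here for reference:

\Delta x \,\Delta p \ge \frac{\hbar}{2}

where \Delta x and \Delta p are the uncertainties in a particle’s position and momentum, and \hbar is the reduced Planck constant. Squeezing one of the two uncertainties necessarily inflates the other.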
Note that the orbitals described above are those of a hydrogen atom (i.e. one with just a single electron). When dealing with atoms that have more electrons, the electron orbital regions spread out evenly into a spherical, fuzzy ball. This is where the term ‘electron cloud’ is most appropriate.
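As a concrete, purely illustrative example of what this cloud looks like for hydrogen, the short Python sketch below (written for this article, not drawn from any of the original papers) evaluates the textbook ground-state wave function and finds where the radial probability density peaks. The answer comes out at one Bohr radius, the densest part of the simplest electron cloud.

```python
# Illustrative sketch: locate the peak of the hydrogen 1s electron cloud.
import numpy as np

# Bohr radius in metres (rounded CODATA value)
A0 = 5.29177e-11

def psi_1s(r, a0=A0):
    """Ground-state (1s) hydrogen wave function, spherically symmetric."""
    return np.exp(-r / a0) / np.sqrt(np.pi * a0**3)

def radial_probability_density(r, a0=A0):
    """P(r) = 4*pi*r^2*|psi|^2, the probability per unit radius."""
    return 4.0 * np.pi * r**2 * np.abs(psi_1s(r, a0))**2

# Sample radii out to five Bohr radii and locate the peak of the cloud.
r = np.linspace(1e-14, 5 * A0, 100_000)
density = radial_probability_density(r)
r_peak = r[np.argmax(density)]

print(f"Most probable radius: {r_peak:.3e} m")
print(f"In units of the Bohr radius: {r_peak / A0:.3f}")  # prints ~1.000
```

The 1s case is exactly spherical, which is why hydrogen’s ground-state cloud looks like a fuzzy ball; the barbell and flower shapes described above belong to the higher orbitals (2p, 3d, and so on).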
Schrödinger’s contribution was universally recognized as one of the most important of the 20th century, one which triggered a revolution in the fields of physics, quantum mechanics and indeed all the sciences. Thenceforth, scientists were no longer working in a universe characterized by absolutes of time and space, but in one of quantum uncertainties and space-time relativity!
Venus is often referred to as “Earth’s Twin” (or “sister planet”), and for good reason. Despite some rather glaring differences, not the least of which is their vastly different atmospheres, there are enough similarities between Earth and Venus that many scientists consider the two to be closely related. In short, they are believed to have been very similar early in their existence, but then evolved in different directions.
Earth and Venus are both terrestrial planets that are located within the Sun’s Habitable Zone (aka. “Goldilocks Zone”) and have similar sizes and compositions. Beyond that, however, they have little in common. Let’s go over all their characteristics, one by one, so we can see in what ways they are similar and in what ways they differ.
For years, scientists have been hunting for the stable lava tubes that are believed to exist on the Moon. A remnant from the Moon’s past, when it was still volcanically active, these underground channels could very well be an ideal location for lunar colonies someday. Not only would their thick roofs provide natural shielding from solar radiation, meteoric impacts, and extremes in temperature, they could also be pressurized to create a breathable environment.
But until now, evidence of their existence has been inferred from surface features such as sinuous rilles – channel-like depressions that run along the surface, indicating the presence of subterranean lava flows – and holes in the surface (aka. “skylights”). However, recent evidence presented at the 47th Lunar and Planetary Science Conference (LPSC) in Texas indicates that one such stable lava tube could exist in the once-active region known as Marius Hills.