Physicists have developed an atomic clock so accurate that it would be off by less than a single second in 14 billion years. That kind of accuracy and precision makes it more than just a timepiece. It’s a powerful scientific instrument that could measure gravitational waves, map the shape of Earth’s gravitational field, and maybe even detect dark matter.
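To put that figure in perspective, a quick back-of-the-envelope calculation (round numbers throughout) shows what the claim implies for the clock’s fractional accuracy:

```python
# "Off by less than a second in 14 billion years" expressed as a
# fractional frequency accuracy. Round numbers throughout.
SECONDS_PER_YEAR = 365.25 * 24 * 3600          # ~3.16e7 s
age_of_universe_s = 14e9 * SECONDS_PER_YEAR    # ~4.4e17 s

fractional_error = 1.0 / age_of_universe_s
print(f"fractional accuracy ~ {fractional_error:.1e}")   # ~2.3e-18
```

An error of a few parts in ten billion billion is what makes the gravitational applications above thinkable at all.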
Fusion power has been the fevered dream of scientists, environmentalists and futurists for almost a century. For the past few decades, scientists have been attempting to find a way to create sustainable fusion reactions that would provide human beings with clean, abundant energy, finally breaking our dependence on fossil fuels and other dirty energy sources.
In recent years, many positive strides have been made that are bringing the “fusion era” closer to reality. Most recently, scientists working with the Experimental Advanced Superconducting Tokamak (EAST) – aka. the “Chinese artificial sun” – set a new record by super-heating clouds of hydrogen plasma to over 100 million degrees – a temperature roughly six times hotter than the core of the Sun!
Telescopes have come a long way in the past few centuries. From the comparatively modest devices built by astronomers like Galileo Galilei and Johannes Kepler, telescopes have evolved to become massive instruments that require an entire facility to house them and a full crew and network of computers to run them. And in the coming years, much larger observatories will be constructed that can do even more.
Unfortunately, this trend towards larger and larger instruments has many drawbacks. For starters, increasingly large observatories require either increasingly large mirrors or many telescopes working together – both expensive prospects. Luckily, a team from MIT has proposed combining interferometry with quantum teleportation, which could significantly increase the resolution of arrays without relying on larger mirrors.
Space junk is a growing problem. For decades we have been sending satellites into orbit around Earth. Some of them de-orbit and burn up in Earth’s atmosphere, or crash into the surface. But most of the stuff we send into orbit is still up there.
This is becoming an acute problem as the years go by and we launch more and more hardware into orbit. Since the very first satellite – Sputnik 1 – was launched in 1957, over 8,000 satellites have been placed in orbit. As of 2018, an estimated 4,900 are still in orbit, and about 3,000 of those are not operational. They’re space junk. The risk of collision is growing, and scientists are working on solutions. The problem will compound itself over time, as collisions between objects create more pieces of debris that have to be dealt with.
These next-generation observatories will look farther into the cosmos than ever before and help astronomers address questions like how the Universe evolved and whether there is life in other star systems. Unfortunately, all these missions have two things in common: in addition to being very large and complex, they are also very expensive. Hence, some scientists are proposing that we rely on more cost-effective ideas like swarm telescopes.
Two such scientists are Jayce Dowell and Gregory B. Taylor, a research assistant professor and professor (respectively) with the Department of Physics and Astronomy at the University of New Mexico. Together, the pair outlined their idea in a study titled “The Swarm Telescope Concept”, which recently appeared online and was accepted for publication by the Journal of Astronomical Instrumentation.
As they state in their study, traditional astronomy has focused on the construction, maintenance and operation of single telescopes. The one exception to this is radio astronomy, where facilities have been spread over an extensive geographic area in order to obtain high angular resolution. Examples of this include the Very Long Baseline Array (VLBA), and the proposed Square Kilometer Array (SKA).
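The reason radio astronomers spread their antennas over such distances is the diffraction limit: an interferometer’s angular resolution scales as the observing wavelength divided by its longest baseline. The numbers in this sketch are illustrative, not taken from the study:

```python
# Angular resolution of an interferometer: theta ~ wavelength / baseline.
# Illustrative numbers: a 4 cm observing wavelength on a baseline roughly
# the size of the VLBA's longest (~8,600 km).
import math

wavelength_m = 0.04
baseline_m = 8.6e6
theta_rad = wavelength_m / baseline_m
theta_mas = math.degrees(theta_rad) * 3.6e6    # radians -> milliarcseconds
print(f"resolution ~ {theta_mas:.2f} mas")     # ~0.96 mas
```

A single dish would need to be thousands of kilometers across to match that; an array of widely separated elements gets it almost for free.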
There is also the problem of how telescopes are becoming increasingly reliant on computing and digital signal processing. As they explain in their study, telescopes commonly carry out multiple simultaneous observation campaigns, which increases the operational complexity of the facility due to conflicting configuration requirements and scheduling considerations.
A possible solution, according to Dowell and Taylor, is to rethink telescopes. Instead of a single instrument, the telescope would consist of a distributed array where many autonomous elements come together through a data transport system to function as a single facility. This approach, they claim, would be especially useful when it comes to the Next Generation Very Large Array (NGVLA) – a future interferometer that will build on the legacy of the Karl G. Jansky Very Large Array and the Atacama Large Millimeter/submillimeter Array (ALMA). As they state in their study:
“At the core of the swarm telescope is a shift away from thinking about an observatory as a monolithic entity. Rather, an observatory is viewed as many independent parts that work together to accomplish scientific observations. This shift requires moving part of the decision making about the facility away from the human schedulers and operators and transitioning it to “software defined operators” that run on each part of the facility. These software agents then communicate with each other and build dynamic arrays to accomplish the goals of multiple observers, while also adjusting for varying observing conditions and array element states across the facility.”
This idea for a distributed telescope is inspired by the concept of swarm intelligence, where large swarms of robots are programmed to interact with each other and their environment to perform complex tasks. As they explain, the facility comes down to three major components: autonomous element control, a method of inter-element communication, and data transport management.
Of these components, the most critical is the autonomous element control, which governs the actions of each element of the facility. While similar to the traditional monitoring and control systems used for individual robotic telescopes, this system would be responsible for far more. Overall, the element control would be responsible for ensuring the safety of the telescope and maximizing the utilization of the element.
“The first, safety of the element, requires multiple monitoring points and preventative actions in order to identify and prevent problems,” they explain. “The second direction requires methods of relating the goals of an observation to the performance of an element in order to maximize the quantity and quality of the observations, and automated methods of recovering from problems when they occur.”
The second component, inter-element communication, is what allows the individual elements to come together to form the interferometer. This can take the form of a leaderless system (where there is no single point of control), or an organizer system, where all of the communication between the elements and with the observation queue is done through a single point of control (i.e. the organizer).
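To make the organizer pattern concrete, here is a minimal, hypothetical sketch of how element agents might register with a single point of control that assembles free elements into an ad hoc array. All names are illustrative and do not come from Dowell and Taylor’s software:

```python
# A hypothetical sketch of the "organizer" communication pattern: element
# agents report their state to a single point of control, which forms a
# dynamic array from whichever elements are healthy and idle.
from dataclasses import dataclass, field

@dataclass
class ElementAgent:
    name: str
    healthy: bool = True   # the element's own control loop reports safety
    busy: bool = False     # already committed to another observation

@dataclass
class Organizer:
    elements: list = field(default_factory=list)

    def register(self, element: ElementAgent) -> None:
        """An element announces itself to the organizer."""
        self.elements.append(element)

    def build_array(self, n_needed: int) -> list:
        """Form a dynamic array from whatever elements are free right now."""
        free = [e for e in self.elements if e.healthy and not e.busy]
        selected = free[:n_needed]
        for e in selected:
            e.busy = True
        return selected

organizer = Organizer()
for i in range(6):
    organizer.register(ElementAgent(name=f"antenna-{i}"))

array = organizer.build_array(n_needed=4)
print([e.name for e in array])   # four free elements form the array
```

A leaderless design would replace the central `Organizer` with peer-to-peer negotiation between the agents themselves, trading simplicity for the removal of a single point of failure.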
Lastly, there is the issue of data transport management, which can take one of two forms based on existing telescopes. These range from fully off-line systems, where correlation is done post-observation (as with the VLBA), to fully-connected systems, where correlation is done in real-time (as with the VLA). For the sake of their array, the team emphasized that connectivity and real-time correlation are a must.
After considering all these components and how they are used by existing arrays, Dowell and Taylor conclude that the swarm concept is a natural extension of the advances being made in robotic and thinking telescopes, as well as interferometry. The advantages of this are spelled out in their conclusions:
“It allows for more efficient operations of facilities by moving much of the daily operational work done by humans to autonomous control systems. This, in turn, frees up personnel to focus on the scientific output of the telescope. The swarm concept can also combine the unused resources of the different elements together to form an ad hoc array.”
In addition, swarm telescopes will offer new opportunities and funding since they will consist of small elements that can be owned and operated by different entities. In this way, different organizations would be able to conduct science with their own elements while also being able to benefit from large-scale interferometric observations.
This concept is similar to the Modular Active Self-Assembling Space Telescope Swarms, which calls for a swarm of robots that would assemble in space to form a 30 meter (~100 ft) telescope. The concept was proposed by a team of American astronomers led by Dmitri Savransky, an assistant professor of mechanical and aerospace engineering at Cornell University.
This proposal was part of the 2020 Decadal Survey for Astrophysics and was recently selected for Phase I development as part of the 2018 NASA Innovative Advanced Concepts (NIAC) program. So while many large-scale telescopes will be entering service in the near future, the next-next-generation of telescopes could include a few arrays made up of swarms of robots directed by artificial intelligence.
Such arrays would be capable of achieving high-resolution astronomy and interferometry at lower costs, and could free up large, complex arrays for other observations.
When it comes to the new era of space exploration, one of the primary focuses has been on cutting costs. By reducing the costs associated with individual launches, space agencies and private aerospace companies will not only be able to commercialize Low Earth Orbit (LEO), but also mount far more exploration missions and maybe even colonize space.
Several methods have been proposed so far for reducing launch costs, including reusable rockets and single-stage-to-orbit rockets. However, a team of engineers from the University of Glasgow and Ukraine recently proposed an entirely different idea that could make launching small payloads affordable – a self-eating rocket! This “autophage” rocket could send small satellites into space more easily and more affordably.
The study describing how they built and tested the “autophage” engine recently appeared in the Journal of Spacecraft and Rockets under the title “Autophage Engines: Toward a Throttleable Solid Motor”. The team was led by Vitaly Yemets and Patrick Harkness – a Professor from the Oles Honchar Dnipro National University in Ukraine and a Senior Lecturer from the University of Glasgow, respectively.
Together, the team addressed one of the most pressing issues with rockets today: the storage tanks that contain a rocket’s propellant as it climbs weigh many times more than the spacecraft’s payload. This reduces the efficiency of the launch vehicle and also adds to the problem of space debris, since these fuel tanks are disposable and fall away when spent.
As Dr. Patrick Harkness, who led Glasgow’s contribution to the work, explained in a recent University of Glasgow press release:
“Over the last decade, Glasgow has become a centre of excellence for the UK space industry, particularly in small satellites known as ‘CubeSats’, which provide researchers with affordable access to space-based experiments. There’s also potential for the UK’s planned spaceport to be based in Scotland. However, launch vehicles tend to be large because you need a large amount of propellant to reach space. If you try to scale down, the volume of propellant falls more quickly than the mass of the structure, so there is a limit to how small you can go. You will be left with a vehicle that is smaller but, proportionately, too heavy to reach an orbital speed.”
In contrast, an autophage engine consumes its own structure during ascent, so more cargo capacity could be freed up and less debris would enter orbit. The propellant consists of a solid fuel rod (made of a plastic like polyethylene) on the outside and an oxidizer on the inside. By driving the rod into a hot engine, the fuel and oxidizer are vaporized to create gas that flows into the combustion chamber to produce thrust.
“A rocket powered by an autophage engine would be different,” said Dr. Harkness. “The propellant rod itself would make up the body of the rocket, and as the vehicle climbed the engine would work its way up, consuming the body from base to tip. That would mean that the rocket structure would actually be consumed as fuel, so we wouldn’t face the same problems of excessive structural mass. We could size the launch vehicles to match our small satellites, and offer more rapid and more targeted access to space.”
The research team also showed that the engine could be throttled simply by varying the speed at which the rod is driven into the engine – a rare capability in a solid motor. During lab tests, the team was able to sustain rocket operations for 60 seconds at a time. As Dr. Harkness said, the team hopes to build on this and eventually conduct a launch test:
“While we’re still at an early stage of development, we have an effective engine testbed in the laboratory in Dnipro, and we are working with our colleagues there to improve it still further. The next step is to secure further funding to investigate how the engine could be incorporated into a launch vehicle.”
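The throttling claim is easy to see with rough numbers: thrust is the mass flow rate times the effective exhaust velocity, and for a solid rod the mass flow rate is just density times cross-sectional area times feed speed. The figures in this sketch are assumptions for illustration, not values from the paper:

```python
# Illustrative only: thrust F = mdot * v_e, where for a solid rod the mass
# flow rate is mdot = rho * A * v_feed. Doubling the feed speed doubles the
# thrust. None of these numbers come from the paper.
import math

rho = 950.0                        # kg/m^3, roughly polyethylene
area = math.pi * (0.05 / 2) ** 2   # cross-section of an assumed 5 cm rod
v_exhaust = 2000.0                 # m/s, an assumed effective exhaust velocity

for v_feed in (0.005, 0.010, 0.020):      # rod feed speed in m/s
    mdot = rho * area * v_feed            # propellant vaporized per second
    thrust = mdot * v_exhaust             # newtons
    print(f"feed {v_feed * 1000:.0f} mm/s -> thrust ~ {thrust:.0f} N")
```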
Another challenge of the modern space age is how to deliver additional payloads and satellites into orbit without creating more orbital clutter. By introducing an engine that allows for cheap launches and leaves no disposable parts behind, the autophage could be a game-changing technology, right up there with fully-recoverable rockets.
The research team also consisted of Mykola Dron and Anatoly Pashkov – a Professor and Senior Researcher from Oles Honchar Dnipro National University – and Kevin Worrall and Michael Middleton – a Research Associate and M.S. student from the University of Glasgow.
Ever since NASA announced that they had created a prototype of the controversial Radio Frequency Resonant Cavity Thruster (aka. the EM Drive), any and all reported results have been the subject of controversy. Initially, reported tests were the stuff of rumors and leaks, and the results were treated with understandable skepticism. Even after the paper submitted by the Eagleworks team passed peer review, there have still been unanswered questions.
Hoping to address this, a team of physicists from TU Dresden – known as the SpaceDrive Project – recently conducted an independent test of the EM Drive. Their findings were presented at the 2018 Aeronautics and Astronautics Association of France’s Space Propulsion conference, and were less than encouraging. What they found, in a nutshell, was that much of the EM Drive’s thrust could be attributable to outside factors.
The results of their test were reported in a study titled “The SpaceDrive Project – First Results on EMDrive and Mach-Effect Thrusters”, which recently appeared online. The study was led by Martin Tajmar, an engineer from the Institute of Aerospace Engineering at TU Dresden, and included TU Dresden scientists Matthias Kößling, Marcel Weikert and Maxime Monette.
To recap, the EM Drive is a concept for an experimental space engine that came to the attention of the space community years ago. It consists of a hollow cone made of copper or other materials that reflects microwaves between opposite walls of the cavity in order to generate thrust. Unfortunately, this drive system is based on principles that violate the Conservation of Momentum law.
This law states that within a system, the amount of momentum remains constant and is neither created nor destroyed, but only changes through the action of forces. Since the EM Drive involves electromagnetic microwave cavities converting electrical energy directly into thrust, it has no reaction mass. It is therefore “impossible”, as far as conventional physics goes.
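A useful yardstick here is the photon rocket, which does conserve momentum by exhausting light itself, but pays dearly for it: thrust per unit power is only P/c. Reported EM Drive figures (on the order of a millinewton per kilowatt in the leaked Eagleworks paper) were hundreds of times larger, which is part of why physicists demanded extraordinary evidence:

```python
# For scale: a photon rocket conserves momentum by exhausting light itself,
# and its thrust per unit power is only F = P / c.
c = 299_792_458.0        # speed of light, m/s
power_w = 1000.0         # 1 kW of perfectly collimated photons
thrust_n = power_w / c
print(f"~{thrust_n * 1e6:.2f} micronewtons per kW")   # ~3.34 uN per kW
```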
As a result, many scientists have been skeptical about the EM Drive and wanted to see definitive evidence that it works. In response, a team of scientists at NASA’s Eagleworks Laboratories began conducting a test of the propulsion system. The team was led by Harold White, the Advanced Propulsion Team Lead for the NASA Engineering Directorate and the Principal Investigator for NASA’s Eagleworks lab.
Despite a report that was leaked in November of 2016 – titled “Measurement of Impulsive Thrust from a Closed Radio Frequency Cavity in Vacuum” – the team never presented any official findings. This prompted the team led by Martin Tajmar to conduct their own test, using an engine that was built based on the same specifications as those used by the Eagleworks team.
In short, the TU Dresden team’s prototype consisted of a cone-shaped hollow engine set inside a highly shielded vacuum chamber, into which microwaves were fed. While they found that the EM Drive did experience thrust, the detectable thrust may not have been coming from the engine itself. Essentially, the thruster exhibited the same amount of force regardless of which direction it was pointing.
This suggested that the thrust was originating from another source, which they believe could be the result of interaction between engine cables and the Earth’s magnetic field. As they conclude in their report:
“First measurement campaigns were carried out with both thruster models reaching thrust/thrust-to-power levels comparable to claimed values. However, we found that e.g. magnetic interaction from twisted-pair cables and amplifiers with the Earth’s magnetic field can be a significant error source for EMDrives. We continue to improve our measurement setup and thruster developments in order to finally assess if any of these concepts is viable and if it can be scaled up.”
In other words, the mystery thrust reported by previous experiments may have been nothing more than an error. If true, it would explain how the “impossible EM Drive” was able to achieve small amounts of measurable thrust when the laws of physics claim it shouldn’t be. However, the team also emphasized that more testing will be needed before the EM Drive can be dismissed or validated with confidence.
Alas, it seems that the promise of being able to travel to the Moon in just four hours, to Mars in 70 days, and to Pluto in 18 months – all without the need for propellant – may have to wait. But rest assured, many other experimental technologies are being tested that could one day allow us to travel within our Solar System (and beyond) in record time. And additional tests will be needed before the EM Drive can be written off as just another pipe dream.
The team also conducted their own test of the Mach-Effect Thruster, another concept that many scientists consider unlikely. The team reported more favorable results with this concept, though they indicated that more research is needed here as well before anything can be said conclusively. You can learn more about the team’s test results for both engines by reading their report here.
And be sure to check out this video by Scott Manley, who explains the latest test and its results:
It’s a staple of science fiction, and something many people have fantasized about at one time or another: the idea of sending out spaceships with colonists and transplanting the seed of humanity among the stars. Between discovering new worlds, becoming an interstellar species, and maybe even finding extra-terrestrial civilizations, the dream of spreading beyond the Solar System is one that can’t become reality soon enough!
Looking to the future of crewed space exploration, it is clear to NASA and other space agencies that certain technological requirements need to be met. Not only are a new generation of launch vehicles and space capsules needed (like the SLS and Orion spacecraft), but new forms of energy production are needed to ensure that long-duration missions to the Moon, Mars, and other locations in the Solar System can take place.
One possibility that addresses these concerns is Kilopower, a lightweight fission power system that could power robotic probes, surface outposts, and crewed exploration missions. In collaboration with the Department of Energy’s National Nuclear Security Administration (NNSA), NASA recently conducted a successful demonstration of a new nuclear reactor power system that could enable long-duration crewed missions to the Moon, Mars, and beyond.
Known as the Kilopower Reactor Using Stirling Technology (KRUSTY) experiment, the technology was unveiled at a recent news conference on Wednesday, May 2nd, at NASA’s Glenn Research Center. According to NASA, this power system is capable of generating up to 10 kilowatts of electrical power – enough to power several households continuously for ten years, or an outpost on the Moon or Mars.
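That “several households” figure survives a rough sanity check, assuming an average household draws a bit over a kilowatt when averaged around the clock (an assumption, not a NASA figure):

```python
# Rough sanity check on "several households": assume an average household
# draws ~1.2 kW averaged over the day (an assumption, not a NASA figure).
reactor_kw = 10.0
avg_household_kw = 1.2
print(f"~{reactor_kw / avg_household_kw:.0f} households")   # ~8
```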
As Jim Reuter, NASA’s acting associate administrator for the Space Technology Mission Directorate (STMD), explained in a recent NASA press release:
“Safe, efficient and plentiful energy will be the key to future robotic and human exploration. I expect the Kilopower project to be an essential part of lunar and Mars power architectures as they evolve.”
The prototype power system employs a small solid uranium-235 reactor core and passive sodium heat pipes to transfer reactor heat to high-efficiency Stirling engines, which convert the heat to electricity. This power system is ideally suited to locations like the Moon, where power generation using solar arrays is difficult because lunar nights are equivalent to 14 days on Earth.
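A rough calculation shows why solar struggles there: riding out a 14-day lunar night at the reactor’s full 10-kilowatt output would take an enormous battery bank (the steady 10 kW load is illustrative):

```python
# Storage needed to ride out the ~14-day lunar night at a steady 10 kW
# load (the load is illustrative; the night length is from the text).
load_kw = 10.0
night_hours = 14 * 24                  # lunar night lasts ~14 Earth days
storage_kwh = load_kw * night_hours
print(f"~{storage_kwh:.0f} kWh of storage needed")   # ~3,360 kWh
```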
In addition, many plans for lunar exploration involve building outposts in the permanently-shaded polar regions or in stable underground lava tubes. On Mars, sunshine is more plentiful, but subject to the planet’s diurnal cycle and weather (such as dust storms). This technology could therefore ensure a steady supply of power that is not dependent on intermittent sources like sunlight. As Marc Gibson, the lead Kilopower engineer at Glenn, said:
“Kilopower gives us the ability to do much higher power missions, and to explore the shadowed craters of the Moon. When we start sending astronauts for long stays on the Moon and to other planets, that’s going to require a new class of power that we’ve never needed before.”
The Kilopower experiment was conducted at the NNSA’s Nevada National Security Site (NNSS) between November 2017 and March 2018. In addition to demonstrating that the system could produce electricity through fission, the purpose of the experiment was also to show that it is stable and safe in any environment. For this reason, the Kilopower team conducted the experiment in four phases.
The first two phases, which were conducted without power, confirmed that each component in the system functioned properly. For the third phase, the team increased power to heat the core slowly before moving on to phase four, which consisted of a 28-hour, full-power test run. This phase simulated all stages of a mission, which included a reactor startup, ramp up to full power, steady operation and shutdown.
Throughout the experiment, the team simulated various system failures to ensure that the system would keep working – including power reductions, failed engines, and failed heat pipes. Throughout, the KRUSTY generator kept providing electricity, proving that it can endure whatever space exploration throws at it. As Gibson indicated:
“We put the system through its paces. We understand the reactor very well, and this test proved that the system works the way we designed it to work. No matter what environment we expose it to, the reactor performs very well.”
Looking ahead, the Kilopower project will remain a part of NASA’s Game Changing Development (GCD) program. As part of NASA’s Space Technology Mission Directorate (STMD), this program’s goal is to advance space technologies that may lead to entirely new approaches for the Agency’s future space missions. Eventually, the team hopes to make the transition to the Technology Demonstration Mission (TDM) program by 2020.
If all goes well, the KRUSTY reactor could allow for permanent human outposts on the Moon and Mars. It could also offer support to missions that rely on In-Situ Resource Utilization (ISRU) to produce propellant from local sources of water ice and building materials from local regolith.
Basically, when robotic missions are mounted to the Moon to 3D-print bases out of local regolith, and astronauts begin making regular trips to the Moon to conduct research and experiments (like they do today aboard the International Space Station), it could be KRUSTY reactors that provide them with all their power needs. In a few decades, the same could be true for Mars and even locations in the outer Solar System.
This reactor system could also pave the way for rockets that rely on nuclear-thermal or nuclear-electric propulsion, enabling missions beyond Earth that are both faster and more cost-effective!
And be sure to enjoy this video of the GCD program, courtesy of NASA 360: