Photographer Mike Salway recently took a trip to the Kimberley region of Western Australia's Outback, and has posted some amazing night sky images from his adventures. This picture — and the name of the geologic features — especially caught my eye. The Bungle Bungles of Purnululu National Park are an incredible sight in themselves: huge beehive-shaped sandstone formations. But Mike was also able to capture a panoramic view of the Milky Way arching over the formations, a symmetrical halo of light spanning the full sky.
“You know the skies are dark when you can see the Milky Way overhead, even when there’s a more than half-moon shining brightly high in the west sky,” Mike wrote on his website. “And that’s what it was like at the Bungle Bungles.”
This image is an 8-frame panorama, taken on the Piccaninny Creek bed with his Canon 5D Mk II and a Samyang 14mm f/2.8 lens.
Want to get your astrophoto featured on Universe Today? Join our Flickr group or send us your images by email (this means you’re giving us permission to post them). Please explain what’s in the picture, when you took it, the equipment you used, etc.
Looking like an intricate pen-and-ink illustration, the complex and beautiful structures of the Sun’s surface come to life in yet another stunning photo by Alan Friedman, captured from the historic Mount Wilson Observatory near Los Angeles, California.
Click below for the full-size image in all its hydrogen alpha glory.
An oft-demonstrated master of solar photography, Alan took the image above while preparing for the transit of Venus on June 5 — which he also skillfully captured on camera (see a video below).
Hydrogen is the most abundant element found on the Sun. The Sun’s “surface” and the layer just above it — the photosphere and chromosphere, respectively — are regions rich in atomic hydrogen in excited states. Hydrogen-alpha imaging isolates the deep-red light these atoms absorb and emit at 656.3 nm, and it’s these absorbing layers that the technique reveals in detail.
The images above are “negatives”… check out a “positive” version of the same image here.
“The seeing was superb… definitely the best of the visit and among the best solar conditions I’ve ever experienced,” Alan writes on his blog.
The video below was made by Alan on June 5, showing Venus transiting the Sun while both passed behind a tower visible from the Observatory.
Alan’s work is always a treat… see more of his astrophotography on his website AvertedImagination.com.
Astrophotographer extraordinaire Thierry Legault has made a name for himself with his images of spacecraft transiting the face of the Sun. He has done it again by capturing the first-ever image of the Tiangong-1 space station transiting the Sun. The monster sunspot AR 1476 absolutely dwarfs the Chinese space station (inside the circle), but you can see incredible details of Tiangong-1 below in a zoomed-in version. Legault had less than a second to capture the event: with Tiangong-1 traveling at 7.4 km/s (about 26,600 km/h, or 16,500 mph), the transit lasted only 0.9 seconds! The station itself is pretty small — without its solar panels, the first Tiangong module measures just 10.3 x 3.3 meters.
Legault’s equipment was a Takahashi FSQ-106 refractor, a Baader Herschel prism and a Canon 5D Mark II camera, with an exposure of 1/8000s at ISO 100.
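That sub-second transit time is easy to sanity-check with the small-angle approximation. Here is a rough sketch; the 0.53° solar diameter and the slant-range values are my own assumptions for illustration, not figures from Legault:

```python
import math

SUN_DIAMETER_DEG = 0.53      # apparent angular diameter of the Sun
STATION_SPEED_MS = 7400.0    # Tiangong-1 orbital speed, from the article

def transit_duration_s(slant_range_m):
    """Seconds for the station to cross the solar disk, seen from the ground."""
    angular_speed_deg_s = math.degrees(STATION_SPEED_MS / slant_range_m)
    return SUN_DIAMETER_DEG / angular_speed_deg_s

# A pass nearly overhead (~350 km away) would cross in under half a second;
# a more oblique pass at roughly 700 km slant range gives close to 0.9 s,
# in line with the duration quoted above.
print(round(transit_duration_s(350e3), 2))   # → 0.44
print(round(transit_duration_s(700e3), 2))   # → 0.88
```

The exact duration depends on how high the station is in the sky and where its track crosses the solar disk, which is why precise timing software like CalSky is essential.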
As Legault told us in an interview earlier this year, in order to capture such images he studies maps, uses CalSky software, and has a radio synchronized watch to know very accurately when the transit event will happen.
“My camera has a continuous shuttering for 4 seconds, so I begin the sequence 2 seconds before the calculated time,” he said. “I don’t look through the camera – I never see the space station when it appears, I am just looking at my watch!”
For a transit event he gets a total of 16 images – four every second – and only after he enlarges the images will he know whether he succeeded.
“There is a kind of feeling that is short and intense — an adrenaline rush!” Legault said. “I suppose it is much like participating in a sport, but the feeling is addictive.”
Thanks to Thierry for sharing his latest success, and you can see larger versions of these images, and much more at his website.
Going to see the new Avengers movie this weekend, either for the first or fortieth time? You may not see much of Thor’s helmet in the film (as he opts for more of a “Point Break” look) but astronomers using the Isaac Newton Group of telescopes on the Canary Islands have succeeded in spotting it… in this super image of the Thor’s Helmet nebula!
Named for its resemblance to the famous horned Viking headgear (seen here sideways), the Thor’s Helmet nebula is a Wolf-Rayet bubble: stellar winds from the massive, pre-supernova star seen near the center are blowing the gas of the bluish “helmet” outwards into space.
The colors of the image above, acquired with the ING’s Isaac Newton Telescope, correspond to light emitted in hydrogen alpha, doubly-ionised oxygen and singly-ionised sulfur wavelengths.
Super-sized for the thunder god himself, Thor’s Helmet measures about 30 light-years across. It’s located in the constellation Canis Major, approximately 15,000 light-years from Earth. (You’d think Thor would have left his favorite accessory in a more convenient location… I suspect Loki may be behind this.)
Astronomers, assemble!
Read more about this and see other images from the ING telescopes here.
The Isaac Newton Group of Telescopes (ING) is owned by the Science and Technology Facilities Council (STFC) of the United Kingdom, and it is operated jointly with the Nederlandse Organisatie voor Wetenschappelijk Onderzoek (NWO) of the Netherlands and the Instituto de Astrofísica de Canarias (IAC) of Spain. The telescopes are located in the Spanish Observatorio del Roque de los Muchachos on La Palma, Canary Islands, which is operated by the Instituto de Astrofísica de Canarias (IAC).
Painstakingly assembled from over 150,000 digital photos taken over the course of eight months, this stunning time-lapse video of aurora-filled Arctic skies is the latest creation by photo/video artist Ole C. Salomonsen. Take a moment, turn up the sound, sit back and enjoy the show!
This is Ole’s second video project. The footage was shot on location in parts of Norway, Finland and Sweden from September 2011 to April 2012, and shows the glorious effects that the Sun’s increasing activity has had on our planet’s upper atmosphere.
The video is in two parts. “The first part contains some wilder and more aggressive auroras, as well as a few Milky Way sequences — the auroras there appear fast either because they really were, or because the time-lapse speed was set to follow the motion of the Milky Way and stars,” Ole explains. “Still, some of the straight-up shots are very close to real-time speed; although auroras mostly are slower, she can also be FAST!
“The second part has some slower and more majestic auroras, where I have focused more on composition and foreground. The music should give you a clear indication of where you are.”
The music was provided by Norwegian composer Kai-Anders Ryan.
Ole’s “hectic” aurora season is coming to a close now that the Sun is rising above the horizon in the Arctic Circle, and he figured it was a good time to release the video. It will also be available in 4K Digital Cinema format on request.
“Hope you like the video, and that you by watching it are able to understand my fascination and awe for this beautiful celestial phenomenon,” says Ole.
Take a look at the collection of images above. All are high resolution astrophotos of different artificial satellites, taken by renowned astrophotographer Thierry Legault, using one of his 10″ telescopes and a simple webcam. The images have been sharpened and enlarged so that it’s easy to see small structures on the satellites such as antennas or solar panels.
Like this one, which is surely the Soyuz, with solar panels on each side:
These are pretty awesome images…
…except Thierry and I are not telling the truth.
These images are not of satellites, but are all pictures of the star Vega.
What you have just seen is an example of what Legault calls “Bad Astrophotography,” a phrase Legault uses in homage to Phil Plait and his Bad Astronomy blog. Basically, this means that because of image artifacts or over-processing you can be fooled — intentionally or unintentionally — into seeing something that is not really there.
“In any raw image there is noise and if you process this image too strongly, the noise appears and some processing can transform the noise into something that looks like detail – but it is not detail,” said Legault.
So just like images that have been touted as Bigfoot on Mars, or blurry pictures of supposed UFOs, sometimes astrophotos can look like something they are not.
“Many people are not aware that an image is not reality — it is a transformation of reality,” Legault told Universe Today, “and for any image taken under difficult conditions or close to the resolution limits of the telescope, the image is less and less reliable, and reflects reality less and less.”
Many things can cause problems in astrophotography:
atmospheric turbulence, which can distort images and even create false details or make real ones disappear
the unavoidable shaking of the telescope due to manual tracking, especially in satellite imaging
noise, the random variation of brightness or color in images, caused by the sensor and circuitry of a digital camera or by the diffraction of light in the telescope
These problems may be hard to avoid, depending on your equipment and level of skill. So what should an astrophotographer do?
“The solution for these issues is to be careful with processing,” Legault explained. “I’ve often said the best, most skilled person in imaging processing is not the one that knows all the possibilities of processing, but the person that knows when to stop processing an image.”
Overprocessing
Over-processing, such as multiple smoothing, sharpening and enlargement operations, or layer transformations and combinations in Photoshop can create false details in images.
The issues with the lead image in this article of all the “satellites” — the structures and the different colors you see — are mainly caused by atmospheric turbulence and noise in the raw images, combined with effects from the color sensor in the camera.
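A quick way to convince yourself of this is to sharpen an image that contains nothing but noise. Here is a minimal sketch with NumPy; the crude unsharp-mask kernel and its settings are my own choices for illustration, not Legault's processing chain:

```python
import numpy as np

rng = np.random.default_rng(0)
frame = rng.normal(0.0, 1.0, (64, 64))   # pure noise: no real detail at all

def sharpen(img):
    """Crude unsharp mask: boost the difference between a pixel and a 3x3 blur."""
    pad = np.pad(img, 1, mode="edge")
    blur = sum(pad[i:i + 64, j:j + 64] for i in range(3) for j in range(3)) / 9.0
    return img + 2.0 * (img - blur)

out = frame
for _ in range(3):                       # "multiple sharpening operations"
    out = sharpen(out)

# Each pass exaggerates pixel-to-pixel contrast, so the noise grows into
# bold, blob-like structure that looks like detail but was never there.
print(out.std() > 5 * frame.std())       # → True
```

The input had no structure at all, yet after a few passes the contrast of the "features" has grown several-fold — exactly the trap of transforming noise into apparent detail.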
Atmospheric Turbulence
Think of how, when you look at a star low on the horizon with the naked eye, you see twinkling and sometimes even changes in color; atmospheric turbulence can definitely have an effect on colors.
“When you observe a star through a telescope at high magnification, it can become even more distorted,” Legault said. “You have spikes, distortions and changes in shape, and a star that is supposed to be a point or a disk, unfortunately, by turbulence is transformed into something that is completely distorted and can take many shapes.”
Equipment issues
Additionally, Legault said, combining the distortions with an effect from the camera’s color sensor, whose pixels carry a mosaic of color filters known as a Bayer array, can cause additional issues.
“For the sensor, you have pixels in groups of four: one red, one blue and two green, in a square,” Legault said, “and you can easily imagine that if the object is very small, such as a very small star, the light can fall on a red pixel and then the image can become red. Then the image of the star is distorted and you have some spikes that fall on pixels of a different color.”
And then the processing does the rest, transforming turbulence and camera artifacts into details that may look real, Legault said.
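The effect is easy to reproduce in a toy model. In this sketch the 4×4 patch, pixel values and crude per-channel averaging are my own illustration, not Legault's pipeline:

```python
import numpy as np

def bayer_color(y, x):
    """RGGB mosaic: each 2x2 cell holds one red, two green and one blue pixel."""
    if y % 2 == 0:
        return "R" if x % 2 == 0 else "G"
    return "G" if x % 2 == 0 else "B"

def channel_means(mosaic):
    """Crudely 'demosaic' by averaging each color plane over the patch."""
    planes = {"R": [], "G": [], "B": []}
    for (y, x), value in np.ndenumerate(mosaic):
        planes[bayer_color(y, x)].append(value)
    return {c: float(np.mean(v)) for c, v in planes.items()}

patch = np.zeros((4, 4))
patch[0, 0] = 100.0          # a tiny star whose light falls on a red pixel

print(channel_means(patch))  # → {'R': 25.0, 'G': 0.0, 'B': 0.0}
```

The star's light never touched a green or blue pixel, so after processing it comes out pure red — exactly the kind of false color Legault describes.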
Legault recalled an amateur who, a few years ago, published an image of Saturn’s moon Titan.
“The image contained surface details and a sharp disk edge,” he said, “and looked quite convincing. But we all know that Titan is covered with an opaque and uniform atmosphere, and surface details can’t be seen. The details were actually only artifacts created from noise or other image defects by over-processing a poor resolution image with multiple upsizing, downsizing, sharpening and smoothing operations.”
What’s an amateur astrophotographer to do?
So, with more and more people doing astrophotography these days, how can they make sure that what they think they are seeing is real?
“There are solutions like combining raw images,” Legault said. “When you combine 10 or 20 or 100 raw images, you can decrease the noise and the image is more reliable and less distorted by turbulence.”
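The statistics behind stacking are simple: averaging N frames of independent random noise cuts its standard deviation by roughly √N. A minimal NumPy sketch (the noise level and frame count are arbitrary choices of mine):

```python
import numpy as np

rng = np.random.default_rng(1)
truth = np.ones((32, 32))        # the "real" scene: a plain, featureless surface

def noisy_frame():
    """One raw exposure: the true scene plus random sensor noise."""
    return truth + rng.normal(0.0, 0.5, truth.shape)

single = noisy_frame()
stacked = np.mean([noisy_frame() for _ in range(100)], axis=0)

# The error of the 100-frame stack is about 10x smaller than a single frame's,
# so faint real structure stands out while random "details" average away.
print((single - truth).std() / (stacked - truth).std() > 5)   # → True
```

This is why a detail that survives in the combined image is far more trustworthy than one that appears in a single heavily processed frame.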
For example, take a look at the images of the space shuttle Discovery below. The two left images are consecutive single frames, each processed by smoothing (noise reduction), sharpening (wavelets) and threefold enlargement.
The first and second images, although blurry, seem to show lots of very small details. But when they are compared with each other, or with a combination of the 27 best images of the series (on the right), only the larger structures turn out to be common to all of them.
“The bright line marked A is not real, it is an artifact likely caused by turbulence,” Legault said, “and if it were an image of the space station taken during an EVA, I could perhaps claim that this detail is an astronaut, but I would be wrong. The double dark spot marked B could be taken for windows on top of the cockpit of Discovery. But it is not real; if it were an image of the Space Station, I could claim that it’s the windows of the Cupola, but again I would be wrong. In C, the two parallel lines of the payload bay door are common to both images, but a comparison with the right image, which contains only real details, shows that they are not real and that they are probably a processing artifact.”
One of the drawbacks of color sensors is that there is more noise in the image, so the image is less reliable than with black and white sensors. This is the reason that deep sky cameras often use black and white sensors. And so for imaging satellites like the International Space Station, Legault uses a black and white camera.
“It is more reliable, and you don’t need a color camera because the space station is colorless, except for the solar panels,” Legault said. “In addition, the monochrome sensor is much more sensitive to light, by 3 or 4 times. More sensitive means you have less noise.”
Logical advice
Legault’s main advice is just to be logical about what you are seeing in both raw and processed images.
“You need to look at the whole image, the consistency of the whole image, and not just one detail,” he said. “If I take an image that I say has detail on Jupiter’s satellites and on the same image I cannot even see the great red spot on Jupiter, it doesn’t work – that is not possible. The image must have an overall consistency and include details of an object larger than the one that we are interested in. So, if we see an image where someone is supposed to have an astronaut and a module of the space station, and a larger module is not visible or is completely distorted, there is a problem.”
Another piece of advice is to compare your image to another image taken by someone else — another amateur astrophotographer, a professional or even a space agency.
“If you have a photo of the space shuttle or the space station, for example, you can compare it to a real photo and see if all the details are there,” Legault said.
And if you still have questions about what you are seeing on your own images, Legault also suggests posting your images on astronomy forums so you can get the analysis and insights of other amateur astrophotographers.
“So, there are solutions to make sure that details are real details,” Legault said, “and as you get used to observing raw images and processed images, it will become easier to understand if everything is real, if just a part is real, or if almost nothing is real.”
But Legault’s main advice is not to over-process your images. “Many amateurs take amazing, sharp images using gentle and reasonable processing, so that there are no artifacts.”
For more information and advice from Thierry Legault, see his website, especially the technical pages. Legault has written a detailed article for the March issue of Sky & Telescope on how to image the International Space Station.
Now as the theme from Arthur plays in your head you can enjoy this GIF animation of the ISS passing across the face of a daytime Moon, photographed by Alan Friedman from his location in upstate New York.
I know it’s crazy, but it’s true.
Alan captured these images at 10:30 a.m. EST back on September 2, 2007, and slowed down the animation a bit; in real-time the event lasted less than half a second. (Click the image for an even larger version.)
Atmospheric distortion creates the “wobbly” appearance of the Moon.
Alan Friedman is a talented photographer, printer (and avid vintage hat collector) living in Buffalo, NY. His images of the Sun in hydrogen alpha light are second-to-none and have been featured on many astronomy websites. When he’s not taking amazing photos of objects in the sky he creates beautiful hand-silkscreened greeting cards at his company Great Arrow Graphics.
NOTE: Although this article previously stated that the images were taken Jan. 12, 2012, they were actually captured in September 2007 and re-posted on Jan. 13 of this year. Alan states that he’s since learned how to judge exposure so the ISS doesn’t appear as a streak, but personally he likes (as do I) how this one came out.
Let’s see… September 2007… that would have been Expedition 15!
As a professional astronomy journalist, I read a lot of science papers. It wasn’t all that long ago that I was studying galaxy groups, with dark matter and dwarf galaxies in particular. Imagine my surprise when I learned that two of my friends, who are highly noted astrophotographers, have been hard at work doing some deep blue science. If you aren’t familiar with the achievements of Ken Crawford and R. Jay Gabany, you soon will be. Step inside and let us tell you why “it matters”…
According to Ken’s reports, Cold Dark Matter (or CDM) is the theory that most of the material in the Universe cannot be seen (dark) and moves very slowly (cold). It is the leading theory for explaining the formation of galaxies, galaxy groups and even the current known structure of the universe. One problem with the theory is that it predicts large numbers of small satellite galaxies, called dwarf galaxies. These small galaxies are about 1/1000th the mass of our Milky Way, but they are not observed in anywhere near the predicted numbers. If the theory is correct, then where are all the dwarf galaxies that should be there?
Enter professional star stream hunter, Dr. David Martinez-Delgado. David is the principal investigator of the Stellar Tidal Stream Survey at the Max-Planck Institute in Heidelberg, Germany. He believes the reason we do not see large amounts of dwarf galaxies is because they are absorbed (eaten) by larger galaxies as part of the galaxy formation. If this is correct, then we should find remnants of these mergers in observations. These remnants would show up as trails of dwarf galaxy debris made up mostly of stars. These debris trails are called star streams.
“The main aim of our project is to check if the frequency of streams around Milky Way-like galaxies in the local universe is consistent with CDM models similar to that of the movie,” clarifies Dr. Martinez-Delgado. “However, the tidal destruction of galaxies is not enough to solve the missing satellite problem of CDM cosmology. So far, the best explanation given is that some dark matter halos are not able to form stars inside; that is, our Galaxy would be surrounded by a few hundred pure dark matter satellites.”
Enter the professional team of star stream hunters. The international team of astronomers led by Dr. David Martinez-Delgado has identified enormous star streams on the periphery of nearby spiral galaxies. With deep images, he has shown the process of galactic cannibalism believed to be occurring between the Milky Way and the Sagittarius dwarf galaxy. This is in our own back yard! Part of the work uses computer modeling to show how larger galaxies merge with and absorb smaller ones.
“Our observational approach is based on deep color-magnitude diagrams that provide accurate distances, surface brightness, and the properties of the stellar population of the studied region of this tidal stream,” says Dr. Martinez-Delgado (et al). “These detections are also strong observational evidence that the tidal stream discovered by the Sloan Digital Sky Survey is tidally stripped material from the Sagittarius dwarf, and support the idea that the tidal stream completely enwraps the Milky Way in an almost polar orbit. We also confirm these detections by running numerical simulations of the Sagittarius dwarf plus the Milky Way. This model reproduces the present position and velocity of the Sagittarius main body and presents a long tidal stream formed by tidal interaction with the Milky Way potential.”
Enter the team of amateurs led by R. Jay Gabany. David recruited a small group of amateur astrophotographers to help search for and detect these stellar fossils and their cosmic dance around nearby galaxies, thus showing why there are so few dwarf galaxies to be found.
“Our observations have led to the discovery of six previously undetected, gigantic, stellar structures in the halos of several galaxies that are likely associated with debris from satellites that were tidally disrupted far in the distant past. In addition, we also confirmed several enormous stellar structures previously reported in the literature, but never before interpreted as being tidal streams,” says the team. “Our collection of galaxies presents an assortment of tidal phenomena exhibiting strikingly diverse morphological characteristics. In addition to identifying great circular features that resemble the Sagittarius stream surrounding the Milky Way, our observations have uncovered enormous structures that extend tens of kiloparsecs into the halos of their host’s central spiral. We have also found remote shells, giant clouds of debris within galactic halos, jet-like features emerging from galactic disks and large-scale, diffuse structures that are almost certainly related to the remnants of ancient, already thoroughly disrupted satellites. Together with these remains of possibly long defunct companions, our survey also captured surviving satellites caught in the act of tidal disruption. Some of these display long tails extending away from the progenitor satellite, very similar to the predictions of cosmological simulations.”
Can you imagine how exciting it is to be part of deep blue science? It is one thing to be a good astrophotographer – even an exceptional one – but to have your images and processing be of such high quality as to contribute to true astronomical research would be an incredible honor. Just ask Ken Crawford…
“Several years ago I was asked to become part of this team and have made several contributions to the survey. I am excited to announce that my latest contribution has resulted in a professional letter that has recently been accepted by the Astronomical Journal,” comments Ken. “There are a few things that make this very special. One is that Carlos Frenk, the director of the Institute for Computational Cosmology at Durham University (UK), and his team found that my image of galaxy NGC7600 was similar enough to help validate their computer model (simulation) of how larger galaxies form by absorbing satellite dwarf galaxies, and why we do not see large numbers of dwarf galaxies today.”
Dr. Carlos Frenk has been featured on several television shows, on the Science and Discovery channels among others, explaining and showing some of these amazing simulations. He is the director of the Institute for Computational Cosmology at Durham University (UK) and was one of the winners of the 2011 Cosmology Prize of The Peter and Patricia Gruber Foundation.
“The cold dark matter model has become the leading theoretical picture for the formation of structure in the Universe. This model, together with the theory of cosmic inflation, makes a clear prediction for the initial conditions for structure formation and predicts that structures grow hierarchically through gravitational instability,” says Frenk (et al). “Testing this model requires that the precise measurements delivered by galaxy surveys can be compared to robust and equally precise theoretical calculations.”
And it requires very accurate imaging. According to the team, this pilot survey was conducted with three privately owned observatories, equipped with modest-sized telescopes, located in the USA and Australia. Each observing site features very dark, clear skies with seeing that is routinely at, and often below, 1.5 arcseconds. The telescopes are manufactured by RC Optical Systems and follow a classic Ritchey-Chretien design. The observatories are commanded by on-site computers that allow remote operation and control from any location in the world with high-bandwidth web access, using proven, widely available remote desktop software. Robotic orchestration of all observatory and instrument functions, including multiple target acquisition and data runs, is performed with available scripting software.

A wide-field instrument was additionally employed for those galaxies with an extended angular size. For this purpose they selected the Astro-Physics Starfire 160EDF6, a short focal length (f/7), 16 cm aperture refractor that provides a field of view of 73.7 × 110.6 arcmin. But it’s more than just taking a photograph: the astrophotographer needs to completely understand what must be drawn out of the exposure. It’s more than just taking a “pretty picture”… it’s what matters.
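The quoted field of view is consistent with a full-frame camera on that refractor. A quick check (the 24 × 36 mm sensor size is my assumption; the article gives only the aperture, focal ratio and field):

```python
def fov_arcmin(sensor_mm, focal_mm):
    """Small-angle field of view: sensor size / focal length, in arcminutes."""
    return sensor_mm / focal_mm * 3437.75    # ~3437.75 arcmin per radian

focal_mm = 160 * 7.0     # 16 cm aperture at f/7 -> 1120 mm focal length
print(round(fov_arcmin(24.0, focal_mm), 1))  # → 73.7
print(round(fov_arcmin(36.0, focal_mm), 1))  # → 110.5
```

That lands almost exactly on the 73.7 × 110.6 arcmin the team reports, which is why a short refractor is the instrument of choice for galaxies with large angular extent.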
“The galaxy I want to show you has some special features called ‘shells’. I had to image very deep to detect these structures and carefully process them so you can see the delicate structures within,” explains Crawford. “The galaxy name is NGC7600, and these shell structures have not been captured as well in this galaxy before. The movie above shows my image of NGC7600 blending into the simulation at about the point when the shells start to form. The movie below shows the complete simulation.”
“What is groundbreaking is that the simulation uses the cold dark matter theory, modeling the dark matter halos of the galaxies, and as you can see, it is pretty convincing,” concludes Crawford. “So now you all know why we do not observe lots of dwarf galaxies in the Universe.”
But, we can observe some very incredible science done by some very incredible friends. It’s what matters…
They say necessity is the mother of invention, and if you’ve ever tried to take a picture through a telescope with your iPhone you’ll understand the necessity behind this invention: the AstroClip, an ingenious bit of injection-molded awesomeness that mounts an iPhone 4 onto any standard 1.25″ telescope eyepiece, keeping it stable and centered with the camera lens. I think this is a great idea and would certainly get one… that is, if it actually becomes a reality.
Invented by Boston designer Matthew Geyster, the AstroClip (patent pending) is still in the development stage, awaiting funding to go into production. Injection molding is a “simple but very expensive” process, and in order to get the AstroClip produced, Geyster has put his project up on Kickstarter, a website that lets people pitch great ideas that need funding and gives them a timeline to gather pledges.
If the AstroClip project can accumulate $15,000 in pledges by September 3, it will go into production. At the time of this writing there are 38 days left until then and it’s only 10% toward its goal. I’m hoping that drawing some more attention to this cool idea will help it along!
By becoming a “backer” you can pledge in several denomination categories, ranging from $1 or more to $500 or more. Each category above $25 comes with a “reward” of some sort… these are all listed on the project page.
I think Matthew has a great concept here. The camera on the iPhone 4 is very good and could take some great shots of the Moon and other astronomical objects, were it to just have a secure mount on a telescope lens. I’ve tried to do it without a mount before and really, it’s not easy.
“The AstroClip is designed to be very minimal, while still being fully functional. The clip is very simple and rigid to hold your iPhone 4 steady and securely for the perfect shot. I also added the three adjustment screws that look like they’re meant to be on a telescope. With the simplicity and functionality of the AstroClip you will be taking great photos of outer space in no time at all.”
– Matthew Geyster
Honestly, I have no connection personally with this project or with Matthew… I just think this is something that would be very popular with iPhone users and astronomy enthusiasts. (I don’t even have a telescope… the light pollution in my city is pretty bad.) I just liked the idea so much I wanted to help support it however I could, and Universe Today seemed the perfect place to call attention to it!
If it proceeds, the AstroClip will be entirely produced in the USA. Check it out on Kickstarter by clicking the image above or visit theastroclip.com.
Jason Major is a graphic designer, photo enthusiast and space blogger. Visit his website Lights in the Dark and follow him on Twitter @JPMajor or on Facebook for the most up-to-date astronomy news and images!
Shhhh! Don’t tell anyone, but we’ve got pictures… ground-based pictures of secret spy satellites in Earth orbit. We’re not revealing our sources, but… oh wait, I guess we might as well tell you. Even if we didn’t reveal our source, you’d probably guess that astrophotographer extraordinaire Thierry Legault — who has been sharing his wonderfully detailed ground-based images of the space shuttle and International Space Station with Universe Today — has been working on capturing other satellites in orbit as well. Legault and his partner in imaging crime, Emmanuel Rietsch, have tackled the difficult task of tracking down spy satellites and then following them with a telescope. For imaging the shuttle and ISS, they developed their own design of motorized mount, outfitted with a computer program so it can slowly and precisely rotate to track and follow an object in Earth orbit with a telescope and video camera. Now they are able to image even smaller objects.
Since October 2010, Legault has been using the autoguided mount with the help of a DMK 31AF03 FireWire video camera mounted on the finder (FL 200 mm) and the software Videos Sky, created by Rietsch and then modified by Rietsch and Legault for fast tracking with the Takahashi EM400 mount.
The X-37B spaceplane now in orbit is the second of two Orbital Test Vehicles flown by the US Air Force, launched on March 5, 2011. Reportedly, it will conduct experiments and tests for close to nine months and then autonomously de-orbit and land. Legault and Rietsch were able to image the spaceplane in late May of this year with fairly good results.
“I tried to get help identifying the real orientation of the X-37B,” Legault told Universe Today via Skype today, “but unlike the Keyhole and Lacrosse satellites, it’s not easy, considering its complex shape with several wings.”
And the Air Force isn’t telling.
“Keyhole-class” (KH) reconnaissance satellites have been used for more than 30 years, typically to take overhead photos for military missions. Some of the Keyhole satellites resemble the Hubble Space Telescope, but instead of looking out into space, they look back at Earth. A similar type of spy satellite is the Lacrosse series, which are radar-imaging satellites.
But even with the tracking system, getting images of small satellites is not easy. “Despite this high-performance tracking system and hours of training on airplanes passing in the sky, keeping the spacecraft inside a sensor of a few millimeters, at a focal length of 5000 mm and a speed over 1°/s, requires a lot of concentration and training,” said Legault on his website.
The autoguiding and acquisition are done via a laptop with two hard drives (one of them a Solid State Drive made with flash memory), enabling a tracking precision of about one arcminute.
For security reasons, sighting times for spy satellites are not published on an official website, as NASA does for the shuttle and ISS. But with a bit of digging, Legault said, others can try their luck at spotting these secret satellites.
“Orbital data are in the Calsky database,” Legault told UT, “therefore their passages are forecast as for the ISS. Generally, orbits are determined by amateurs, some of them are specialized in this activity, especially Kevin Fetter (and data are exchanged on the Seesat mailing list, owned by Ted Molczan).”
Legault is well-known for his images of the shuttle and ISS transiting the sun, but he said the accuracy of orbital data for the spy satellites is not sufficient for capturing a solar transit – and besides, these satellites are much smaller than the ISS and would appear as a small dark dot, at best.
“But for nighttime passages the data is sufficient,” Legault said. “Generally they are not visible with the naked eye or barely (except during flares), but they are easily visible with a finder.”
You can follow Universe Today senior editor Nancy Atkinson on Twitter: @Nancy_A. Follow Universe Today for the latest space and astronomy news on Twitter @universetoday and on Facebook.