Accident Damages Mirror on Telescope Slated for Dark Energy Camera

Cracks in the secondary mirror on the Blanco telescope in Chile after an accident on February 20, 2012. Credit: Cerro Tololo Inter-American Observatory


An accident at the Blanco 4-meter telescope at Chile’s Cerro Tololo Inter-American Observatory has severely damaged a secondary mirror. The telescope is currently shut down for installation of the highly anticipated Dark Energy Camera (DECam), and on February 20, 2012, the telescope’s f/8 secondary mirror was dropped during testing, resulting in fractures in the glass at the center of the mirror. Officials at the observatory said they are analyzing the extent of the damage and whether it extends beyond the visible cracks on the surface. They are also reviewing how the accident might affect the DECam installation.

Two staff members were injured during the incident, but are expected to fully recover. According to a post on the CTIO website, the f/8 mirror had been removed to make way for the DECam and was on the dome floor for tests of its focus mechanism. “The mirror and its back end assembly were being transferred to a handling cart to enable the tests. Unfortunately, the mirror was improperly installed on the cart and when the mirror was being rotated on the cart, the entire cart/mirror assembly toppled over injuring two of our technical staff,” said the report.

The mirror itself impacted the dome floor, causing the fractures, pictured above.

At this time, officials say it is not clear whether the mirror is repairable, and they are reviewing what needs to be done to stabilize the cracks. The accident is being investigated. Initially, officials said they didn’t expect the incident to delay the installation and commissioning of the Dark Energy Camera, as the f/8 mirror is not required for the installation or operation of the DECam system. However, a later update said the DECam installation schedule was being modified to allow for the absence of the f/8 mirror.

The Dark Energy Camera will map 300 million galaxies with an extremely red-sensitive 500-megapixel camera, a 1-meter-diameter, 2.2-degree field-of-view prime focus corrector, and a data acquisition system fast enough to take images in 17 seconds.

The CTIO website said they would be providing future updates on the status of the mirror and the DECam installation.

Our previous article about the DECam.

Yes, As a Matter of Fact It IS Rocket Science

Feb. 24, 2012 launch of Atlas V with MUOS-1. Credit: Jen Scheer (@flyingjenny)


On the afternoon of February 24, 2012, at 5:15 p.m. EST, a United Launch Alliance Atlas V rocket lifted off from the pad at Cape Canaveral Air Force Station carrying the US Navy’s next-generation narrowband communications satellite, MUOS-1. After two scrubbed launches the previous week due to weather, the third time was definitely a charm for ULA, and the launch went nominally (that’s science talk for “awesome”).

But what made that day, that time the right time to launch? Do they just like ending a work week with a rocket launch? (Not that I could blame them!) And what about the weather… why go through the trouble to prepare for a launch at all if the weather doesn’t look promising? Where’s the logic in that?

As it turns out, when it comes to launches, it really is rocket science.

There are a lot of factors involved with launches. There’s obviously all the incredible engineering it takes to plan and build a launch vehicle, and of course its payload, whatever it happens to be launching in the first place. But it sure doesn’t end there.

Launch managers need to take into consideration the needs of the mission, where the payload has to ultimately end up in orbit… or possibly even beyond. Timing is critical when you’re aiming at moving targets — in this case the targets being specific points in space (literally.) Then there’s the type of rocket being used, and where it is launching from. Only then can weather come into the equation, and usually only at the last minute to determine if the countdown will proceed before the launch window closes.

How big that launch window may be — from a few hours to a few minutes — depends on many things.

Kennedy Space Center’s Anna Heiney recently wrote an article, “Aiming for an Open Window,” that explains how this process works:

_________________

The most significant deciding factors in when to launch are where the spacecraft is headed, and what its solar needs are. Earth-observing spacecraft, for example, may be sent into low-Earth orbit. Some payloads must arrive at a specific point at a precise time, perhaps to rendezvous with another object or join a constellation of satellites already in place. Missions to the moon or a planet involve aiming for a moving object a long distance away.

For example, NASA’s Mars Science Laboratory spacecraft began its eight-month journey to the Red Planet on Nov. 26, 2011 with a launch aboard a United Launch Alliance (ULA) Atlas V rocket from Cape Canaveral Air Force Station in Florida. After the initial push from the powerful Atlas V booster, the Centaur upper stage then sent the spacecraft away from Earth on a specific track to place the laboratory, with its car-sized Curiosity rover, inside Mars’ Gale Crater on Aug. 6, 2012. Due to the location of Mars relative to Earth, the prime planetary launch opportunity for the Red Planet occurs only once every 26 months.

Additionally, spacecraft often have solar requirements: they may need sunlight to perform the science necessary to meet the mission’s objectives, or they may need to avoid the sun’s light in order to look deeper into the dark, distant reaches of space.

A Delta II arcs across the sky carrying NASA's Suomi NPP spacecraft. Image credit: NASA/Bill Ingalls

Such precision was needed for NASA’s Suomi National Polar-orbiting Partnership (NPP) spacecraft, which launched Oct. 28, 2011 aboard a ULA Delta II rocket from Vandenberg Air Force Base in California. The Earth-observing satellite circles at an altitude of 512 miles, sweeping from pole to pole 14 times each day as the planet turns on its axis. A very limited launch window was required so that the spacecraft would cross the ascending node at exactly 1:30 p.m. local time and scan Earth’s surface twice each day, always at the same local time.

All of these variables influence a flight’s trajectory and launch time. A low-Earth mission with specific timing needs must lift off at the right time to slip into the same orbit as its target; a planetary mission typically has to launch when the trajectory will take it away from Earth and out on the correct course.

According to [Eric Haddox, the lead flight design engineer in NASA’s Launch Services Program], aiming for a specific target — another planet, a rendezvous point, or even a specific location in Earth orbit where the solar conditions will be just right — is a bit like skeet shooting.

“You’ve got this object that’s going to go flying out into the air and you’ve got to shoot it,” said Haddox. “You have to be able to judge how far away your target is and how fast it’s moving, and make sure you reach the same point at the same time.”

But Haddox also emphasized that Earth is rotating on its axis while it orbits the sun, making the launch pad a moving platform. With so many moving players, launch windows and trajectories must be carefully choreographed.

__________________
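
That “once every 26 months” figure is simply the Earth-Mars synodic period: the time it takes Earth to lap Mars in its orbit. Here is a minimal sanity-check sketch, using standard textbook orbital periods (they are not given in the article itself):

```python
# Why Mars launch opportunities recur roughly every 26 months: the
# synodic period is the time for Earth to "lap" Mars around the Sun.
# Orbital periods below are standard textbook values.

EARTH_YEAR_DAYS = 365.25   # Earth's orbital period
MARS_YEAR_DAYS = 686.98    # Mars' orbital period

# 1/T_synodic = 1/T_earth - 1/T_mars (inner planet laps outer planet)
synodic_days = 1.0 / (1.0 / EARTH_YEAR_DAYS - 1.0 / MARS_YEAR_DAYS)

print(f"Synodic period: {synodic_days:.0f} days "
      f"(~{synodic_days / 30.44:.0f} months)")
# -> Synodic period: 780 days (~26 months)
```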

It’s a fascinating and complex set of issues that mission managers need to get just right in order to ensure the success of a launch — and thus the success of a mission, whether it be putting a communication satellite into orbit or a rover onto Mars… or somewhere much, much farther than that.

Read the rest of the article here.

35 Years Later, the ‘Wow!’ Signal Still Tantalizes

The "Wow!" signal. Credit: Wikimedia Commons

Since the SETI program first began searching for possible alien radio signals a few decades ago, there have been many false alarms but also instances of fleeting signals of interest which disappeared again as quickly as they had appeared. If a potential signal doesn’t repeat itself so it can be more carefully observed, then it is virtually impossible to determine whether it is of truly cosmic origin. One such signal in particular caught astronomers’ interest on August 15, 1977. The famous “Wow!” signal was detected by the Big Ear Radio Observatory at Ohio State University; it was thirty times stronger than the background noise but lasted only 72 seconds and was never heard again despite repeated subsequent searches.

In a new book titled The Elusive Wow, amateur astronomer Robert Gray chronicles the quest for the answer to this enduring puzzle.

When the signal was first seen in the data, it was so pronounced that SETI scientist Jerry Ehman circled it on the computer printouts in red ink and wrote “Wow!” next to it. It appeared to fit the criteria for an extraterrestrial radio signal, but because it wasn’t heard again, the follow-up studies required to either confirm or deny this were not possible. So what was it about the signal that made it so interesting?

First, it did appear to be an artificial radio signal, rather than a natural radio emission such as a pulsar or quasar. The Big Ear telescope used a receiver with 50 radio channels; the signal was only heard on one frequency, with no other noise on any of the other channels. A natural emission would cause static to appear on all of the frequencies, and this was not the case. The signal was narrow and focused, as would be expected from an artificial source.

The Big Ear Radio Observatory. Credit: Big Ear Radio Observatory / North American AstroPhysical Observatory / Ohio State University

The signal also “rose and fell” during the 72 seconds, as would be expected from something originating in space. When the radio telescope is pointed at the sky, any such signal will appear to increase in intensity as it first moves across the observational beam of the telescope, then peak when the telescope is pointed straight at it and then decrease as it moves away from the telescope. This also makes a mere computer glitch a less likely explanation, although not impossible.
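
The 72-second rise and fall is exactly what Earth’s rotation should produce as it sweeps a fixed telescope’s beam past a celestial point source. Here is a rough sketch of that transit-time arithmetic; the beam width used is an assumed round number for illustration, not the documented Big Ear figure:

```python
# A fixed transit telescope like Big Ear relies on Earth's rotation to
# sweep its beam across the sky, so a genuine celestial source should
# rise and fall over a predictable interval. The beam width here is a
# placeholder assumption -- vary it to see how the transit time scales.
import math

SIDEREAL_RATE_ARCSEC_PER_S = 360.0 * 3600.0 / 86164.1  # ~15.04"/s at the equator

def transit_seconds(beam_width_arcmin, declination_deg):
    """Time for a point source to drift through the telescope beam."""
    drift = SIDEREAL_RATE_ARCSEC_PER_S * math.cos(math.radians(declination_deg))
    return beam_width_arcmin * 60.0 / drift

# An assumed ~16' beam near the celestial equator gives ~64 s -- the
# right order of magnitude for the Wow! signal's 72-second profile.
print(f"{transit_seconds(16, 0):.0f} s")
```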

What about satellites? This would seem an obvious explanation, but as Gray notes, a satellite would have to be moving at just the right distance and just the right speed to mimic an alien signal, and an orbiting satellite would broadcast its signal repeatedly, so why wasn’t it observed again? What’s more, the signal was observed near the 1420 MHz frequency, a “protected spectrum” in which terrestrial transmitters are forbidden to transmit because it is reserved for astronomical purposes.

There may be a bias in thinking that any alien signals will be like ours, which leak out to space continuously, i.e. all of our radio and TV broadcasts: “normal” radio emissions from everyday technologies which could easily be seen on an ongoing basis. But what if they were something more like beacons, sent out intentionally but only periodically? As Gray explains, radio searches to date have tended to look at many different spots in the sky, but they will only examine any particular spot for a few minutes or so before moving on to the next. A periodic signal could easily be missed completely, or, if seen, it may be a long time before it is seen again.

Of course, it is also possible that any other civilizations out there might not even use radio at all, especially if they are more advanced than us (while other intelligent life might be behind us, as well). A newer branch of SETI is now searching for artificial sources of light, like laser beams, used as beacons.

So where does this leave us? The “Wow!” signal still hasn’t been adequately explained, although various theories have been proposed over the years. Perhaps one day it will be observed again, or another one like it, and we will be able to solve the mystery. Until then, it remains a curiosity, a tantalizing hint of what a definite signal from an extraterrestrial civilization might look like.

More information is available at the Big Ear Radio Observatory website.

Mercury Down Under

MESSENGER wide-angle camera image of Mercury's southern hemisphere.


NASA’s MESSENGER spacecraft, about to wrap up its first full year in orbit around Mercury, captured this view of the planet’s heavily-cratered southern hemisphere on August 28, 2011. Because of its orbit, MESSENGER gets particularly good panoramic views of Mercury’s underside.

Here’s why…

MESSENGER’s orbit, established on March 18, 2011 at 00:45 UTC, is not a simple circling path around the first rock from the Sun. Instead it is highly elliptical, bringing it 124 miles (200 km) above Mercury’s north pole at its closest and more than 9,420 miles (15,193 km) from its south pole at its farthest! (See diagram below.)
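
Those two altitudes are enough to pin down the orbit’s shape and period. Here is a sanity-check sketch, assuming standard values for Mercury’s radius and gravitational parameter (neither appears in the article):

```python
# Rough orbital figures implied by the periapsis/apoapsis altitudes
# above. Mercury's radius and GM are standard values; this is an
# illustrative sketch, not mission-planning code.
import math

R_MERCURY_KM = 2439.7    # mean radius of Mercury, km
MU_MERCURY = 2.2032e4    # GM of Mercury, km^3/s^2

r_peri = R_MERCURY_KM + 200.0    # 124 mi (200 km) above the north pole
r_apo = R_MERCURY_KM + 15193.0   # 9,420 mi (15,193 km) at the far point

a = (r_peri + r_apo) / 2.0                   # semi-major axis
e = (r_apo - r_peri) / (r_apo + r_peri)      # eccentricity
T = 2.0 * math.pi * math.sqrt(a**3 / MU_MERCURY)  # orbital period, s

print(f"e = {e:.2f}, period = {T / 3600:.1f} h")
# -> e = 0.74, period = 12.0 h: highly elliptical indeed
```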

The close approaches over the northern hemisphere allow MESSENGER to study the Caloris basin, Mercury’s largest surface feature and, at over 960 miles (1,550 km) across, one of the largest impact craters in the entire Solar System.

The view of Mercury’s southern hemisphere above features some notable craters as well: the relatively youthful 444-mile (715-km) -wide Rembrandt basin is seen at top right, while the smaller pit-floor crater Kipling can be discerned to its left, just below the planet’s limb.

When craters are larger than 300 km in diameter, they are referred to as basins.

During its 12 months in orbit MESSENGER will have experienced only two days on Mercury! This is because Mercury rotates very slowly on its axis, completing a full solar day (sunrise to sunrise) every 176 Earth days. (And you thought your work day seemed to last forever!)
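
That 176-day solar day falls straight out of Mercury’s slow spin and short year. A minimal sketch, using the standard rotation and orbital periods (not quoted above):

```python
# Where Mercury's 176-day solar day comes from: the spin and orbit
# rates nearly cancel. Both periods are standard values.
sidereal_rotation_days = 58.646   # one spin relative to the stars
orbital_period_days = 87.969      # one Mercury year

# Solar day: 1/T_solar = 1/T_rotation - 1/T_orbit (prograde rotation)
solar_day = 1.0 / (1.0 / sidereal_rotation_days - 1.0 / orbital_period_days)
print(f"Solar day on Mercury: {solar_day:.0f} Earth days")  # -> 176

# A 12-month (365-day) mission therefore spans about two Mercury days:
print(f"{365.25 / solar_day:.1f} Mercury solar days")       # -> ~2.1
```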

Three perspectives of MESSENGER's orbit.

Find out more about the MESSENGER mission here.

Image credit: NASA/Johns Hopkins University Applied Physics Laboratory/Carnegie Institution of Washington. 

See Venus in Daylight This Weekend

Photo of Venus and the Moon taken on Jan 26, 2012. The thin crescent Moon is at the top right and Venus is at the bottom left. Credit: Gadi Eidelheit.


The planet Venus is so bright that, when conditions are right, it can be visible in full daylight. This weekend, and especially on Saturday, February 25, 2012, conditions should be just right for seeing Venus in the daytime. Our friend Gadi Eidelheit sent us his tips: Venus is easier to see when it is far from the Sun and less affected by its glare, so make sure the Sun is blocked by a building or a tree. If you have a clear blue sky in your location early Saturday afternoon, first locate the crescent Moon at about 1 pm local time. At this time, the Moon will be in the southeastern sky, about 60 degrees above the horizon.

When you find the Moon, look a short distance directly below it to find Venus. The planet will appear as a tiny white dot in the sky. You can also use sky maps or internet sites (such as Heavens-Above) to find out where Venus is relative to the Moon.
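
If you’d rather compute the geometry than read it off a chart, a library like astropy can do the lookup for your own site and time. A hedged sketch; the observer coordinates and time below are placeholders to swap for your own location and local 1 pm:

```python
# Sketch: where are the Moon and Venus in my sky right now? Uses
# astropy's built-in ephemeris. The site coordinates and time are
# placeholders, not a recommendation for any particular location.
from astropy.coordinates import AltAz, EarthLocation, get_body
from astropy.time import Time
import astropy.units as u

site = EarthLocation(lon=-80.6 * u.deg, lat=28.4 * u.deg, height=3 * u.m)
when = Time("2012-02-25 18:00:00")  # 1 pm EST expressed in UTC

frame = AltAz(obstime=when, location=site)
moon = get_body("moon", when, site).transform_to(frame)
venus = get_body("venus", when, site).transform_to(frame)

print(f"Moon altitude:  {moon.alt:.0f}")
print(f"Venus altitude: {venus.alt:.0f}")
print(f"Separation:     {moon.separation(venus):.1f}")
```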

If you don’t see Venus during the day, try to see it immediately at sunset; right now, the Moon, Venus and Jupiter are lining up for a triple conjunction at dusk, and with clear skies it will be a great view that is almost impossible to miss!

But for seeing Venus on subsequent days, stand in the same spot where you saw it before, but 20 minutes before sunset. Try to locate Venus a little higher up and to the east of where it was the day before. Do so for several days, each time a little earlier.

You can also use binoculars to locate Venus. Safety first: make sure the Sun is completely blocked and that you cannot accidentally look directly at it through the binoculars! Although Venus is bright, it will not show up in binoculars that are not focused properly, so focus them beforehand (the evening before, say) on Venus and make sure the focus does not change. The binoculars are then ready, and you can use them to find Venus in the day. After you find Venus through the binoculars, try to see it without them.

If you get images of Venus in the daytime or of the triple conjunction, you can submit them to our Flickr page.

If your location does not have clear skies for the triple conjunction, the online Slooh Space Camera will webcast views from observatories around the world, including Arizona and the Canary Islands off the coast of Africa, beginning at 0230 GMT (9:30 pm EST, 6:30 pm PST) both nights this weekend (Feb. 26 and 27). The broadcast can be accessed at Slooh’s homepage: http://events.slooh.com/

Frederick Quintao on Google+ has provided instructions (in Portuguese) for seeing Venus in the daytime!

Rare Amateur Video of Challenger Disaster Surfaces

The Challenger space shuttle a few moments after the rupture took place in the booster. Credit: NASA


A rare home video capturing the explosion of the space shuttle Challenger on Jan. 28, 1986, has been found. Bob Karman and his family were returning from a vacation to Disney World and filmed the launch from the Orlando airport. This is thought to be only the second amateur video of the launch, from a time when home video cameras were just becoming popular.

A Slice of Daybreak

An orbital dawn view from the ISS on Feb. 4, 2012


Here’s a gorgeous view from the International Space Station, taken by the Expedition 30 crew on Feb. 4, 2012 as the station passed into orbital dawn. The greens and reds of the aurora borealis shimmer above Earth’s limb beyond the Station’s solar panels as city lights shine beneath a layer of clouds.

As the ISS travels around the planet at 17,500 mph (28,163 km/h) it moves in and out of daylight, in effect experiencing dawn 16 times every day.
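
That “dawn 16 times every day” figure follows from basic orbital mechanics at the Station’s altitude. A quick sketch, using standard values for Earth’s radius and gravitational parameter:

```python
# Why the ISS sees ~16 dawns a day: at ~240 miles (386 km) up, one
# orbit takes about 92 minutes. Earth radius and GM are standard values.
import math

MU_EARTH = 398600.4   # GM of Earth, km^3/s^2
R_EARTH_KM = 6371.0   # mean radius of Earth, km

r = R_EARTH_KM + 386.0                            # orbital radius, km
T = 2.0 * math.pi * math.sqrt(r**3 / MU_EARTH)    # orbital period, s

print(f"Period: {T / 60:.0f} min -> {86400.0 / T:.1f} orbits (dawns) per day")
# -> Period: 92 min -> 15.6 orbits (dawns) per day
```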

From that vantage point, 240 miles (386 km) above the Earth, the lights of the aurora — both northern and southern — appear below, rather than above.

See this and more images from the Space Station’s nightly flights here.

Also, here’s a time-lapse video made from photos taken by the Expedition 30 crew a few days earlier. Enjoy!

(Video courtesy of the Image Science & Analysis Laboratory, NASA Johnson Space Center.)

RoboScopes – Real Armchair Astronomy

The Faulkes Telescope. Credit: Faulkes Telescope/LCOGT


Using and getting the most out of robotic astronomy

Whilst nothing in the field of amateur astronomy beats the feeling of being outside looking up at the stars, the inclement weather many of us have to face at various times of year, combined with the task of setting up and then packing away equipment on a nightly basis, can be a drag. Those of us fortunate enough to have observatories don’t face that latter issue, but still face the weather and usually the limits of our own equipment and skies.

Another option to consider is using a robotic telescope. From the comfort of your home you can make incredible observations, take outstanding astrophotos, and even make key contributions to science!

The appeal of robotic telescopes for amateur astronomers rests on three factors. The first is that the equipment on offer is generally vastly superior to what the amateur has in their home observatory. Many commercial robotic telescope systems have large-format mono CCD cameras connected to high-precision, computer-controlled mounts with superb optics on top; typically these setups start in the $20,000-$30,000 price bracket and can run up into the millions of dollars.

A look at the Faulkes Telescope South inside. Credit: Faulkes Telescope/LCOGT

Combine that with the usually well-defined, fluid workflows that guide even a novice user through operating the scope and acquiring images, automatically handling things such as dark and flat fields, and the learning curve becomes much gentler; many of the scopes are specifically geared towards early grade-school students.

Screenshot of the Faulkes Telescope realtime interface. Credit: Faulkes Telescope/LCOGT

The second factor is geographic location. Many of the robotic sites are located in places where average rainfall is far lower than, say, the UK or the northeastern United States, with places like New Mexico and Chile in particular offering almost completely clear, dry skies year round. Robotic scopes tend to see more sky than most amateur setups, and as they are controlled over the Internet, you don’t even have to get cold outside in the depths of winter. The beauty of the geographic aspect is that in some cases you can do your astronomy during the daytime, as the scopes may be on the other side of the world.

iTelescope systems are located all over the globe. Credit: iTelescope project

The third is ease of use: all that’s required is a reasonably decent laptop and a solid broadband connection. The only thing you need worry about is your internet connection dropping, not your equipment failing to work. Scopes like the Faulkes or Liverpool Telescopes, which I use a lot, can easily be controlled from something as modest as a netbook or even an Android/iPad/iPhone. The demand for CPU horsepower usually comes down to the image processing after you have taken your pictures.

Software applications like the brilliant Maxim DL by Diffraction Limited, commonly used for image post-processing in amateur and even professional astronomy, handle the FITS file data which robotic scopes deliver. FITS is the format images are usually saved in at professional observatories, and the same applies to many home amateur setups and robotic telescopes. This software requires a reasonably fast PC to work efficiently, as does the other stalwart of the imaging community, Adobe Photoshop. There are some superb free applications which can be used instead of these two bastions of the imaging fraternity, like the excellent DeepSkyStacker and IRIS, along with the interestingly named GIMP, a free alternative to Photoshop.

Some people may say that just handling image data or a telescope over the internet detracts from real astronomy, but it’s how professional astronomers work day in, day out, usually doing data reduction from telescopes located on the other side of the world. Professionals can wait years to get telescope time, and even then, rather than actually being part of the imaging process, they will submit imaging runs to observatories and wait for the data to roll in. (If anyone wants to argue the point, just try doing eyepiece astronomy with the Hubble.)

The process of using and imaging with a robotic telescope still requires a level of skill and dedication to guarantee a good night of observing, be it for pretty pictures, real science, or both.

Location Location Location

The location of a robotic telescope is critical: if you want to image some of the wonders of the Southern Hemisphere, which those of us in the UK or North America will never see from home, you’ll need to pick a suitably located scope. Time of day also matters for access, unless the system allows an offline queue approach, whereby you schedule your observations and simply wait for the results. Some telescopes use a real-time interface, where you literally control the scope live from your computer, typically through a web browser. Depending on where in the world the scope is, you may be at work, or it may be a very unhealthy hour of the night, before you can access your telescope; it’s worth considering this when deciding which robotic system to join.

Telescopes like the twin Faulkes 2-metre scopes, sited atop a mountain on the Hawaiian island of Maui and at Siding Spring, Australia, next to the world-famous Anglo-Australian Observatory, operate during normal school hours in the UK, which is night time where the scopes live. This is perfect for children in western Europe who want to use research-grade professional technology from the classroom, though the Faulkes scopes are also used by schools and researchers in Hawaii.

The type of scope/camera you choose will ultimately determine what you image. Some robotic scopes are configured with wide-field, large-format CCDs connected to fast, low focal ratio telescopes. These are perfect for creating large sky vistas encompassing nebulae and larger galaxies like Messier 31 in Andromeda. For contests like the Astronomy Photographer of the Year competition, these wide-field scopes are perfect for the beautiful skyscapes they can create.

A scope like the Faulkes Telescope North, even though it has a huge 2m mirror (almost the same size as the one on the Hubble Space Telescope), is configured for smaller fields of view, literally only around 10 arcminutes. That nicely fits objects like Messier 51, the Whirlpool Galaxy, but it would take many separate images to cover something like the full Moon (if Faulkes North were set up for that, which it’s not). Its advantage is aperture and immense CCD sensitivity: typically our team can image a magnitude +23 moving object (comet or asteroid) in under a minute, using a red filter too!

A field of view like that of the twin Faulkes scopes, which are owned and operated by LCOGT, is perfect for smaller deep sky objects and for my own interests, comets and asteroids. Many other research projects, such as exoplanets and the study of variable stars, are conducted using these telescopes. Many schools start out imaging nebulae, smaller galaxies and globular clusters, with our aim at the Faulkes Telescope Project office being to quickly get students moving on to more science-based work whilst keeping it fun. For imagers, mosaic approaches can create larger fields, but this obviously takes up more imaging and telescope slew time.

Each robotic system has its own learning curve, and each can suffer from technical or weather-related difficulties, like any complex piece of machinery or electronics. Knowing a bit about the imaging process to begin with helps, as does sitting in on others’ observing sessions on systems like Slooh. Also make sure you know your target’s position and size on the sky (usually in right ascension and declination); some systems offer a “guided tour” mode with named objects. Be ready to move the scope to your target as quickly as possible to get imaging: with the commercial robotic scopes, time really is money.

Global Rent-A-Scope interface

Magazines like Astronomy Now in the UK, and Astronomy and Sky & Telescope in the United States and Australia, are excellent resources for finding out more, as they regularly feature robotic imaging and scopes in their articles. Online forums like cloudynights.com and stargazerslounge.com have thousands of active members, many of whom regularly use robotic scopes and can give advice on imaging and use, and there are dedicated groups for robotic astronomy like the Online Astronomical Society. Search engines will also turn up what is available.

Most of the robotic scopes require a simple sign-up process, after which the user either gets limited free access, usually as an introductory offer, or simply starts to pay for time. The scopes come in various sizes and camera qualities; the better they are, the more you usually pay. For education and school users as well as astronomical societies, the Faulkes Telescope (for schools) and the Bradford Robotic Telescope both offer free access, as does the NASA-funded MicroObservatory project. Commercial operators like iTelescope, Slooh and Lightbuckets provide a range of telescopes and imaging options, with a wide variety of price models from casual to research-grade instrumentation and facilities.

So what about my own use of Robotic Telescopes?

Personally, I mainly use the Faulkes North and South scopes, as well as the Liverpool Telescope on La Palma. I have worked with the Faulkes Telescope Project team for a few years now, and it’s a real honour to have such access to research-grade instrumentation. Our team also uses the iTelescope network when objects are difficult to reach with the Faulkes or Liverpool scopes, though with smaller apertures we’re more limited in our choice of very faint asteroid or comet targets.

After having been invited to meetings in an advisory capacity for Faulkes, late in 2011 I was appointed pro-am program manager, coordinating projects with amateurs and other research groups. On the public outreach side, I have presented my work at conferences and outreach events for Faulkes, and we’re about to embark on a new and exciting project with the European Space Agency, for whom I also work as a science writer.

My use of the Faulkes and Liverpool scopes is primarily for comet recovery, measurement (dust/coma photometry, and soon spectroscopy) and detection work, those icy solar system interlopers being my key interest. In this area, I co-discovered the splitting of Comet C/2007 Q3 in 2010, and worked closely with the amateur observing program managed by NASA for comet 103P, where my images were featured in National Geographic, The Times and BBC Television, and were used by NASA at their press conference for the 103P pre-encounter event at JPL.

The 2m mirrors have huge light grasp and can reach very faint magnitudes in very little time. When attempting to find new comets or recover orbits of existing ones, being able to image a moving target at magnitude 23 in under 30 seconds is a real boon. I am also fortunate to work alongside two exceptional people in Italy, Giovanni Sostero and Ernesto Guido; we maintain a blog of our work, and I am part of the CARA research group working on comet coma and dust measurements, with our results appearing in professional research papers in journals such as the Astrophysical Journal Letters and Icarus.

The Imaging Process

The imaging process really starts before you have access to the scope. Knowing the field of view and what you want to achieve is critical, as is knowing the capabilities of the scope and camera in question and, importantly, whether the object you want to image is visible from the scope’s location at the time you’ll be using it.
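
Field of view is easy to estimate before you book time, from just the focal length and detector geometry. A minimal sketch; the scope/camera numbers below are illustrative assumptions, not the actual Faulkes specification:

```python
# Estimating plate scale and field of view for a scope/camera pairing.
# All three numbers are assumed placeholders -- substitute the specs of
# whichever robotic system you plan to use.
focal_length_mm = 20_000.0   # e.g. a 2 m f/10 system (assumed)
pixel_size_um = 15.0         # CCD pixel pitch (assumed)
pixels = 2048                # detector width in pixels (assumed)

# Small-angle plate scale: 206265 arcsec per radian
plate_scale = 206265.0 * (pixel_size_um / 1000.0) / focal_length_mm  # arcsec/pixel
fov_arcmin = plate_scale * pixels / 60.0

print(f"{plate_scale:.2f} arcsec/pixel, {fov_arcmin:.1f} arcmin field")
# -> 0.15 arcsec/pixel, 5.3 arcmin field
```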

The first thing I would do if starting out again is look through the telescope’s archives, which are usually freely available, to see what others have imaged and how, in terms of filters, exposure times and so on, and then match that against your own targets.

Ideally, given that time will often be costly, make sure that if you’re aiming for a faint deep sky object with tenuous nebulosity, you don’t pick a night with a bright Moon in the sky; even with narrowband filters, this can hamper the final image quality. Check, too, that your choice of scope/camera will in fact capture what you want it to. Remember that others may also want to use the same telescopes, so plan ahead and book early. When the Moon is bright, many of the commercial robotic scope vendors offer discounted rates, which is great if you’re imaging something like a globular cluster, which isn’t as affected by moonlight as, say, a nebula would be.

Forward planning is usually essential: know that your object is visible and not too close to any horizon limits the scope may impose, and ideally pick objects as high up as possible, or rising, to give you plenty of imaging time. Once that’s all done, the imaging process depends on which scope you choose, but with something like Faulkes it’s as simple as selecting the target and field of view, slewing the scope, setting the filter and exposure time, and waiting for the image to come in.

The number of shots taken depends on the time you have. When imaging a comet using Faulkes, I will usually try to take between 10 and 15 images, to detect the motion and give me enough signal for the scientific data reduction which follows. Remember, though, that you’re usually working with vastly superior equipment to what you have at home, and an object that takes hours with your home setup will take far less time with a 2m telescope. A good example: a full-colour, high-resolution narrowband image of something like the Eagle Nebula can be obtained in a matter of minutes on Faulkes, something which would usually take hours on a typical backyard telescope.
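
That “minutes instead of hours” difference is mostly down to collecting area, which scales with the square of the aperture. A one-line sanity check, assuming a typical 10-inch backyard scope for comparison:

```python
# Light grasp scales with mirror area, i.e. aperture squared. The
# backyard aperture below is an assumed typical value, not a quote
# from the article.
big_aperture_mm = 2000.0      # Faulkes-class 2 m mirror
backyard_aperture_mm = 250.0  # assumed 10-inch amateur scope

speedup = (big_aperture_mm / backyard_aperture_mm) ** 2
print(f"~{speedup:.0f}x less exposure time for the same photon count")
# -> ~64x: an hour in the backyard becomes roughly a minute on a 2 m scope
```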

For imaging a non-moving target, the more shots in full colour or with your chosen filter (Hydrogen-alpha being a commonly used one with Faulkes for nebulae), the better. When imaging in colour, the three filters on the telescope itself are grouped into an RGB set, so you don’t need to set up each colour band. I’d usually add a luminance layer from H-alpha if it’s an emission nebula, or maybe a few more red images for luminance if it’s not. Once the imaging run is complete, the data is usually placed on a server for you to collect; after downloading the FITS files, you combine the images using Maxim (or other suitable software) and then move on to something like Photoshop to make the final colour image. The more images you take, the better the signal against the background noise, and hence a smoother and more polished final shot.
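
The “more images means a smoother result” step is the stacking stage. A minimal sketch of a median combine with astropy; the file names are placeholders, and real frames usually need registering (aligning) first:

```python
# Minimal stacking sketch: median-combine calibrated, pre-aligned
# frames. File names are placeholders for illustration only.
import glob

import numpy as np
from astropy.io import fits

frames = [fits.getdata(path).astype(np.float64)
          for path in sorted(glob.glob("target_halpha_*.fits"))]

# Median combining rejects outliers such as satellite trails and
# cosmic-ray hits that appear in only one sub-frame.
stack = np.median(frames, axis=0)
fits.writeto("target_halpha_stack.fits", stack, overwrite=True)

# Rule of thumb: averaging N frames improves signal-to-noise by
# roughly sqrt(N), so 9 subs give ~3x the SNR of a single frame.
```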

Between shots, the only things that will usually change are the filters (unless you’re tracking a moving target) and possibly the exposure time, as some filters need less time to gather the requisite amount of light. For example, with an H-alpha/OIII/SII image you typically expose for much longer in SII, as many objects emit weakly in this band, whereas many deep sky nebulae emit strongly in H-alpha.

The Image Itself

NGC 6302 taken by Thomas Mills High School with the Faulkes Telescope

As with any imaging of deep sky objects, don’t be afraid to throw away poor-quality sub-frames (the shorter exposures which are stacked to make up the final long exposure). These could be affected by cloud, satellite trails or any number of factors, such as the telescope’s autoguider not working correctly. Keep the good shots, and use those to get as good a raw stacked data frame as you can. Then it’s all down to post-processing tools in products like Maxim/Photoshop/GIMP, where you adjust the colours, levels and curves, and possibly use plug-ins to sharpen the focus or reduce noise. If it’s pure science you’re interested in, you’ll probably skip most of those steps and just want good, calibrated image data (dark- and flat-field subtracted, as well as bias).

The processing side is very important when taking shots for aesthetic value. It seems obvious, but many people overdo the image processing, lessening the impact and/or value of the original data. Most amateur imagers spend more time on processing than on actual imaging, though this varies; it can be hours to literally days of tweaking. Typically, when processing an image taken robotically, the dark and flat field calibration are already done. The first thing I do is access the datasets as FITS files and bring them into Maxim DL. Here I combine the images and adjust the histogram, possibly running multiple iterations of a deconvolution algorithm if the star points are not as tight as they should be (maybe due to seeing issues that night).
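
Maxim DL provides its own deconvolution tools; purely to illustrate the underlying idea, here is the textbook Richardson-Lucy iteration in bare-bones form (not Maxim’s algorithm). It assumes you already have the image and a guess at the point-spread function as 2-D arrays:

```python
# Bare-bones Richardson-Lucy deconvolution: iteratively sharpens an
# image blurred by a known point-spread function (PSF). Illustrative
# only -- commercial tools add regularization and noise handling.
import numpy as np
from scipy.signal import fftconvolve

def richardson_lucy(image, psf, iterations=10):
    image = np.asarray(image, dtype=float)
    psf = psf / psf.sum()                     # normalize the PSF
    estimate = np.full_like(image, image.mean())  # flat starting guess
    psf_mirror = psf[::-1, ::-1]
    for _ in range(iterations):
        blurred = fftconvolve(estimate, psf, mode="same")
        ratio = image / np.maximum(blurred, 1e-12)  # avoid divide-by-zero
        estimate *= fftconvolve(ratio, psf_mirror, mode="same")
    return estimate
```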

Once the images are tightened up and stretched, I save them out as FITS files and, using the free FITS Liberator application, bring them into Photoshop. Here, additional noise reduction and contrast/level and curve adjustments are made on each channel; running a set of actions known as Noel’s actions (a suite of superb actions by Noel Carboni, one of the world’s foremost imaging experts) can also enhance the individual red, green and blue channels (and the combined colour one).

Then I composite the images using layers into a final colour shot, adjusting for colour balance and contrast, possibly running a focus-enhancement plug-in and further noise reduction. Finally, I publish the results via Flickr/Facebook/Twitter and/or submit them to magazines, journals or scientific research papers, depending on the final aims and goals.

Serendipity can be a wonderful thing

I got into this quite by accident myself. In March 2010, I saw a posting on a newsgroup that Comet C/2007 Q3, a magnitude 12-14 object at the time, was passing near a galaxy and would make an interesting wide-field side-by-side shot. That weekend, using my own observatory, I imaged the comet over several nights and noticed a distinct change in the tail and brightness of the comet over two nights in particular.

Comet C/2007 Q3. Credit: Nick Howes

A member of the BAA (British Astronomical Association), seeing my images, asked if I would submit them for publication. I decided, however, to investigate this brightening a bit further, and as I had access to the Faulkes that week, I pointed the 2m scope at the comet to see if anything unusual was taking place. When the first images came in and I had loaded them into Maxim DL and adjusted the histogram, I immediately noticed that a small fuzzy blob appeared to be tracking the comet’s movement just behind it. I measured the separation as only a few arcseconds and, after staring at it for a few minutes, decided the comet may have fragmented.

I contacted Faulkes Telescope control, who put me in touch with the director of the BAA comet section, who kindly logged the observation the same day. I then contacted Astronomy Now magazine, who leapt on the story and images and immediately went to press with it on their website. Over the following days, the media furore was quite literally incredible.

Interviews with national newspapers and BBC Radio, and coverage on the BBC’s Sky at Night television show, the Discovery Channel, Radio Hawaii and media in Ethiopia were just a few of the ways the story spread: the news went global that an amateur had made a major astronomical discovery from his desk using a robotic scope. This led on to me working with members of the AOP project and the NASA/University of Maryland EPOXI mission team on imaging and obtaining light curve data for comet 103P late in 2010, which again led to articles and images in National Geographic and The Times, and even to my images being used by NASA in their press briefings alongside images from the Hubble Space Telescope. Subscription requests to the Faulkes Telescope Project went up by hundreds of percent, from all over the world, as a result of my discoveries.

In summary

Robotic telescopes can be fun, and they can lead to amazing things. This past year, a work-experience student I mentored for the Faulkes Telescope Project imaged several fields we’d assigned to her, in which our team then found dozens of new, uncatalogued asteroids; she also managed to image a fragmenting comet. Taking pretty pictures is fun, but the buzz for me comes from the real scientific research I am now engaged in, and it’s a pathway I aim to stay on for the rest of my astronomical lifetime. For students and people who can’t own a telescope due to financial or location constraints, it’s a fantastic way to do real astronomy using real equipment, and I hope that in reading this you’re encouraged to give these fantastic robotic telescopes a try.