Instructionals:

“The universe is not required to be in perfect harmony with human ambition.”

—Carl Sagan

          To tell you everything you'd need to know to both succeed and flourish in this hobby, I'd need to spend years writing, and would wither to a husk hunched over my desk while perfectly beautiful imaging nights passed me by (though I've catalogued the best resources I've collected in the 'Resources' section of this site, for those who truly want to pump the gas!). Instead, this page outlines in broad strokes the general steps and workflows that I use in various aspects of the hobby, and I hope it will provide a useful starting block for aspiring hobbyists.



Choosing the Right Equipment

          The table supplied here gives an idea of what you can expect to find in a given price range for each element of an imaging set-up. First, a couple of disclaimers:

          I've generally bifurcated each entry in this table into two types: one for imaging Solar System Objects (SSOs), and one for imaging Deep-Sky Objects (DSOs). The needs of the two are a Venn diagram with a very small intersection. DSOs tend to demand low focal ratios (i.e. low "f-numbers") and accurate tracking. Bright, commonly imaged SSOs (like the planets out to Saturn) demand high magnifications, and so benefit from large apertures and long focal lengths. Nearly all scopes and many astronomy-specific cameras are specialized for one or the other, and while many can produce some results outside of their primary use case, they won't do it particularly well. Very few can do both well, and those will make your wallet cry. Exceptions to the rule will be mentioned.
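
          For a feel of how these numbers interact, here's a quick sketch (in Python, with made-up but plausible numbers rather than a recommendation) of the two figures that matter most when matching a scope to a camera: the focal ratio and the image scale.

import math

# Illustrative optics: a 150 mm (6") Newtonian with a 750 mm focal length.
aperture_mm = 150.0
focal_length_mm = 750.0

# Focal ratio ("f-number"): focal length divided by aperture.
# Lower f-ratios gather light faster, which is what DSO work wants.
f_ratio = focal_length_mm / aperture_mm  # f/5.0

# Image scale: how much sky each camera pixel sees, in arcseconds.
# 206.265 comes from the small-angle approximation (206265 arcsec per
# radian), with pixel size in microns and focal length in millimeters.
pixel_size_um = 3.76  # a common modern CMOS pixel size
image_scale = 206.265 * pixel_size_um / focal_length_mm  # ~1.03 "/pixel

print(f"f/{f_ratio:.1f}, {image_scale:.2f} arcsec per pixel")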

          Another proviso: I don't mean to imply that something is cheap or an average expense just because it falls in one column or another. I've rated products across the gamut of those that, for what they are, are billed as or commonly considered 'budget' options, 'premium' options, or something in between. Bear in mind that for nearly every aspiration in the hobby there is a way to get there affordably given time, dedication, and/or ingenuity. If you want to buy your way directly to many of those aspirations, it will make your wallet scream (and your brain besides).

"I have no money and I must scream": "I have a little wiggle room": Middle of the road: "Help, my wallet is too fat:" "Wadsworth, tell them I want it in rose gold!"
Optics:
Cameras:
Mounts:
Accessories:



A Planetary Image Capture Workflow:

1. Choose the right time and location.

         See the relevant area of the "Resources" page of this site. For the planets, one doesn't need to be very picky. Good, still "seeing" is worth seeking out, and so is actually being able to see the planet. Planets farther from the Sun than Earth are best imaged near opposition, when their orbits bring them closest to Earth. The interior planets (Mercury and Venus) are only visible at oblique angles, illuminated side-on, through twilit or nearly-twilit skies near their greatest elongations.
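
         To put numbers on why opposition matters, here's a back-of-the-envelope sketch (Python, small-angle math, round-number distances) comparing Jupiter's apparent size at opposition and near conjunction:

import math

AU_KM = 1.496e8           # one astronomical unit in kilometers
JUPITER_RADIUS_KM = 69_911

def angular_diameter_arcsec(radius_km, distance_km):
    """Apparent angular diameter of a sphere, in arcseconds."""
    return math.degrees(2 * math.atan(radius_km / distance_km)) * 3600

# Jupiter orbits ~5.2 AU from the Sun; Earth sits at 1 AU.
opposition = angular_diameter_arcsec(JUPITER_RADIUS_KM, 4.2 * AU_KM)   # ~46"
conjunction = angular_diameter_arcsec(JUPITER_RADIUS_KM, 6.2 * AU_KM)  # ~31"

print(f"Opposition: {opposition:.0f}\", near conjunction: {conjunction:.0f}\"")

That's roughly half again as much apparent diameter, for free, just by picking the right month.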

2. Go to the location with time to spare.

         Being able to set up in daylight is a plus. You will also want time to set up, allow your optics to equalize with the outside temperature, chat with any curious folks who want to see what you are up to, and maybe even impress a cutie (technical term).

3. Collimate your scope (if applicable).

         You may be tempted to do this at home, but know that transportation substantially risks undoing your hard-won alignment.

4. Align your finder scope/sight with your main imaging train.

         Terrestrial targets like street lights and radio towers are great for this, provided they are a great distance away. This should be done through the same train of filters, adapters, etc. that you plan to capture with. It may also be entirely superfluous if you are using a go-to mount of some kind and plate-solving software. At this point, I only attach my Telrad to my optical tube because it helps to balance it (being a classic Newtonian, it is naturally not balanced along its optical axis).

5. Ensure camera settings are correct, achieve focus.

         Capturing planetary data is best done in video at the highest resolution possible, digitally zoomed only if that feature records a true 1:1 pixel crop rather than resampling. Use the Sunlight white balance, the lowest ISO that will work, and have no in-camera corrections applied to the data. With these settings ensured, enter live-view and focus your optics on a medium-brightness star with a Bahtinov mask (or an autofocuser, another focusing mask like a Hartmann mask, etc.). You could just eyeball the focus, but try to get it as tight as possible. If your focuser can lock or tighten down its position, do so.

6. Find your target and begin recording.

         If you're shooting untracked, once the target is located, lead it and let it drift through the FoV without touching the scope. Software will take care of centering the object in post (the program in question is called Planetary Imaging Pre-Processor, or "PIPP"). If you're tracking via go-to and/or polar alignment, follow the procedures in the next tutorial for those steps as well before recording.

         Ideally one should record video, but taking many still exposures can work as well. Note that on a DSLR, the slap of the mirror can cause enough vibration to harm or ruin your exposures, so if you aren't shooting video, you should use the setting that separates the actuation of the mirror and the exposure into separate button-presses, or else introduces a delay of at least a few seconds. Most 'prosumer' models have such a feature somewhere in their menus. As you can imagine, however, this creates a big bottleneck on the data you collect. This is one reason shooting video is preferred on DSLRs, and often in general, provided too much quality isn't lost to the video format. Mirrorless and astronomy-specific cameras mitigate this consideration, of course. No matter what, you want some method that keeps your hands off the instruments during recording, because pressing the shutter button on your camera will introduce vibrations, even if you have the steadiness of a brain surgeon. The solution is a shutter remote that is either wireless or uses a long cable, or else shooting 'tethered' with a laptop and the appropriate software.

         If you are shooting with a monochrome camera and a filter wheel, just repeat this process for each filter, but work fast. Many planets rotate quickly enough that their features visibly shift in a matter of minutes. Derotation software like WinJUPos can account for this, but you need to at least get the color layers of each frame synchronized.
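
         If you're wondering how long an untracked drift pass actually lasts, the sky moves at the sidereal rate of roughly 15 arcseconds per second of time, scaled by the cosine of the declination. A quick sketch with illustrative numbers:

import math

# Illustrative set-up: a sensor spanning ~0.25 degrees (900")
# across its width at the image scale in use.
fov_width_arcsec = 900.0
declination_deg = -5.0  # e.g. a planet near the celestial equator

# Sidereal drift rate across the field, in arcsec per second of time.
drift_rate = 15.041 * math.cos(math.radians(declination_deg))

transit_time_s = fov_width_arcsec / drift_rate  # ~60 seconds
print(f"The target crosses the field in about {transit_time_s:.0f} s")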

7. All done!

          Yep, planetary astrophotos are a breeze. In their simplest form, the only electronics you need are the camera and a remote, and if you are using film for some inscrutable reason, you need no electronics at all. Further down this page, you can read about how to process planetary data.



A Deep-Sky Image Capture Workflow:

1. Choose the right time and location.

         See the relevant area of the "Resources" page of this site. Deep-sky efficacy heavily depends on the following environmental factors: light pollution, both from terrestrial sources and from moonlight/airglow; atmospheric transparency, degraded by clouds and by uncondensed moisture, smog, smoke, etc.; lack of wind, which buffets and shifts the set-up and causes differential flexure; hard ground, to ensure the mount doesn't shift out of alignment; and a low dew point, to prevent fogging and dew on instruments.

2. Go to the location with time to spare.

         While experience will yield faster results, there are far more steps to this set-up and it will take time: often more than an hour. This should also be sufficient time for your optics to equalize with outside temperatures if they aren't extreme; software (linked in my "Resources" section) can help determine the time needed.

3. Adjust your tripod's legs to make the head as level as possible.

          If there is an accessory tray meant to screw in between the three legs, install it: it isn't just for accessories, but for ensuring full deployment of the base and adding rigidity to the structure. Use a spirit level to ensure your EQ mount's tripod is as level as possible, as this will make polar alignment much easier and faster. On a poorly leveled tripod, any adjustment to the altitude or azimuth of the polar axis moves it along both axes to some degree, requiring many more iterations to reach alignment. "Slow is smooth, smooth is fast."

4. Attach the mount head and use the polar alignment sight to "rough in" the polar axis with the North or South Celestial Pole.

         Procedures vary between mounts, so know yours! This primarily concerns my own experience, that is to say, using a GEM in the mid-Northern hemisphere. That entails using a polar scope: turning the declination axis 90 degrees, attaching the counterweight bar and using it as a lever along the right ascension axis to line up Ursa Major and/or Cassiopeia in the reticle, and then placing Polaris in its circle within the reticle using the azimuth and altitude knobs.

5. Attach your counterweight, then attach your optic.

          This order prevents accidental slippage of your unbalanced scope, which could break your equipment and your will to live.

6. Now is a good time to collimate.

          Do it if your scope is the type that requires collimation. Most reflecting telescopes will need this tuned from time to time. Some models will retain collimation better than others. You may not need to adjust anything, but you ought to be checking regardless.

7. Attach your finder/guide scope, your imaging train, your cables, and anything else that adds weight on top. Then, balance your mount.

          For an equatorial mount, good balance is critical for accurate tracking and guiding. First, unlock your RA axis, rotate it until the counterweight bar is parallel to the ground, and slide the counterweight until the axis balances. At this stage, it pays to know your target, or at least which side of the meridian you will be imaging on. The reason is that most equatorial mounts (that aren't very high-end) include a bit of mechanical slop in the gears of their drives. This is intentional: the motor is only so strong, and the tolerances of the machining are only so precise. A little slop lets the drive move freely without straining itself, but it's not good for photography! If you know where your target is, the fix is simple: you want your right ascension balance to be slightly east-heavy, just enough to keep the gears of the RA drive meshed. When this is done, lock the RA axis in this position and unlock the declination axis to check its balance. Correct any imbalance there by sliding the optical tube forwards or backwards in its dovetail clamp before retightening the thumbscrews, and once you return the scope to the home position, check those thumbscrews again. They are a very common source of the differential flexure that is the bane of astrophotography.

8. Complete fine-scale polar alignment.

         This can be done in many, many ways. Some use a fancy instrument like the QHY PoleMaster, some use the dirt-simple manual method of drift-aligning with a reticle eyepiece, others use the indications from their mount's onboard go-to alignment feedback, and others still use feedback from an autoguider camera and a program like PHD2 or SharpCap. I use the latter method, so the first instrument I connect to my laptop and power on is my guide camera. Then I connect to the camera in SharpCap Pro, bring it into focus, and use the Polar Align tool to adjust the alt-az knobs on my mount until I'm within about 10 arcseconds of the north celestial pole.
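
         As a sanity check on how good 'good enough' is, the worst-case declination drift caused by a residual polar alignment error is approximately the error multiplied by the sidereal rate. A rough, hedged sketch of that approximation:

import math

SIDEREAL_RATE_RAD_S = 2 * math.pi / 86164  # Earth's rotation, rad/s

def max_dec_drift_arcsec_per_min(pa_error_arcsec):
    """Approximate worst-case declination drift caused by a polar
    alignment error, in arcseconds per minute of tracking."""
    return pa_error_arcsec * SIDEREAL_RATE_RAD_S * 60

# 10 arcseconds of residual error (a SharpCap-grade alignment):
print(f"{max_dec_drift_arcsec_per_min(10):.3f} \"/min")     # ~0.044 "/min
# 5 arcminutes (a decent polar-scope alignment):
print(f"{max_dec_drift_arcsec_per_min(5 * 60):.2f} \"/min")  # ~1.3 "/min

In other words, an arcsecond-grade alignment makes polar drift negligible even over long subframes, which is exactly why it's worth the extra few minutes.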

9. Power on your mount and bring it fully online.

          Next up, I power on my mount. Procedures vary, but for one like mine (Meade LX85), you use your handbox to tell your mount the date and time. At this point, my mount also likes to go through its go-to alignment before connecting with a computer, so even though I align with plate solving, I let the mount do its thing and tell it what a good job it did. You may actually be using the go-to alignment in your mount's handbox, in which case your workflow should involve having aligned a finder scope, as well as having connected and focused your main camera, by now.

10. Connect your mount, your camera, autoguider, etc. Ensure everything is communicating as it should be.

          This is a stage that is intensely dependent on the hardware and preferences of the user. I connect my mount to the ASCOM Device Hub on my laptop, and then power on Astrophotography Tool ("APT"), before connecting my main camera to this program, and connecting this program to the ASCOM Hub as well. Then I connect my autoguider to PHD2, and that program—as you may have guessed—also connects to the ASCOM Hub. At this stage, I use the controls in APT to actuate the thermoelectric cooling in my camera, and let it cool.

11. Complete any secondary mount calibrations.

          This isn't vital to do every night, but should certainly be done at least once any time you alter your imaging payload. At this stage, I mean executing routines within your mount like drive training. For this, I initiate the process in my mount's handbox (which turns off sidereal tracking/activates terrestrial target-mode). Then I activate my camera's live-view in APT before slewing to a distant terrestrial target, bringing it to focus (I eyeball focus at this stage), aligning my finder-scope, and then executing the drive training procedure in the mount handbox. There are even finer calibrations one can do in this capacity, like fine tuning the drive ratios, etc. but they ought to be done in the daylight, and aren't vital unless your mount is really confused.

12. Complete Go-To alignment.

          If you're using this guide, your mount almost certainly has go-to pointing software, which is an enormous boon. Like polar alignment, there are many ways to achieve this. My workflow handles it now: I slew to an arbitrary patch of sky and focus just enough that stars look crisp in the preview. This focus is also eyeballed, because my next step is to plate-solve the preview with ASTAP (integrated with APT), sync the mount, and then use APT's Pointcraft tool to select a medium-brightness star and tell the mount to slew there.

13. Fine tune focus of your main camera.

         Once the bright star selected in the previous step is centered, I affix my Bahtinov mask and use the live-view on my camera along with APT's Bahtinov Aid tool to reach ideal focus.

14. Slew to your target.

         Since this workflow had me pick my target back at the balancing step, I simply plate-solve and sync the mount, then find the object in APT's Pointcraft and tell it to slew there. Once it's done slewing, I check to make sure the object is centered, either with more plate-solving or with short exposures that show reference stars and a slow slew speed.

15. Activate and calibrate autoguiding.

          At this point, I open PHD2 and tell it to find reference stars (the multi-star guiding in the recent development builds makes a huge difference!) before telling it to calibrate. This takes a minute or so, so at this point I may get more coffee. Once the calibration is done, I watch the autoguiding graph for a bit to ensure that it's operating smoothly. At this stage I also turn on the PEC training on my mount and tell it to start training. Note that if you also use an LX85, PEC training will not work unless the firmware is fixed. I don't recommend this mount; it needs a lot of highly technical TLC to get it working right.

16. Capture your light frames.

         At this point you should first double-check that all possible points of unintended mechanical flexure are tightened fully, and that your camera is set to the best possible settings. Shooting a whole night of footage and finding out it's unusable in post-processing is a private embarrassment that will age your inner child by at least 20 years (rough years). That is to say: use Daylight white balance, RAW format encoding, and an ISO/exposure-time combination that uses the lowest ISO and the longest exposures needed to achieve an optimal signal-to-noise ratio. This can be a complicated question to deduce from theory, but in practice it can be at least approximately found with test exposures and a bit of 'the knack'. You should aim to capture the most integration time you're comfortable with. While the returns diminish after an extended period, you can generally assume that more is better.
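
         The reason more integration time is better comes down to how stacking beats down random noise: the signal adds linearly with the number of subframes while the random noise grows only as the square root, so SNR improves as the square root of total integration time. A minimal sketch of that rule of thumb (the numbers are placeholders, not measurements):

import math

def stacked_snr(single_frame_snr, n_frames):
    """SNR after averaging n frames dominated by random noise."""
    return single_frame_snr * math.sqrt(n_frames)

snr_one = 3.0  # an assumed SNR for a single light frame
for n in (1, 4, 16, 64):
    print(f"{n:3d} frames -> SNR ~{stacked_snr(snr_one, n):.1f}")

# Doubling the SNR always costs 4x the frames: diminishing returns,
# but the returns never stop entirely.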

17. Capture your in situ calibration subframes.

         In short, this involves first shooting "dark frames" with the exact same settings, but with your lens cap on and all diligence done to ensure light doesn't leak in any other way (such as through the back of a mirror cell). More is better, but my rule is no fewer than 25 dark frames. Then you take your "flat frames", which you shoot while pointing the reopened optics at a flat, white light source, with exposure adjusted for even metering. This surface can be a tablet screen, an artist's tracing light-board, or the bright morning sky with a white cloth stretched over the aperture for diffusion. I aim for a very large number of these (at least 100) because the exposure times are very short and they can be taken about as fast as most cameras can write the data. The same is true of "bias" or "offset" frames. These measure the fixed pattern imposed by the sensor's own read-out electronics so that it can be subtracted during calibration (after stacking, that pattern can become very apparent). This is done by taking exposures with the lens cap on once more, with all settings the same except the exposure time, which should be as short as possible: generally around 1/4000th of a second. I like to take at least 250, because again, it doesn't take long to make these. You can also capture them whenever, as long as you match the settings up right.
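
         The arithmetic all of these frames feed into is worth seeing once. Here's a minimal numpy sketch of standard frame calibration (real stacking software also aligns, weights, and rejects outliers; the array names here are placeholders):

import numpy as np

def calibrate(light, master_dark, master_flat, master_bias):
    """Classic calibration: remove the dark signal, then divide by a
    normalized flat to even out vignetting and dust shadows."""
    light = light.astype(np.float64)
    # The master dark (taken at the light frames' settings) already
    # contains the bias signal, so it is subtracted whole.
    dark_subtracted = light - master_dark
    # The flat needs its own bias removed before normalization.
    flat = master_flat.astype(np.float64) - master_bias
    flat /= flat.mean()
    return dark_subtracted / flat

# Each "master" is itself the average of a pool of subframes, e.g.:
# master_dark = np.mean(np.stack(dark_frames), axis=0)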

18. Pack it in, go home, and get some sleep!

         Hopefully you aren't an obsessive masochist like me, and can allow yourself to sleep without processing the data first. It will be there in the morning... err it will be there whenever you wake up!




A Planetary Image Processing Workflow:

While image processing seems complicated at first, it is ultimately the easiest part. Mistakes are reversible, and processes are repeatable.

1. First, download your data to your computer.

         This is likely to be a very large collection of files, and your computer will need a storage buffer/cache in addition, so ensure that you have at least twice the size of the data in free storage. While your computer doesn't need to be a powerhouse, a nice PC can make a tremendous difference in stacking/processing speed, and an actively weak computer may choke on the processing. I use Windows 10, and my processing experience mainly extends to this OS. My recommended processing regimen for Mac users involves processing their machine itself in an industrial baler before building or buying a Windows computer. Linux users should be able to process the data directly in their alien-robot brains.

2. Open the video file(s) in Planetary Imaging Pre-Processor.

         This free program is the reason I insist that tracking mounts are truly optional for planetary work. The program combs through each frame of your video file(s), solo or en bloc, detecting your planetary subject and then optionally centering, cropping, resizing, color-correcting, and sorting it for clarity, before spitting back out as many or as few of the best frames as you like, in video, GIF, or image format. One should opt for the highest-fidelity output they can get, which for many cameras shooting video is plain 8-bit .AVI. For this method, known as "lucky imaging", the sheer number of frames makes up for the lost color depth.
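
         If you're curious what 'sorting for clarity' means under the hood, a common proxy for frame sharpness is the variance of an image's Laplacian: blurrier frames carry less high-frequency energy. A sketch of the idea (my illustration, not PIPP's actual code):

import numpy as np
from scipy import ndimage

def sharpness(frame):
    """Variance of the Laplacian: higher means more fine detail."""
    return ndimage.laplace(frame.astype(np.float64)).var()

def best_frames(frames, keep_fraction=0.25):
    """Rank frames by sharpness and keep the top fraction."""
    ranked = sorted(frames, key=sharpness, reverse=True)
    return ranked[: max(1, int(len(ranked) * keep_fraction))]

# 'frames' would be a list of 2-D numpy arrays decoded from the video.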

3. Open PIPP's output file in AutoStakkert3.

         For the actual weighting, alignment, and stacking of planetary images, this program does the finest job by far. With just a little bit of tweaking, perhaps locating or refining points of contrast on the slideshow of frames, this program combines the best parts of the best images into a startlingly crisp planetary image.

4. Open AutoStakkert3's output file in Registax 6.

         While its stacking produces inferior results to AutoStakkert3's, Registax 6 includes a conveniently collected suite of post-processing features that go a long way towards refining the clarity of stacked planetary images. Namely, the Wavelets transform feature (with the "Wavelet scheme" set to 'Dyadic (2^n)' and the "Wavelet filter" set to Gaussian, adjusting the level of each layer while tweaking the 'denoise' and 'sharpen' coefficients) yields much sharper images, the "Denoise/Deringing" applet deals with bloom, and the "RGB align" tool corrects for chromatic aberrations. Nothing that can't be done in other software, but it's a handy, lightweight interface for these tools.
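
         Wavelet sharpening can feel like a black box, but the core idea is simple: split the image into detail layers at different spatial scales (differences between progressively blurred copies), boost the layers you like, and add everything back up. A rough sketch of that idea, not Registax's exact algorithm:

import numpy as np
from scipy import ndimage

def wavelet_sharpen(img, gains=(1.6, 1.3, 1.1), sigma0=1.0):
    """Boost fine-to-coarse detail layers by the given gains."""
    img = img.astype(np.float64)
    layers, current = [], img
    for i in range(len(gains)):
        blurred = ndimage.gaussian_filter(current, sigma0 * 2 ** i)
        layers.append(current - blurred)  # detail at this scale
        current = blurred                 # residual for the next scale
    sharpened = current  # the coarse residual
    for layer, gain in zip(layers, gains):
        sharpened += gain * layer  # gains of 1.0 reconstruct the original
    return sharpened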

5. Complete final adjustments in Topaz DeNoise AI and GIMP.

         These two programs can accomplish a great deal to intelligently filter out any remaining noise and to adjust the levels of the image into something truly polished. Use a light touch in Topaz as a rule with astrophotos: it's an aggressive program, and what looks good in a thumbnail destroys detail and creates ugly super-pixels close up. You want the bare minimum needed to *reduce* noise while preserving the fine details.




A Deep-Sky Image Processing Workflow:

I've tackled such images in a number of ways, but this is an example of steps one might take.

1. First, download your data to your computer.

         I really hope this isn't a surprise to the reader.

2. Open the raw files in DeepSkyStacker, and let it work.

         While one can accomplish everything this program does better in PixInsight, in practice DSS does at least as good a job as many people's best efforts in PI, and it automates the processes. The user uploads their RAW image files by type: light frames, followed by the three calibration frame pools mentioned in my DSO capture tutorial: dark, flat, and bias frames. (Optionally, one may want to capture the bias signal with dark-flat frames instead; this is preferred for a number of one-shot-color CMOS sensors. The theory is covered elsewhere, but these are basically shot through a closed optic like dark frames, only at the same exposure time and other settings used for your flat frames. You could also think of them as longer bias frames.) Then you click "Select all" and "Register all" to see the final dialog. This allows a lot of important tweaks, but the program will suggest the ideal settings for your image set (though you can and should learn more about these settings; computers lack intuitive judgement and make mistakes). For the purposes of this tutorial, you just need to make sure that the star detection picks up at least ~30 stars so the images stack accurately, and follow the program's recommendations.
         What DSS puts out is the result of combining each pool of calibration data into a 'master' calibration frame. These masters are used to calibrate each light subframe (subtracting the dark and bias signal, then dividing by the normalized flat) before the calibrated light frames are aligned and overlaid, or "stacked". Greater weight is given to those subframes with clearer focus and tracking (under most preferred settings profiles), and where the images agree, the values are kept while outliers are rejected. What results is a single, large image file with a much higher signal-to-noise ratio than any individual subframe. On a technical level, it is a far superior file, but it's far from good-looking, yet. This uncompressed TIFF file is not only ungainly in size, but you'll likely also notice that it looks like garbage! That's because the data is displayed linearly. It's all there, but it will require adjustment of its levels and resizing to make all of that precious signal appear vividly, intelligibly, and attractively to a person (and to effectively inhabit the shallower bit-depth of color space used by most image filetypes and the monitors built around them).
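
         The "where the images agree, the values are kept" part is usually some flavor of sigma-clipped averaging: per pixel, values far from the stack's consensus (satellite trails, cosmic-ray hits, passing planes) are thrown out before the mean is taken. A bare-bones numpy sketch of the idea:

import numpy as np

def sigma_clipped_stack(frames, kappa=2.5):
    """Average aligned frames per pixel, ignoring outliers more than
    kappa standard deviations from the median (e.g. satellite trails)."""
    cube = np.stack([f.astype(np.float64) for f in frames])
    median = np.median(cube, axis=0)
    sigma = np.std(cube, axis=0)
    mask = np.abs(cube - median) <= kappa * sigma
    # Masked mean: sum of surviving values over their count per pixel.
    return (cube * mask).sum(axis=0) / np.maximum(mask.sum(axis=0), 1)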

3. Carry out any processes that require linear data.

         This is generally a step at higher levels of mastery, using software like PixInsight or Siril. Processes at this stage may include, but are not limited to: deconvolution sharpening, color calibration, background/gradient removal, and denoise processes like ACDNR.

4. Use a program to perform a "non-linear stretch" of the data.

         Many programs are capable of this, including PixInsight, RawTherapee, Photoshop, and others. For this process I use GIMP, which in most respects is a free version of Photoshop in use-case and toolset. A non-linear stretch in GIMP involves selectively applying gain and suppression across the whole gamut of pixel data, or "histogram", in order to give the greatest share of its range (and thus contrast/definition) to the values that express the signal from our astronomical object. In my "Resources" section, I link to a complete tutorial on non-linear stretching. Eventually, the data can't be stretched any more without becoming unintelligible again, as it is lost to the limits of the color depth available to stretch. The received wisdom is that no process will take you much beyond this point, and while machine learning may allow you to dither out a longer stretch, it is a very destructive process, at least as likely to destroy details as to tease out others. Best to avoid the shrink-wrapped nebulas that some hobbyists produce.
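
         For the curious, the classic 'screen stretch' used by PixInsight-style tools is a midtones transfer function: a curve that pins black at 0 and white at 1 while lifting the dim midtones where the faint signal lives. A minimal numpy sketch (the parameter values are illustrative):

import numpy as np

def mtf_stretch(img, midtones=0.05):
    """Midtones transfer function on data normalized to [0, 1].
    Pixels at the 'midtones' level are mapped to 0.5; smaller values
    of 'midtones' mean a more aggressive stretch of the faint end."""
    x = np.clip(img.astype(np.float64), 0.0, 1.0)
    m = midtones
    return ((m - 1.0) * x) / ((2.0 * m - 1.0) * x - m)

# Example: after normalizing a linear stack to [0, 1],
# stretched = mtf_stretch(linear / linear.max(), midtones=0.02)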