Level of details in prime focus vs eyepiece images

In summary: you will get poorer results if you are not stacking images. I suggest watching a YouTube video or two on how to take a short movie of the Sun and stack the frames within it; there are multiple free tools for the job. The final item on your list is really the matching of the chip's pixel size to the image size, but look at the other points first, as they are the most significant, and you should then be able to get some decent results.
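To illustrate why stacking helps, here is a minimal NumPy sketch (fully synthetic data, not a real solar movie): averaging N frames of the same scene cuts the random noise by roughly √N.

```python
import numpy as np

rng = np.random.default_rng(0)
truth = np.full((64, 64), 100.0)   # idealized, noise-free solar detail
frames = [truth + rng.normal(0, 10, truth.shape) for _ in range(100)]

stacked = np.mean(frames, axis=0)  # the "stack" step

# Averaging N frames cuts random noise by roughly sqrt(N).
single_noise = np.std(frames[0] - truth)
stack_noise = np.std(stacked - truth)
print(round(single_noise / stack_noise, 1))  # ~10 for N = 100
```

The real tools also align (register) each frame before averaging; the noise argument is the same either way.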
  • #176
Devin-M said:
the max available space on this website
I don't understand how the way a particular website displays images is of any consequence to real photography. If you want to show people your images in all their glory, then you send them your own large files. The vagaries of a website just can't be trusted, so why bother with it if quality is important?

There is a phrase "Procrustean bed" which applies here, I think.

Plus, I would love to know how to access that drop-down of resolution choices.
 
  • #177
It's necessary to know the final display resolution, so that the actual pixels don't get resized to a lower resolution when you upload, if you want the highest possible angular resolution from the sensor to your eyeballs (this is called a 100% crop). You would lose resolution if you chose a 1000px-wide area and tried to upload it for an 800px-wide final display.

After selecting the crop tool from the menu on the left, select a ratio from the drop down menu circled in red.

You'll find that the pixels won't resize in this operation. Since I've selected an area that's the same width as the final display width, I'll get each pixel on the sensor displaying on one pixel on the final display on this web page, which theoretically gives the best resolving power from sensor to eyeball.
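The "no resizing" property of a 100% crop is easy to demonstrate: in array terms a crop is pure slicing, so no interpolation ever happens. A sketch with synthetic sensor data (the array size here is arbitrary and smaller than a real frame):

```python
import numpy as np

# Small stand-in for sensor data (a real D800 frame is 7360 x 4912 px).
sensor = np.arange(1200 * 2000, dtype=np.uint32).reshape(1200, 2000)

# A "100% crop" is pure array slicing: no resampling, so every output
# pixel is an untouched sensor pixel (in fact a view, not even a copy).
y, x = 100, 500
crop = sensor[y:y + 620, x:x + 800]

print(crop.shape)                            # (620, 800)
print(bool(np.shares_memory(crop, sensor)))  # True
```

Contrast this with any resize operation, which must compute new pixel values by interpolating between old ones.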

ratio-crop.jpg


800px2.jpg
 
  • #178
By 4am I’d been wiping dew off 3 lenses every 20 minutes for 8 hours… I think I ended up with 3 targets and over 20 hours of total exposure over the course of 9 hours… I depleted 2 full sets of batteries last night, so I’m still shooting dark frames in the refrigerator this morning… Still have loads of processing to do until I have something to show for it…

8D9FE5DA-38B7-4BED-8A6A-D94B092A6B22.jpeg

C1427BB4-8767-4E68-ADD0-E35F7E517923.jpeg

DE79D137-249F-4012-A9AF-3DD1F638320E.jpeg

38ED4BE3-7CB2-470F-9A56-0636B9D9E638.jpeg
 
  • #179
Flying Bat Nebula - Ha-RGB Composite - 300mm f/4.5 on 35mm sensor
12x20min (4 hrs) 6nm Ha Filter @ 6400iso
60x1min (1hr) RGB (no filter) @ 3200iso

flying-bat-ha-rgb2.jpg


100% Crop

flying-bat-ha-rgb-100pc-crop-1-jpg.jpg


Orion + Assorted Nebulas Ha Filter - 24mm f/2.8 on a 35mm sensor
12x5min (1hr) @ 6400iso

orion_ha-jpg.jpg


100% crop

orion_ha-crop-1-jpg.jpg
 
  • #180
4182784.png

4182784-1.png

5883434-1.jpeg

5883434.jpeg

 
  • #181
Devin-M said:
You'll find that the pixels won't resize in this operation.
That's true, because that's what the crop operation does. The problem is that areas of images often need to be scaled at different rates: dealing with, or introducing, spatial distortions must always involve non-integer sample-rate changes. Although simple linear scaling can be used for many astro images, you can only get away with it for long-focus lenses if you want to stitch images together, because of barrel and pincushion distortions in the overlap.
However, I wonder how relevant the image-quality deterioration can be when dealing with real images of any astro object. Stacking will add jitter and reduce such problems which, to be honest, should only be noticeable when the interpolation filter doesn't use enough nearby pixels.

I do acknowledge, however, that if you are aiming to get the best from a website which uses crude up- or down-scaling, then you need to provide it with images of the correct pixel count and dimensions.

I can't find much about the process of image sampling on Google; there's the usual hole in the information, with descriptions of how to use Photoshop etc. but not what it actually does (that's worth a lot of money to them) and, at the other extreme, applications of pixel processing for specific purposes such as facial recognition. I have no access to appropriate textbooks, and searching for information at that level is a real pain. But sampling theory (multidimensional) does tell us that, subject to noise, the original (appropriately filtered) image can be reconstructed perfectly with the right filtering. You only need to look at the best up-scaled TV pictures to see how good things can be.
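The sampling-theory claim can at least be demonstrated in one dimension: a band-limited signal sampled above its Nyquist rate is recovered exactly by ideal (sinc) interpolation, implemented here via FFT zero-padding (a toy sketch, not an imaging pipeline):

```python
import numpy as np

# A band-limited 1-D "image row": 3 cycles of a cosine across 64 px.
n = 64
x = np.arange(n)
signal = np.cos(2 * np.pi * 3 * x / n)

# Sample every 4th pixel -- still well above Nyquist for this signal.
coarse = signal[::4]

# Ideal (sinc) interpolation back to 64 samples via FFT zero-padding.
spec = np.fft.fft(coarse)
half = len(coarse) // 2
padded = np.zeros(n, dtype=complex)
padded[:half] = spec[:half]
padded[-half:] = spec[-half:]
recon = np.real(np.fft.ifft(padded)) * (n / len(coarse))

# Reconstruction error is at floating-point level: "perfect" recovery.
print(float(np.max(np.abs(recon - signal))) < 1e-9)  # True
```

The catch in practice is exactly the one mentioned above: real interpolation filters use only a few nearby pixels, so they only approximate the ideal sinc.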
 
  • #182
Here's an easy way to measure what your sensor can theoretically do... In my case I think I'm more limited by the lens than the sensor.

I took an RGB source image from last month which is 600mm f/9.

100pc-crop.jpg


Next I did a 100% ratio crop of an 800w x 620h patch from the RAW file, so the pixels have not been resized and the pixel dimensions are the same as the maximum allowed on this web page. That means I should be getting exactly 1 pixel from the sensor for every pixel displayed on this webpage...

600mm-f9-100pc-800w-d800.jpg


Next I uploaded the image to http://nova.astrometry.net/upload ...

4184584.png


4184584-1.png


4184584-2.png


5885661-1.jpeg


5885661.jpeg


Now I have the measurement of the sensor's arcsec/pixel capability with the 600mm f/9 lens fitted...

4184584-3.jpg


Center (RA, Dec): (314.139, 44.664)
Center (RA, hms): 20h 56m 33.411s
Center (Dec, dms): +44° 39' 51.833"
Size: 22.4 x 17.3 arcmin
Radius: 0.236 deg
Pixel scale: 1.68 arcsec/pixel

The lower the arcsec/pixel, the finer the detail one can resolve, assuming a perfect lens...
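For what it's worth, that measured scale can be cross-checked from first principles with the standard plate-scale formula, scale ≈ 206.265 × pixel pitch (µm) / focal length (mm). The 4.88 µm pitch below is derived from the D800's published 35.9 mm sensor width and 7360-pixel rows, so treat it as my assumption about the camera in use:

```python
def plate_scale(pixel_pitch_um: float, focal_length_mm: float) -> float:
    """Arcseconds of sky per pixel: 206.265 * pitch(um) / focal length(mm)."""
    return 206.265 * pixel_pitch_um / focal_length_mm

# Nikon D800: 35.9 mm sensor width over 7360 pixels -> ~4.88 um pitch.
pitch = 35.9 / 7360 * 1000

print(round(plate_scale(pitch, 600), 2))  # 1.68 -- matches astrometry.net
print(round(plate_scale(pitch, 300), 2))  # 3.35 -- vs. 3.34 measured later
```

The close agreement with the astrometry.net solve is a nice sanity check that the plate solution is trustworthy.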
 
  • #183
I need a better lens before I need a better sensor...
 
  • #184
Devin-M said:
Now I have the measurement of the sensor's arcsec/pixel capability...
If you use a different lens, the arcsec/pixel figure will change. I think that's why it's normal to specify pixel size and the number of pixels along each axis; those quantities are more 'portable'.
Devin-M said:
I need a better lens before I need a better sensor...
With your system, that's 3x the cost. Ouch!
 
  • #185
sophiecentaur said:
With your system, that's 3x the cost. Ouch!
My optical tube assemblies are only worth about $316 USD each (used - Nikon 300mm f/4.5 $161 & Nikon TC-301 2x teleconverter $155)

I think the OP could estimate whether it’s better to shoot through the eyepiece or prime focus by measuring the arcsec/pixel of both options and then also counting how many pixels wide the stars are with both options.
 
  • #186
I did a ratio crop of a single dim star at 600mm f/9 so the pixels from the sensor weren't resized, and then I did an interpolation-free enlargement using "nearest neighbor" as the resampling algorithm to 620px height...

single-star-3.jpg


Then I counted how many pixels wide a dim star was (14 pixels)...

single-star-600mm-f:9-14px-star.jpg
So now I know something about how good the sensor is at a given focal length (in this case 1.68 arcsec/pixel at 600mm) and I know how good the lens is -- stars which should only cover a single pixel have a radius of 7 pixels, so I think the lens needs to be around 7x sharper before I could get more detail with a denser sensor...
 
  • #187
I think if you multiply the dim-star pixel radius by the arcsec/pixel and then test 2 different options, whichever option gives the lower number should be the one getting you more detail...

In the test above I got 7 px star radius x 1.68 arcsec/pixel for the sensor/lens combo = ~11.7 arcsec of effective resolution, considering the flaws in the optical tube.
 
  • #188
For comparison, on my most recent shoot of the Flying Bat Nebula, I used the same 300mm f/4.5 lens, but I didn't use the TC-301 2x teleconverter...

100% ratio crop 800w x 620h
flying-bat-ha-rgb-800x620-crop.jpg


4192067.png


4192067-1.png


4192067-2.png


5894566-1.jpeg


5894566.jpeg


Which gives:

Center (RA, Dec): (317.713, 60.755)
Center (RA, hms): 21h 10m 51.124s
Center (Dec, dms): +60° 45' 18.476"
Size: 44.6 x 34.5 arcmin
Radius: 0.470 deg
Pixel scale: 3.34 arcsec/pixel

So I know I'm getting 3.34 arcsec/pixel projected onto the sensor...

Now I enlarge a dim star to 620h with no interpolation / "nearest neighbor" --

flying-bat-ha-rgb-single-star.jpg


The star radius is about 3 pixels...

If I multiply the 3 px star radius by 3.34 arcsec/pixel, I get around 10.0 arcsec of effective blur. That's slightly better than the ~11.7 arcsec I get from the same test with the 2x teleconverter added, which makes me think it probably isn't worth using the 2x teleconverter to try to get better resolving power on a target.
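The comparison rule in these posts reduces to a one-liner; a sketch using the star radii and pixel scales measured above (note the product has units of arcseconds of blur, not arcsec/pixel):

```python
def effective_blur_arcsec(star_radius_px: float, plate_scale: float) -> float:
    """Star blur radius on the sky: pixel radius times arcsec/pixel.
    The result is in arcseconds (a blur size), not arcsec/pixel."""
    return star_radius_px * plate_scale

with_tc = effective_blur_arcsec(7, 1.68)     # 600mm f/9 (300mm + 2x TC)
without_tc = effective_blur_arcsec(3, 3.34)  # bare 300mm f/4.5

print(round(with_tc, 2), round(without_tc, 2))  # 11.76 10.02
```

The smaller number wins, which is why the bare 300mm comes out slightly ahead of the teleconverter combination here.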
 
  • #189
I just ordered a 6" 1800mm focal length f/12 Meade LX85 Maksutov-Cassegrain OTA for $779 USD... we'll have to see what it does when it shows up in the mail...

meade-lx85-m6-ota-1_copy.jpg
 
  • #190
I received the 1800mm f/12 today… I made a little video from a few test frames showing the atmospheric wobble from shooting a mountaintop (Mt. Lassen in Northern California) which is about 44 miles away…

The telescope feels heavier than I was expecting, though I haven’t weighed it yet. I was able to get it perfectly balanced in both the RA and Dec axes on my cheap tracker, but I had to do some “creative” rigging to achieve this, and I estimate I’m 2-3x over the weight limit of the tracker, though I expect it will work. It’s a good thing I have more than one of these trackers, as I had to use an extra counterweight from another one in addition to using a camera with a 600mm f/9 lens + video fluid pan head as additional counterweight (I’m not intending to image through the counterweight camera; it’s just there for balance).

There’s a bit of vignetting and dust, but I will be correcting these with flat frames when I do astrophotography. Here are a few pictures, including an un-retouched full frame, a 100% crop and some pictures of the setup.

I mounted the camera at prime focus with a T-ring for Nikon DSLRs and a T-adapter. The camera is a Nikon D800 with a 35mm full-frame 36MP sensor, and the telescope is a Meade LX85 1800mm f/12 (6 inch aperture) Maksutov-Cassegrain.

F822C5F5-0E38-4C2C-9098-B7669AD4E9C8.jpeg

7BA17375-094C-4FAC-B5F2-EB04CDA9CD78.jpeg

23980EA5-9BC7-4CB4-A37F-8290D19FCB5F.jpeg

125B53D5-419E-41CC-8E69-BF4BDFB97C45.jpeg

0F3695C5-1666-4F08-A10F-E6AED56070FC.jpeg

87FCF9C9-3CD0-4D28-AB19-C14429FF7043.jpeg
A874F118-9A74-4AF4-B4FB-4FC7445CC025.jpeg

0178C2CB-6A87-4E9D-BE6A-CFCD92833E1B.jpeg

234F68CC-9E94-4181-931D-3854582CB502.jpeg
 
  • #191
sophiecentaur said:
There you go! Camera characteristics are things you just have to buy your way out of...

PS I was wondering whether a large set of still images might give you enough for dealing with 'planetary problems' and give you inherent high res.
No doubt my camera is very beginner level and I will soon grow out of it. As you said my camera is more suited for nebula imaging. But the clouds here are pierced only by planets. Gotta wait for another month for winter skies.

I tried stacking raw stills. It is a huge effort to get the same number of frames I get from a 50 fps video, so my probability of catching "lucky" frames is always lower. Looks like fps wins over resolution in planetary.

Devin-M said:
One way you can test if you’re losing resolution in 1080p mode…
When I shift from 4K to 1080p FHD mode, I can see the framing change. The subject is now more zoomed in, which indicates that it is cropping the sensor resolution down to the central part.

So the planet resolution is the same (pixels per arc-sec), but I'm losing details with the MP4 algorithm.

One thing I can improve without spending money is by spending some time and waiting for better skies. Right now it's moisture-laden and quite hazy.
 
  • #192
I started to worry about the overall sharpness after yesterday’s test of the 1800mm f/12, but after some new tests I’m not so worried. I think most of the loss of sharpness yesterday was on account of the 44 miles of dense atmosphere I was shooting through towards the mountaintop.

I have 4 test images. The 1st is a 100% crop of an 800px w x 620px h section of a test image of the top of a tree down the street from my home (about 618 ft away, shot at 6400iso 1/2000th sec), so this one should show the actual pixels captured by the sensor 1:1 on this webpage with no resizing. The second image shows the full frame, although I had put the camera into 1.5x crop-sensor mode to crop out the vignetting, so even though it’s a full-frame 35mm sensor we are only seeing the APS-C-sized central portion of that sensor @ 4800px x 3200px before upload. The third image is taken through the “telephoto” lens in my iPhone 11 Pro (100% crop), and the fourth image is the iPhone telephoto full frame…

7A35ADE4-40AB-4B08-AEB8-D6FC6C5BB336.jpeg
DAE78B37-1E34-4F65-981C-7BE52DE4EE14.jpeg

IMG_8023.jpg

B6490854-B24E-4917-9774-416E2DE76294.jpeg


…looks pretty sharp to me!
 
  • #193
Do you see any sharpness difference between the captured image and the live view through the eyepiece?

I'm also now realising how important the focus mechanism is for getting really sharp images.
 
  • #194
saturn_stacked_3.gif


1800mm f/12 6400iso, 50 raw x 1/320th sec, nikon d800 @ prime focus, meade lx85 maksutov-cassegrain, 100% crop, 448px x 295 px, 0.56 arcsec/pixel
 
  • #195
PhysicoRaj said:
I'm also now realising how important the focus mechanism is for getting really sharp images.
This is well known to astrophotographers: they use either a Bahtinov mask or, when they are using a PC to control things, autofocus in software.
It's largely why people are prepared to spend so much on a good focuser that has a steady action and doesn't creep during a session.
 
  • #196
sophiecentaur said:
This is well known to astrophotographers: they use either a Bahtinov mask or, when they are using a PC to control things, autofocus in software.
It's largely why people are prepared to spend so much on a good focuser that has a steady action and doesn't creep during a session.
I 3D printed a Bahtinov mask and it works like a charm. The only issue is the telescope's focusing mechanism itself, which I feel needs to be finer. I have seen some people add a DIY mod like a bigger wheel to get precise focus; I have to try that and see.

Apart from that, since my scope is an achromat and not an apochromat, I could be seeing one colour plane out of focus, which could be reducing the overall sharpness?
 
  • #197
I think my sharpness on Saturn was being limited a bit by atmospheric dispersion based on the blue fringing at the top and red fringing at the bottom…. Saturn was quite low to the horizon while I was imaging. They make a corrector for that but I’m not sure I’m ready to fork over the cash for it quite yet…

From an “Atmospheric dispersion corrector” product description:

https://www.highpointscientific.com...bvBUFd5kWBdkDaTrcH--FnAxjJfJbjEMaAnYfEALw_wcB

The ZWO ADC, or Atmospheric Dispersion Corrector, reduces prismatic smearing during planetary imaging, resulting in images with finer details. It also improves the image when doing visual planetary observations, allowing the observer to see more surface detail.

Optical dispersion is an effect caused by the refractive quality of the atmosphere as light passes through it, and is dependent on the angle of the light as well as its wavelength. Optical dispersion spreads the incoming light into a vertical spectrum of colors, causing the object to appear higher in the sky than it truly is. The amount of “lift” that occurs is exaggerated when objects are closer to the horizon, and because optical dispersion is wavelength dependent, it causes the image to separate into different colors. That is why you will see a bluish fringe on the top of an object, and a red fringe at the bottom when atmospheric dispersion effects are particularly bad.

A correctly adjusted ADC, placed between the camera or eyepiece and a Barlow lens, will reduce the effects of optical dispersion and improve image resolution. It does this by applying the opposite amount of dispersion caused by the atmosphere to the image and then re-converging the light of the different wavelengths at the focal plane.
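To get a feel for the size of the effect the quoted text describes, here's a back-of-the-envelope estimate. In the plane-parallel approximation, refraction is about (n − 1)·tan(z), so the blue-to-red smear is the difference of two such terms; the refractivity values below are ballpark figures for dry air at sea level, not a serious dispersion model:

```python
import math

ARCSEC_PER_RAD = 206265.0

# Rough refractivities (n - 1) of dry air at sea level -- ballpark
# figures for illustration, not a real dispersion model.
N_BLUE = 2.98e-4   # ~400 nm
N_RED  = 2.89e-4   # ~700 nm

def refraction_arcsec(n_minus_1: float, zenith_deg: float) -> float:
    """Plane-parallel approximation: R ~ (n - 1) * tan(z)."""
    return ARCSEC_PER_RAD * n_minus_1 * math.tan(math.radians(zenith_deg))

def dispersion_arcsec(zenith_deg: float) -> float:
    """Blue-to-red smear: difference of the two refraction angles."""
    return (refraction_arcsec(N_BLUE, zenith_deg)
            - refraction_arcsec(N_RED, zenith_deg))

print(round(dispersion_arcsec(70), 1))  # ~5" at 20 deg altitude
print(round(dispersion_arcsec(30), 1))  # ~1" at 60 deg altitude
```

At 0.56 arcsec/pixel, a ~5 arcsec smear for a low target spans several pixels, which is consistent with the visible colour fringing on Saturn.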


---

4x "nearest neighbor" enlargement (current distance 1.51 billion kilometers - 0.56 arcsec/pixel):
saturn_stacked_3_4x_2.jpg
One thing I'm quite happy about is I was 2-3x over the weight limit on my cheap $425 Star Adventurer 2i Pro tracker but it still worked...

94C131B3-847B-4591-A86C-1D5415EBD5EE.jpeg


A373797A-8137-492B-AF8F-730D9EFC528E.jpeg
 
  • #198
 
  • #199
Devin-M said:
I think my sharpness on Saturn was being limited a bit by atmospheric dispersion based on the blue fringing at the top and red fringing at the bottom…. Saturn was quite low to the horizon while I was imaging. They make a corrector for that but I’m not sure I’m ready to fork over the cash for it quite yet…

Yes, I use the ZWO atmospheric dispersion corrector (ADC) for pretty much all my planetary work. It does help a fair amount, but it's not a panacea. It does work, though, and I found the money to be well spent.

---

The other main factor (from what I can tell from your Saturn image/video) is probably atmospheric seeing. Atmospheric seeing conditions vary quite a bit from night to night, and they're not necessarily correlated to cloud cover. I.e., you can have nights with good seeing and bad clouds in the sky, clear skies with bad seeing, bad seeing and clouds, or (sometimes hopefully) clear skies with good seeing.

For any given night, the best seeing around a target will usually be when the target crosses the meridian, because that is when the target is highest in the sky, and thus has less atmosphere to pass through.

----

Once you have your raw video data (ideally from a night/time with relatively good seeing), process that data with a lucky-imaging program such as AutoStakkert! (it's free software). That software will throw away a fraction of the frames (say maybe 50% -- whatever you specify) and warp the remaining frames so that they stack nicely. Then it stacks them, producing a single image as output. I suggest using a high-res uncompressed format for your image, such as .TIFF. (And just to be clear: a video goes in [itex] \rightarrow [/itex] an image comes out.)

At that point, your image will still be blurry, but now you can coax out the detail using wavelet sharpening in a program such as RegiStax (also free software). Don't use RegiStax to do the stacking, since you've already done that using AutoStakkert! Instead, just open up the image and go directly to wavelet sharpening.

The difference between any given raw frame, and the final image out of RegiStax can be remarkable.
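A caricature of the frame-selection step in NumPy (I'm not claiming this is AutoStakkert!'s actual metric; variance of a Laplacian is just one common sharpness score): rank the frames, keep the best half, and average.

```python
import numpy as np

rng = np.random.default_rng(1)

def sharpness(frame: np.ndarray) -> float:
    """Variance of a simple 4-neighbour Laplacian: higher = sharper."""
    lap = (-4 * frame[1:-1, 1:-1] + frame[:-2, 1:-1] + frame[2:, 1:-1]
           + frame[1:-1, :-2] + frame[1:-1, 2:])
    return float(lap.var())

def make_frame(blurry: bool) -> np.ndarray:
    """Toy 'video' frame: a dot that is either tight or seeing-smeared."""
    f = np.zeros((32, 32))
    if blurry:
        f[12:20, 12:20] = 1.0   # smeared dot (bad seeing)
    else:
        f[15:17, 15:17] = 4.0   # tight dot (sharp frame)
    return f + rng.normal(0, 0.05, f.shape)

frames = [make_frame(blurry=(i % 2 == 0)) for i in range(50)]

# Rank by sharpness, keep the best half, then average -- the essence
# of lucky-imaging frame selection.
ranked = sorted(frames, key=sharpness, reverse=True)
stack = np.mean(ranked[:len(ranked) // 2], axis=0)
print(stack.shape)  # (32, 32)
```

The real programs also align and locally warp each frame before averaging, which this toy skips entirely.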

[Edit: Oh, my. I'm sorry if this post was off-topic. When I posted it, I thought this was the "Out Beautiful Universe -- Photo and Videos" thread. :doh:]
 
  • #200
collinsmark said:
Once you have your raw video data (ideally from a night/time with relatively good seeing), process that data with a lucky-imaging program such as AutoStakkert! (it's free software). That software will throw away a fraction of the frames (say maybe 50% -- whatever you specify) and warp the remaining frames so that they stack nicely. Then it stacks them, producing a single image as output. I suggest using a high-res uncompressed format for your image, such as .TIFF. (And just to be clear: a video goes in → an image comes out.)

At that point, your image will still be blurry, but now you can coax out the detail using wavelet sharpening in a program such as RegiStax (also free software). Don't use RegiStax to do the stacking, since you've already done that using AutoStakkert! Instead, just open up the image and go directly to wavelet sharpening.

The difference between any given raw frame, and the final image out of RegiStax can be remarkable.

Thank you for your amazing suggestions!

I did 4 things:

1) I threw out quite a few of the more blurry raw files before stacking
2) I did the wavelet sharpening (amazing results in this step)
3) I individually nudged each of the color channels into alignment to correct the atmospheric dispersion
4) I did some final noise reduction and adjustment filters in Adobe Lightroom

After "Lucky Imaging" Selection - 63 RAW Images, Wavelet Sharpening, Channel Nudge and Noise Reduction (4x "nearest neighbor" enlargement):
saturn_wavelet_sharpened_channel_nudge_noise_reduction_4x.jpg


Stacked RAW Images (4x "nearest neighbor" enlargement):
saturn_stacked_3_4x.jpg


Typical Source Raw Image - All Noise Reduction Disabled (4x "nearest neighbor" enlargement):
1800mm f/12 6400iso 1/320th sec
saturn_source_4x.jpg
 
  • #201
Before & after the "channel nudge" to correct the atmospheric dispersion:

After "RGB channel nudge" for atmospheric dispersion in Adobe Photoshop & color/luminance noise reduction in Adobe Lightroom:
saturn_wavelet_sharpened_channel_nudge_noise_reduction_4x.jpg


Before "RGB channel nudge" (after wavelet sharpening & "lucky" image selection re-stacking):
saturn_wavelet_sharpened_4x.jpg


Before wavelet sharpening and removal of blurry frames (after stacking without lucky image selection):
saturn_stacked_3_4x_2.jpg


Typical RAW Image (converted to 16 bit tif file with all noise reduction & sharpening disabled both in camera and conversion software-- Adobe Lightroom):
saturn_source_4x.jpg
 
  • #202
1.png

2.png
3.png

5.png

4.png
saturn_wavelet_sharpened_channel_nudge_noise_reduction_color_corrected_4x.jpg
 
  • #203
Oh my, even with an APO the atmospheric dispersion is a thing. Again :cash::cash:

Luckily at my latitude, the ecliptic goes almost overhead and the MW core rises more than 40° at times.

My next scope should be a reflector, I note.
 
  • #204
collinsmark said:
you can have nights with good seeing and bad clouds in the sky, clear skies with bad seeing, bad seeing and clouds, or (sometimes hopefully) clear skies with good seeing.
If I'm right, it's very dependent on temperature and pressure differences, which affect winds as well as moisture levels. I once shot through the clouds (literally through) but got a crystal-clear image of the Moon, and wondered if the chilly night helped calm the air down.

collinsmark said:
At that point, your image will still be blurry, but now you can coax out the detail using wavelet sharpening in a program such as RegiStax. (RegiStax is also free software).
RegiStax has two options, RGB align and RGB balance; does one of them help deal with dispersion and/or chromatic aberration?
 
  • #205
PhysicoRaj said:
Oh my, even with an APO the atmospheric dispersion is a thing. Again :cash::cash:

Luckily at my latitude, the ecliptic goes almost overhead and the MW core rises more than 40° at times.

My next scope should be a reflector, I note.
I checked your latitude... only 1-2 months left to observe Saturn (until March) and it will be getting closer and closer to the horizon at sunset that whole time...
 
  • #206
PhysicoRaj said:
If I'm right, it's very dependent on temperature and pressure differences, which affect winds as well as moisture levels. I once shot through the clouds (literally through) but got a crystal-clear image of the Moon, and wondered if the chilly night helped calm the air down.

Well, technically yes, you are correct. But it's kinda complicated. Just as a particular region of ocean sometimes has severe waves and at other times is calm, the different layers of the atmosphere behave in a similar way. Sometimes the boundaries between atmospheric layers are quite smooth; sometimes they're wavy; and sometimes they're just downright chaotic.

When the differing atmospheric layers are calm relative to one another, you'll have good seeing. When they're not, you'll have bad seeing.

Seeing can also be affected by terrestrial sources, such as a nearby building's heat exhaust or even the air currents inside your telescope's optical tube. That's why you shouldn't set up your telescope right next to a building's central air unit, and it's (one reason) why you should let your telescope reach thermal equilibrium before imaging.

PhysicoRaj said:
RegiStax has two options, RGB align and RGB balance; does one of them help deal with dispersion and/or chromatic aberration?

They both might help a little bit, maybe.

But atmospheric dispersion is not something you can completely fix just by nudging the red and blue channels to match the green. Atmospheric dispersion will blur the signal within a given color channel. So yeah, you could nudge a blurry red channel and a blurry blue channel to perfectly overlap a blurry green channel, but it won't get rid of the original blur.

For that, there's really no substitute for an Atmospheric Dispersion Corrector.

Well, that said, maybe you could separate the red, green and blue channels into separate monochrome images, and then use a software program that has the capability of compensating for motion blur. Then put the results of all three back together. Atmospheric dispersion has similar characteristics to motion blur on a monochrome image. So that might help a little. But I haven't tried that, so I'm speculating a little here.
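The whole-pixel "channel nudge" itself is trivial to model; here np.roll stands in for Photoshop's channel offset on a toy single-star frame (and, as discussed, this re-registers the channels but cannot undo blur within a channel):

```python
import numpy as np

def nudge_channels(img: np.ndarray, red_dy: int, blue_dy: int) -> np.ndarray:
    """Shift the R and B planes vertically by whole pixels to
    re-register them on G (negative = up)."""
    out = img.copy()
    out[..., 0] = np.roll(img[..., 0], red_dy, axis=0)
    out[..., 2] = np.roll(img[..., 2], blue_dy, axis=0)
    return out

# Toy star with dispersion: red displaced down 2 px, blue up 2 px.
img = np.zeros((16, 16, 3))
img[8, 8, 1] = 1.0    # green centroid
img[10, 8, 0] = 1.0   # red, 2 px low
img[6, 8, 2] = 1.0    # blue, 2 px high
fixed = nudge_channels(img, red_dy=-2, blue_dy=2)
print(fixed[8, 8])    # [1. 1. 1.] -- channels re-registered
```

Real dispersion shifts are fractional pixels and vary with altitude, so the nudge is only ever an approximate fix.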
 
  • #207
collinsmark said:
Well, that said, maybe you could separate the red, green and blue channels into separate monochrome images, and then use a software program that has the capability of compensating for motion blur. Then put the results of all three back together. Atmospheric dispersion has similar characteristics to motion blur on a monochrome image. So that might help a little.

I found this page with a few options for removing motion blur using photoshop filters...

https://www.techwalla.com/articles/how-to-remove-motion-blur-with-photoshop
 
  • #208
Here I've intentionally misaligned the RGB channels to make them all separately visible...

saturn_wavelet_sharpened_channels_separated_4x.jpg


Just the red channel converted to monochrome...

red_channel_4x.jpg


Vertical motion blur removal applied via "smart sharpen" filter in Adobe Photoshop...

smart_sharpen_4x.jpg
 
  • #209
Simulated saturn red channel converted to monochrome:

sim_saturn.jpg


w/ 5px simulated vertical motion blur applied:
sim_saturn_5px_vertical_motion_blur.jpg


w/ "Smart Sharpen" motion blur removal in Adobe Photoshop:
sim_saturn_5px_vertical_motion_blur_smart_motion_blur_removal.jpg
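For reference, the simulated vertical motion blur above is just convolution with a short vertical line; this sketch builds that 5 px kernel explicitly, which is the model a deconvolution filter like Smart Sharpen then tries to invert:

```python
import numpy as np

def vertical_motion_blur(img: np.ndarray, length: int) -> np.ndarray:
    """Average each pixel with (length - 1) neighbours below it -- a
    1-D vertical box kernel, the usual model for linear motion blur."""
    out = np.zeros_like(img, dtype=float)
    for k in range(length):
        out += np.roll(img, k, axis=0)
    return out / length

# A single-pixel "star" smeared over 5 rows, total flux preserved.
star = np.zeros((21, 21))
star[10, 10] = 1.0
blurred = vertical_motion_blur(star, 5)
print(np.count_nonzero(blurred[:, 10]))  # 5 -- flux smeared over 5 rows
```

Deconvolution works best when the kernel is known exactly, which is why the simulated case recovers so much better than real data.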
 
  • #210
Those last results weren't so great, but I think I just got much better results with the "Shake Reduction" filter in Photoshop instead of "Smart Sharpen - Motion Blur"...

Simulated motion blur:
red_sim_motion_blur.jpg


Motion Blurred Image with "Shake Reduction Filter" applied:
red_sim_shake_reduction.jpg


Original Simulated Image with no added motion blur:
red_sim_original.jpg


The thin gap in the rings where it crosses in front of the planet and the sharpness of the bottom edge of the planet look much more like the original when Shake Reduction is applied, compared to the motion blurred image without Shake Reduction.
 
