So, how do we do this in Python? The simplest form of stacking is to add up the pixel values of all the desired images and divide by the number of images, creating an average image. Building on our knowledge of PIL, we can write a script, starting with importing the required libraries:
from PIL import Image
import glob
import numpy as np
First, find all the images in a folder, in this case *.png files:
imgList = glob.glob('./*.png')
Next we create a loop to iterate through all the pictures we found. However, we need to know if it is the first image so we can initialize the summed image variable:
first = True
for img in imgList:
Now for the tricky part. Notice how we imported numpy. The problem with using the PIL image class is that the data type for the RGB pixel values is uint8, a value between 0 and 255. If the result of an addition exceeds 255, it wraps around from 0 (i.e. 140 + 210 = 94). This is known as an overflow and will change the pixel color. To overcome this issue we will convert the PIL image to a numpy array:
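To see the wrap-around for yourself, here is a quick sketch comparing uint8 addition to the same sum in a wider type:

```python
import numpy as np

a = np.array([140], dtype='uint8')
b = np.array([210], dtype='uint8')

# uint8 addition wraps around past 255: (140 + 210) % 256 = 94
print(a + b)

# Widening one operand first gives the true sum of 350
print(a.astype('uint32') + b)
```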
temp = np.asarray(Image.open(img))
Then change the data type of the array. There are several options for data types, so let's try to pick one logically. Suppose we have 1000 photos. The maximum value for one photo is 255, so we need a data type that can hold a value of 1000 * 255 = 255,000. uint32 (0 to 4,294,967,295) should do the trick:
temp = temp.astype('uint32')
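If you want to check the capacity of the candidate data types rather than look them up, numpy's iinfo reports the range of each integer type (using the 1000-photo budget from above):

```python
import numpy as np

# Maximum value each unsigned integer type can hold
print(np.iinfo('uint8').max)    # 255        -- room for one photo only
print(np.iinfo('uint16').max)   # 65535      -- too small for 1000 * 255 = 255,000
print(np.iinfo('uint32').max)   # 4294967295 -- plenty of headroom
```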
Next, we either 1) create a new variable to hold the sum or 2) add the current image to the summed image:
    if first:
        sumImage = temp
        first = False
    else:
        sumImage = sumImage + temp
Now we calculate the averaged image by dividing the summed image by the number of images:
avgArray = sumImage/len(imgList)
We have to convert back to the uint8 data type, then back into the PIL image class:
avgImg = Image.fromarray(avgArray.astype('uint8'))
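One subtlety worth noting: astype('uint8') truncates the fractional part rather than rounding, so an average of 127.8 becomes 127. If that matters to you, round first with np.rint, something like:

```python
import numpy as np

avg = np.array([127.8, 4.2])
print(avg.astype('uint8'))             # truncates toward zero: 127 and 4
print(np.rint(avg).astype('uint8'))    # rounds to nearest: 128 and 4
```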
Finally, using what we learned previously, we can show, save, etc.:
avgImg.show()
Bringing it all back together:
from PIL import Image
import glob
import numpy as np
imgList = glob.glob('./*.png')
first = True
for img in imgList:
    temp = np.asarray(Image.open(img))
    temp = temp.astype('uint32')
    if first:
        sumImage = temp
        first = False
    else:
        sumImage = sumImage + temp
avgArray = sumImage/len(imgList)
avgImg = Image.fromarray(avgArray.astype('uint8'))
avgImg.show()
Image Example:
I didn't have a 'real world' example, so I made a quick scene in Blender of several balls dropping. I rendered 127 images at 24 fps. The first one looks like this:
You can see the effect of sample rate on the averaging 1) all the frames, 2) every 10th frame, and 3) every 20th frame:
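Sub-sampling the frame list is a one-line change using Python's slice step. Here is a sketch with stand-in file names in place of an actual glob:

```python
# Stand-in for imgList = glob.glob('./*.png') with 127 rendered frames
imgList = ['frame%03d.png' % i for i in range(127)]

every10 = imgList[::10]   # every 10th frame
every20 = imgList[::20]   # every 20th frame
print(len(imgList), len(every10), len(every20))   # 127 13 7
```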
It is possible to modify the code and subtract frames:
and multiply frames:
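I haven't spelled those variants out, but as a sketch of the arithmetic involved (the two small arrays stand in for frames already loaded as above): subtraction needs a signed type plus clipping so negative results don't wrap around, and multiplication needs rescaling back into the 0 to 255 range:

```python
import numpy as np

# Two stand-in 'frames' (normally loaded with Image.open as above)
a = np.array([[200, 50]], dtype='uint8')
b = np.array([[80, 90]], dtype='uint8')

# Subtraction: use a signed type, then clip negatives to 0
# 200 - 80 = 120; 50 - 90 = -40, which clips to 0
diff = np.clip(a.astype('int16') - b.astype('int16'), 0, 255).astype('uint8')
print(diff)

# Multiplication: widen, multiply, then scale back into 0-255
prod = (a.astype('uint32') * b.astype('uint32') // 255).astype('uint8')
print(prod)
```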
I stumbled upon this awesome NASA website. The following images were stacked with the above code. Images courtesy of the Image Science & Analysis Laboratory, NASA Johnson Space Center. Thanks for providing those images. They are great!
Images ISS031-E-66034 to ISS031-E-66136:
Images ISS030-E-68896 to ISS030-E-69180:
Images ISS030-E-271717 to ISS030-E-271798:
Images ISS031-E-57221 to ISS031-E-57490:
I think I could do a couple of things to make better images. I'll have to give it a shot. There also seems to be a lot of noise. I don't know if the noise is from the actual images or my algorithm; I would assume averaging the photos would eliminate the noise.
Comments:
Hi, I'm a Python newbie, and I'm trying to get the average of a set of images I have. I ran your script and it gives me an error: can't identify image file.
What file type are you trying to read? png, jpg?
Hi, I am also facing the above-mentioned error. My file type is png.
I think it is important to note that what you are speaking of here is NOT image "stacking". What you are presenting here is called frame integration / frame averaging. This is different from stacking because frame integration does NOT attempt to align the images based upon the objects in the scene before averaging the pixels.
This is speculation from some things I know about the cameras they use on the ISS, but I think the noise you're talking about in the NASA images might be due to radiation damage to the sensors themselves. Most cameras in orbit eventually get bad/hot/cold pixels from radiation damage. As that is a sensor defect, the same pixel will always be affected, so averaging won't eliminate this issue. Sort of a unique side effect of being in space.