I'm trying to produce a video file from a series of matplotlib plots.
I successfully produced the video with matplotlib's animation module, but it was far too slow for my application (it saved temporary png files to build the animation), so I ended up feeding the raw images to ffmpeg through stdin; see my code below. This already gives a nice speedup, but further improvement should be possible by using the hardware encoder through gstreamer.
I'm stuck trying to figure out how to modify the following code to build a proper gstreamer pipeline...
Code: Select all
import numpy as np
import matplotlib.pyplot as plt
import time
import subprocess
# Number of frames
nframes = 200
# Generate data
x = np.linspace(0, 100, num=nframes)
y = np.random.random_sample(np.size(x))
def testSubprocess(x, y):
    start_time = time.time()

    # Set up the figure
    fig = plt.figure(figsize=(15, 9))
    canvas_width, canvas_height = fig.canvas.get_width_height()

    # First frame
    ax0 = plt.plot(x, y)
    pointplot, = plt.plot(x[0], y[0], 'or')

    def update(frame):
        # your matplotlib code goes here
        pointplot.set_data([x[frame]], [y[frame]])

    # Open an ffmpeg process
    outf = 'testSubprocess.mp4'
    cmdstring = ('ffmpeg',
                 '-y', '-r', '1',  # overwrite, 1 fps
                 '-s', '%dx%d' % (canvas_width, canvas_height),  # frame size
                 '-pix_fmt', 'argb',  # pixel format of the raw bytes
                 '-f', 'rawvideo', '-i', '-',  # tell ffmpeg to expect raw video from the pipe
                 '-vcodec', 'mpeg4', outf)  # output encoding
    p = subprocess.Popen(cmdstring, stdin=subprocess.PIPE)

    # Draw frames and write to the pipe
    for frame in range(nframes):
        # draw the frame
        update(frame)
        fig.canvas.draw()

        # extract the image as an ARGB byte string
        string = fig.canvas.tostring_argb()

        # write to pipe
        p.stdin.write(string)

    # Finish up: close stdin and wait for ffmpeg to finish encoding
    p.communicate()
    print("Movie written in %s seconds" % (time.time() - start_time))
if __name__ == "__main__":
    testSubprocess(x, y)

Code: Select all
filesrc location=/dev/stdin

Thanks,
Federico
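For reference, here is an untested sketch of what the corresponding gst-launch-1.0 command might look like, keeping the same approach of writing raw ARGB frames to the pipe. The element names and properties (fdsrc, rawvideoparse, videoconvert, avenc_mpeg4, mp4mux) are assumptions taken from the gstreamer plugin documentation, not a verified pipeline; on hardware such as a Raspberry Pi, a platform encoder element would replace avenc_mpeg4.

```python
# Untested sketch: build a gst-launch-1.0 argument list that, like the
# ffmpeg command above, reads raw ARGB frames from stdin. Element names
# (fdsrc, rawvideoparse, videoconvert, avenc_mpeg4, mp4mux) are
# assumptions from the gstreamer plugin docs; a hardware encoder
# element would be substituted for avenc_mpeg4.

def gst_cmdstring(width, height, fps, outf):
    """Return a gst-launch-1.0 argument list that reads raw ARGB
    frames from stdin (fd 0) and writes an mp4 file."""
    return [
        'gst-launch-1.0', '-e',   # -e: send EOS on shutdown so the mp4 is finalized
        'fdsrc', 'fd=0', '!',     # raw bytes arrive on stdin
        'rawvideoparse',          # describe the raw frames to gstreamer
        'format=argb',
        'width=%d' % width,
        'height=%d' % height,
        'framerate=%d/1' % fps, '!',
        'videoconvert', '!',
        'avenc_mpeg4', '!',       # software encoder; swap in a hardware element here
        'mp4mux', '!',
        'filesink', 'location=%s' % outf,
    ]

# 1500x900 matches a figsize of (15, 9) at the default 100 dpi
cmdstring = gst_cmdstring(1500, 900, 1, 'testGst.mp4')
# p = subprocess.Popen(cmdstring, stdin=subprocess.PIPE)
# ...then write each frame with p.stdin.write(...) exactly as in testSubprocess()
```

The idea is that `cmdstring` would drop into `subprocess.Popen(cmdstring, stdin=subprocess.PIPE)` in place of the ffmpeg command, with the frame-writing loop unchanged.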