Hi,
I'm currently working on a project to live-stream video from a USB camera to a web application.
I'm using the ffmpeg encoder along with x264 on a Raspberry Pi to stream video to a web application built in Node.js. The client side uses JSMpeg, which, from what I've read, is an MPEG-1 decoder.
Even though the streaming works perfectly fine (I was able to get the code for it working), I don't understand how it is possible to encode the images using ffmpeg and then decode them with JSMpeg. If any of you know how this is possible, please explain. I'm very interested in this subject and would really appreciate your help.
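For reference, the kind of ffmpeg invocation the JSMpeg examples typically use looks like the sketch below. The device path, resolution, bitrate, port, and the "supersecret" path segment are placeholders from my setup, not fixed values; the Node.js side is assumed to be running a relay (such as the websocket-relay script shipped with JSMpeg) listening on that port:

```shell
# Capture from a V4L2 USB camera and encode as MPEG-1 video in an
# MPEG-TS container, which is what JSMpeg expects on the client side.
# The output URL points at a relay server that forwards the stream
# to browsers over WebSockets.
ffmpeg \
  -f v4l2 -i /dev/video0 \        # read raw frames from the USB camera
  -f mpegts \                     # mux into an MPEG transport stream
  -codec:v mpeg1video \           # MPEG-1 video codec (not H.264/x264)
  -s 640x480 -b:v 1000k -bf 0 \   # resolution, bitrate, no B-frames
  http://localhost:8081/supersecret
```

Note that `-codec:v mpeg1video` selects the MPEG-1 encoder rather than libx264, which is part of what I'm trying to understand about my own pipeline.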
Thank you!
Regards,
Alex