derg
Posts: 1
Joined: Tue Jan 16, 2018 10:46 pm

Streaming separate video and audio sources to a webserver

Tue Jan 16, 2018 10:55 pm

Hello,

I am trying to build a camera setup where a Raspberry Pi sends raw video to a webserver, and audio recorded on a Windows machine is then added over the video. The idea is for all of this to be controlled from the Windows machine. I was thinking of writing a Java webapp to control everything, but I'm unsure how to proceed. The simple progression of events would be as follows (a rough sketch of the control side is below the list):

1.) User on the Windows machine navigates to the IP address of the RPi hosting the webserver
2.) User presses enter to start recording video from the RPi, which has a Pi camera
3.) User is prompted to type a filename
4.) User adds voice notes via a microphone connected to the laptop
5.) Webserver combines the video and audio using their timestamps
6.) A separate user navigates to the IP address and selects the file to view
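
Here is a rough, untested sketch of what I mean by the control side, assuming the webapp runs on the Pi itself, uses the JDK's built-in com.sun.net.httpserver, and shells out to raspivid (the port, paths, and endpoint names are just placeholders):

import com.sun.net.httpserver.HttpExchange;
import com.sun.net.httpserver.HttpServer;
import java.io.IOException;
import java.io.OutputStream;
import java.net.InetSocketAddress;

public class CamServer {
    private static Process recorder;   // currently running raspivid, if any

    public static void main(String[] args) throws Exception {
        HttpServer server = HttpServer.create(new InetSocketAddress(8080), 0);

        // Step 2: start recording raw H.264 from the Pi camera
        server.createContext("/start", exchange -> {
            String name = "capture-" + System.currentTimeMillis();  // step 3 would supply this
            recorder = new ProcessBuilder("raspivid", "-t", "0",
                    "-o", "/home/pi/videos/" + name + ".h264").start();
            reply(exchange, "recording " + name);
        });

        // Stop recording so the file can be muxed with the audio later
        server.createContext("/stop", exchange -> {
            if (recorder != null) recorder.destroy();
            reply(exchange, "stopped");
        });

        server.start();
    }

    private static void reply(HttpExchange exchange, String body) throws IOException {
        byte[] bytes = body.getBytes();
        exchange.sendResponseHeaders(200, bytes.length);
        try (OutputStream os = exchange.getResponseBody()) { os.write(bytes); }
    }
}

The browser on the Windows machine would just hit these endpoints, and the audio upload would be another endpoint along the same lines.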

Is a Java app a good idea for this? I'm having difficulty working out how to combine the video, the audio, and their separate timestamps into one file that others can access on the webserver.
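
For step 5, what I was imagining is the webserver shelling out to ffmpeg and shifting the audio by the difference between the two recorded start times. This assumes ffmpeg is installed on the Pi; the method name, filenames, and timestamps are just placeholders:

import java.io.IOException;

public class Muxer {
    // Mux the Pi's raw H.264 video with the laptop's audio track, offsetting
    // the audio by the gap between the two recordings' start times.
    public static void mux(String videoFile, String audioFile, String outFile,
                           long videoStartMs, long audioStartMs)
            throws IOException, InterruptedException {
        double offsetSec = (audioStartMs - videoStartMs) / 1000.0;
        Process p = new ProcessBuilder(
                "ffmpeg", "-y",
                "-i", videoFile,                          // raw H.264 from raspivid
                "-itsoffset", String.valueOf(offsetSec),  // applies to the next input
                "-i", audioFile,                          // voice notes from the laptop
                "-map", "0:v:0", "-map", "1:a:0",
                "-c:v", "copy", "-c:a", "aac",
                outFile)
                .inheritIO()
                .start();
        p.waitFor();
    }
}

Is that a reasonable way to line the two up, or is there a cleaner approach?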
I know of UV4L, but I'm having issues connecting to it from the Windows machine. I don't have a firewall on the Pi, and I have forwarded the ports on the router, but I can only access the UV4L server from localhost on the Pi, not from the Windows machine.
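A quick check like this, run from the Windows machine, should at least tell me whether the port is reachable at all or whether the server is only bound to the Pi's loopback interface (the host and port here are placeholders for the Pi's LAN address and whatever port UV4L is configured to listen on):

import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.Socket;

public class PortCheck {
    public static void main(String[] args) {
        String piHost = "192.168.1.50";   // placeholder: the Pi's LAN address
        int uv4lPort = 8080;              // placeholder: whatever port UV4L listens on
        try (Socket socket = new Socket()) {
            // If this times out while the same port works on the Pi via localhost,
            // the server is probably only listening on the loopback interface.
            socket.connect(new InetSocketAddress(piHost, uv4lPort), 3000);
            System.out.println("Port is reachable from this machine.");
        } catch (IOException e) {
            System.out.println("Could not connect: " + e.getMessage());
        }
    }
}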
Can I possibly get some guidance for this? Thanks!
