do you see any advantages using two scripts?
Yes. It is about robustness in a networked system. Remember that in a network, any computer could stop running or disappear from the net at any time. Each system then has to be able to handle that situation.
I would definitely run at least two processes. The client would then be synchronous: send a request, wait for the response. The answer might take several seconds to arrive (especially if you hit a time-out due to network problems).
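A minimal sketch of such a synchronous client, assuming a plain TCP connection with a simple line-based protocol (the host, port, and "status" request are just placeholders for whatever your server actually speaks):

```python
# Synchronous client sketch: send one request, block until the reply
# arrives or the timeout fires. HOST/PORT are hypothetical examples.
import socket

HOST, PORT = "127.0.0.1", 5000   # example server address
TIMEOUT = 5.0                    # seconds to wait before giving up

def ask(request: str) -> str:
    with socket.create_connection((HOST, PORT), timeout=TIMEOUT) as s:
        s.settimeout(TIMEOUT)
        s.sendall(request.encode() + b"\n")
        # Blocks here -- this is the simple "wait" a synchronous client gets.
        return s.recv(4096).decode().strip()

try:
    print(ask("status"))
except OSError as e:
    # Covers both time-outs and "server is gone" errors.
    print("request failed:", e)
```

The whole program just stops at `recv` until something happens, which is exactly what makes the client side so easy to write.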
It can be done with threads of course, but sharing variables between threads is not as easy to get right as it might look at first. In my experience it is better to run separate processes that do not share variables directly. This also lets the processes run on different CPU cores (the Raspberry Pi 2 has four cores, and most modern PCs have at least two).
If you run both functions in the same thread, the client cannot do a simple blocking wait; instead it has to keep checking for new incoming requests to the master function while waiting for the response to its own question. That means writing an asynchronous client, which is quite a bit more complicated and might not even be supported by the libraries you have chosen.
Additionally, I would probably let the master process handle each request in a separate child process. Then, if a request crashes its handler for some reason, the master will still be running.
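In Python, the standard library already offers this pattern: `socketserver.ForkingTCPServer` forks one child per connection (Unix only). A sketch, with the port number and the echo protocol being assumptions of mine:

```python
# Master process that forks a child per request (Unix only).
# If a handler crashes, only that child dies; the master keeps listening.
import socketserver

class Handler(socketserver.StreamRequestHandler):
    def handle(self):
        line = self.rfile.readline().strip()
        # An exception or crash here only takes down this child process.
        self.wfile.write(b"echo: " + line + b"\n")

if __name__ == "__main__":
    with socketserver.ForkingTCPServer(("0.0.0.0", 5000), Handler) as srv:
        srv.serve_forever()
```

The same structure works with `ThreadingTCPServer` if you prefer threads, but then a badly behaved handler can still hurt the whole process.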
If you want to share variables, place them in a controlled store outside both processes. I would suggest SQLite as one of many possible solutions, or why not go for a full database such as MySQL. These systems already solve the problem of sharing data safely between threads and processes.
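As a sketch of the SQLite variant: a tiny key-value table that any process on the machine can read and write (the file name and the `vars` table are my own invention, not anything standard):

```python
# Using SQLite as a shared "variable store" between processes.
# The database file name is just an example.
import sqlite3

DB = "shared_state.db"  # any path all processes can reach

def set_var(name, value):
    con = sqlite3.connect(DB, timeout=10)  # timeout: wait if another process holds the lock
    with con:
        con.execute("CREATE TABLE IF NOT EXISTS vars "
                    "(name TEXT PRIMARY KEY, value TEXT)")
        con.execute("INSERT OR REPLACE INTO vars VALUES (?, ?)",
                    (name, str(value)))
    con.close()

def get_var(name, default=None):
    con = sqlite3.connect(DB, timeout=10)
    with con:
        con.execute("CREATE TABLE IF NOT EXISTS vars "
                    "(name TEXT PRIMARY KEY, value TEXT)")
        row = con.execute("SELECT value FROM vars WHERE name = ?",
                          (name,)).fetchone()
    con.close()
    return row[0] if row else default

set_var("counter", 42)
print(get_var("counter"))  # -> 42 (stored as text)
```

SQLite does the file locking for you, so two processes writing at the same time will not corrupt the store; one of them simply waits.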
A totally unrelated spoiler here: the computer language Erlang + the OTP libraries has very good solutions to how to handle a distributed system with all its inherent fault situations. It is a quite different "world" though.
// Gunnar