amcdonley
Posts: 194
Joined: Mon Jan 26, 2015 5:56 pm
Location: Florida, USA

Simple Variable Sharing Between Python Processes?

Mon Aug 19, 2019 1:03 am

Right now I have a Python module to manage concurrent access to some robot variables stored in a file, "carlData.json":

Code: Select all

carlDataJason.py :  
*    saveCarlData(dataname, datavalue, logit=False)   # adds dataname:datavalue to the carlData.json file
*    getCarlData(dataname=None)      # returns either a dictionary with all values, or just the value of the passed name
*    delCarlData(dataname)           # deletes an item from carlData.json
I don't have a getLocked(), setLocked() pair though.
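
If I do add that, a rough sketch of what a locked save/get pair might look like (using fcntl file locks around the whole read-modify-write; the function names are illustrative, not the actual carlDataJson.py code):

Code: Select all

import fcntl
import json

CARL_DATA_FILE = "carlData.json"

def saveCarlDataLocked(dataname, datavalue):
    # Open without truncating, take an exclusive lock, then
    # read, update, and rewrite the whole JSON dict.
    with open(CARL_DATA_FILE, "r+") as f:
        fcntl.flock(f, fcntl.LOCK_EX)      # blocks until no other process holds the lock
        try:
            data = json.load(f)
        except ValueError:                 # empty or brand-new file
            data = {}
        data[dataname] = datavalue
        f.seek(0)
        json.dump(data, f, indent=2)
        f.truncate()
        # lock is released when the file is closed

def getCarlDataLocked(dataname=None):
    with open(CARL_DATA_FILE, "r") as f:
        fcntl.flock(f, fcntl.LOCK_SH)      # shared lock for readers
        data = json.load(f)
    return data if dataname is None else data.get(dataname)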

At this point I am only keeping two variables:
* chargingCycles # number of times the robot has docked for recharging
* chargeConditioning # count-down flag: if greater than 0, discharge the batteries to 7.7v (then dock to recharge) instead of the normal 8.1v

I am planning to add some more slowly changing variables:
* dockingState [returning, docking, docked, dismounting, notdocked]
* lastDockingStateChange datetime
* chargingState [unknown, discharging, charging, trickling]
* lastChargingStateChange datetime
* drivingProcess [process_name or PID or none] controlling the vehicle position
* tiltpanProcess [process_name or PID or none] controlling the tiltpan (and distance sensor)
* tiltpanCentered boolean (when centered, multiple processes can use distance sensor readings)


but eventually, I want to start tracking more dynamic variables such as:
* botInMotion boolean
* currentDrivingCommand?
* currentX,Y,theta (Dock reference) - this is very wishful
* currentOrientationConfidence
* currentPositionConfidence
* tiltpanInMotion

I'm wondering whether I need to look at multiprocessing shared memory or a sqlite3 database, or whether I should just continue on as I am.

Eventually, I am sure I'll want to add a DB for tracking driving command history and such, but I'm hesitant to complicate my simple robot life.
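
If I go the shared-memory route, the simplest variant I know of is a multiprocessing.Manager dict - a sketch only, with made-up process and variable names:

Code: Select all

from multiprocessing import Manager, Process
import time

def docking_monitor(state):
    # pretend we detected a docking sequence
    state["dockingState"] = "docking"
    time.sleep(0.5)
    state["dockingState"] = "docked"
    state["chargingState"] = "charging"

def status_reporter(state):
    for _ in range(3):
        print("docking:", state.get("dockingState"),
              "charging:", state.get("chargingState"))
        time.sleep(0.3)

if __name__ == "__main__":
    with Manager() as manager:
        state = manager.dict(dockingState="notdocked",
                             chargingState="unknown")
        procs = [Process(target=docking_monitor, args=(state,)),
                 Process(target=status_reporter, args=(state,))]
        for p in procs:
            p.start()
        for p in procs:
            p.join()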

DougieLawson
Posts: 39788
Joined: Sun Jun 16, 2013 11:19 pm
Location: A small cave in deepest darkest Basingstoke, UK

Re: Simple Variable Sharing Between Python Processes?

Mon Aug 19, 2019 7:55 am

SQLite3 doesn't support multiple writers. You'll very rapidly destroy your database if you get two writers running.

Use Python multi-threading to run everything in a single Python process.
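
For example (a sketch only, with made-up names): threads in one process can share an ordinary dict guarded by a threading.Lock, so there is never more than one writer at a time.

Code: Select all

import threading
import time

robot_state = {"dockingState": "notdocked", "chargingState": "unknown"}
state_lock = threading.Lock()

def set_state(name, value):
    with state_lock:
        robot_state[name] = value

def get_state(name):
    with state_lock:
        return robot_state.get(name)

def charger_thread():
    set_state("chargingState", "charging")
    time.sleep(1)
    set_state("chargingState", "trickling")

t = threading.Thread(target=charger_thread, daemon=True)
t.start()
time.sleep(0.5)
print(get_state("chargingState"))
t.join()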

blimpyway
Posts: 419
Joined: Mon Mar 19, 2018 1:18 pm

Re: Simple Variable Sharing Between Python Processes?

Mon Aug 19, 2019 9:39 am

You may consider a generic messaging queue (eg mosquitto) to communicate state changes across processes.

Although it is nowhere near as fast as shared memory, there are a few good reasons for this approach:
- allows for modularity - small processes, each responsible for its own limited task
- transparently spawn processes/modules across machines if/whenever needed. E.g. if you want to add vision/image recognition, which is way too heavy for your robot's main Pi, you can just add a dedicated Pi or Jetson/whatever for that task alone and have it integrate easily with the existing modules.
- allows for debugging - by just subscribing to whatever message "keys" you can see state changes from a different machine than the robot's Pi
- allows for nice interfaces/GUIs/visualisations on a PC away from the Pi
- logging/history/database inserts can also happen on a different machine, reducing the Pi's frequent SD card updates (and the risk of failure they bring)
- for such reasons, robot platforms like ROS are built upon a similar named-message-queue paradigm.

From what I see of your variables, 10-20 messages/second should be more than enough to update state, with very little impact on the Pi's CPU cycles.
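
For reference, a minimal publish/subscribe sketch with the paho-mqtt 1.x client, assuming a mosquitto broker running on the robot's Pi; the topic names are made up:

Code: Select all

import json
import paho.mqtt.client as mqtt

BROKER = "localhost"   # or the robot Pi's hostname when run from another machine

# publisher side (e.g. the docking/charging monitor process)
def publish_state_change(name, value):
    client = mqtt.Client()
    client.connect(BROKER, 1883, 60)
    client.publish("carl/state/" + name, json.dumps(value), retain=True)
    client.disconnect()

# subscriber side (e.g. a debugger or GUI on a desktop PC)
def on_message(client, userdata, msg):
    print(msg.topic, "->", json.loads(msg.payload))

def watch_state():
    client = mqtt.Client()
    client.on_message = on_message
    client.connect(BROKER, 1883, 60)
    client.subscribe("carl/state/#")   # wildcard: all state variables
    client.loop_forever()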

amcdonley
Posts: 194
Joined: Mon Jan 26, 2015 5:56 pm
Location: Florida, USA

Re: Simple Variable Sharing Between Python Processes?

Mon Aug 19, 2019 6:28 pm

DougieLawson wrote:
Mon Aug 19, 2019 7:55 am
SQLite3 doesn't support multiple writers. You'll very rapidly destroy your database if you get two writers running.

Use Python multi-threading to run everything in a single Python process.

Results on my Pi3B using multiprocessing pool.map to 3 "worker/inserter" processes that each obtain closing_lock/connect/insert/commit/close:
* Insert: max 43 ms, avg 18 ms each, for 100 values
* Insert: max 6-11 s, avg 43-105 ms, for 1000 values <-- WOW

Results using one "db server" process and three queue-stuffer processes:
* insert to db: 200-750 us average, 2-27 ms max (6K-60K row db)
* add_to_queue: 86-95 us
* fetch 1 row: 900 us
* fetch all: 10-15 us per row (600-900 ms total for 60000 rows)

So having a db server process seems fast, but I don't know how "data safe" sqlite3 is. My programs have sometimes crashed things awkwardly, and once in the last 12 months a crash allowed the batteries to run down without an orderly shutdown.

I'm just not sure I'm ready for the complication of a DB, but I know it will happen eventually.
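
The general shape of the single-writer "db server" with queue-stuffer processes, as a sketch only (the table and names are made up, not my actual benchmark code):

Code: Select all

import sqlite3
from multiprocessing import Process, Queue

def db_server(q, db_path="carl.db"):
    # the only process that ever touches sqlite3
    conn = sqlite3.connect(db_path)
    conn.execute("CREATE TABLE IF NOT EXISTS log (name TEXT, value TEXT)")
    while True:
        item = q.get()
        if item is None:            # sentinel: shut down cleanly
            break
        conn.execute("INSERT INTO log VALUES (?, ?)", item)
        conn.commit()
    conn.close()

def stuffer(q, label):
    for i in range(100):
        q.put((label, str(i)))

if __name__ == "__main__":
    q = Queue()
    server = Process(target=db_server, args=(q,))
    server.start()
    workers = [Process(target=stuffer, args=(q, "worker%d" % n)) for n in range(3)]
    for w in workers:
        w.start()
    for w in workers:
        w.join()
    q.put(None)                     # tell the db server to finish
    server.join()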

amcdonley
Posts: 194
Joined: Mon Jan 26, 2015 5:56 pm
Location: Florida, USA

Re: Simple Variable Sharing Between Python Processes?

Mon Aug 19, 2019 7:16 pm

blimpyway wrote:
Mon Aug 19, 2019 9:39 am
You may consider a generic messaging queue (eg mosquitto) to communicate state changes across processes.

I have to admit I am intimidated by the whole IoT complexity. I've watched Rosie the Red Robot as he ventured there, but I have kept my distance. After 40+ years of doing commercial software systems, I should not be hesitant about something as basic as a queue. Sometimes I just feel tired of learning, sometimes I get excited about learning - I've been working on learning OpenCV for the last few months, and I get distracted by these philosophical investigations.
blimpyway wrote:
there are a few good reasons for this approach:
- transparently spawn processes/modules across machines

One of the principles of design for my robot has been self-sufficiency first. I want to be sure my bot is fully utilizing its computational and sensor resources, even if it means having a slower response. Speech recognition, text-to-speech, and object recognition are faster, better, and easier using Google, but there is something in *me* that wants my bot to keep it local. (Probably as much a curse as NIH syndrome.)

blimpyway wrote:
- allows for debugging - by just subscribing to whatever message "keys" you can see state changes from a different machine than the robot's Pi

Hadn't thought about that one. Sort of "big brother" like, but for positive purposes.

blimpyway wrote:
- allows for nice interfaces/GUIs/visualisations on a PC away from the Pi
- logging/history/database inserts can also happen on a different machine, reducing the Pi's frequent SD card updates (and the risk of failure they bring)
- for such reasons, robot platforms like ROS are built upon a similar named-message-queue paradigm.

I admit I am a little jealous of the ROS visualization tools. I feel like I need my bot to have a library of basic capabilities before I add ROS.

blimpyway wrote:
From what I see of your variables, 10-20 messages/second should be more than enough to update state, with very little impact on the Pi's CPU cycles.

Thanks, that is very useful.
