Cameras on Android are a bit tricky because they also interact with graphics drivers for memory allocation, etc. Again, a kind reminder that the Raspberry Pi is not an Android device to begin with, so there's no support from the chipset/device manufacturer (Broadcom/Pi Foundation) when it comes to Android-specific hardware drivers (hw graphics acceleration, hw video dec/enc, camera, etc.). Yes, there are still various issues with the camera, and it's listed under known issues. Camera performance depends on the app, and the stock camera app is not currently working.

jacob.kruger.work wrote: ↑Tue Mar 30, 2021 7:34 am

Konsta, firstly, thank you very much for your efforts.
Now, to explain where I'm coming from, here is my form of a progress report on trying to get this version of Lineage running on a Raspberry Pi 4B 8GB model, in fully accessible mode:
One remaining question. I have otherwise gotten it up and running, and it operates more or less perfectly in terms of interaction, with the TalkBack screen reader (Android Accessibility Suite) reading feedback out to me using various voices, etc. The stock camera app just crashes, but the primary reason I tried this in the first place, augmented reality for the blind, The vOICe for Android (https://seeingwithsound.com/android.htm), actually runs well enough, though with slightly different reaction times to camera scene changes depending on the USB camera. And, half of my reason for posting this response: the latest test camera, an endoscope camera, seems to respond better if it's connected to one of the USB 2 ports. That makes some sense if it's just a USB 2 device, since the unit then isn't asked to handle any additional processing for operating via USB 3. But while this piece of software works quite well, some other AI-based object and text recognition apps I have tried just seem to lock up, or provide no feedback at all.
Some of them make background use of online requests, so, since I currently have the unit tethered to a mobile hotspot via wifi, I could expect that to add some lag. But, for example, even when I ask some of them to process an image file from storage, they might bomb out, which makes me think it's not just related to requesting camera input?
In other words, is it possible that either whatever form of GPU-type processing is taking place, or Lineage's own framework, could be causing image processing issues?
Besides that, would you guys recommend working with a fully wireless Bluetooth keyboard while I have the camera connected to a USB socket, so as to free up even more processing power? And one more question: what if I want to offer, or work with, microphone voice input? Besides tethering hardware to GPIO pins, etc., I would fully expect attaching something like an external USB audio adapter to cause even more hassles?
Part of the reason I'm asking about all these simple plug-and-play options will make sense if you check out my report page: I'm trying to figure out a way to offer guys a cheap, easy way to put together a relatively simple and portable all-in-one assistive technology gadget that doesn't necessarily require them to walk around with an Android handset in one hand, etc.
I'm not aware of any reason why there would be a difference between using a USB 2.0 or a USB 3.0 port. Different UVC cameras might support different resolutions and use different pixel formats. I've included the v4l2-ctl tool in the latest releases; you can use it to dig some information out of the connected camera (e.g. 'v4l2-ctl -d /dev/video0 --all').
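For reference, a few v4l2-ctl invocations that are useful for comparing cameras (run from a root shell on the device; `/dev/video0` is assumed to be the UVC camera, the node number may differ on your setup):

```shell
# List all video devices and which /dev/videoN nodes belong to them
v4l2-ctl --list-devices

# Dump driver info, current format, and controls for the first camera
v4l2-ctl -d /dev/video0 --all

# List every pixel format, resolution, and frame rate the camera advertises;
# cameras that only offer compressed formats (e.g. MJPG) behave differently
# from ones that also offer raw YUYV
v4l2-ctl -d /dev/video0 --list-formats-ext
```

Comparing the `--list-formats-ext` output between the endoscope and your other cameras should show whether they differ in supported formats or frame rates, which would explain differing reaction times better than the USB port used.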
IMO a regular Android phone/tablet is probably better suited for your use case when it comes to portability/size/battery life/etc. Recent Android devices should also support USB webcams, but that depends on whether the device manufacturer has implemented the external camera HAL (https://source.android.com/devices/came ... sb-cameras).
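If you want to check whether a given Android device declares external camera support before buying it, one quick (hedged) way is to look for the `android.hardware.camera.external` system feature over adb; devices whose manufacturer implemented the external camera HAL typically declare it:

```shell
# Requires adb and a connected device with USB debugging enabled.
# Prints the feature line if the device declares external (USB) camera support;
# prints nothing if it doesn't.
adb shell pm list features | grep android.hardware.camera.external
```

Note that the absence of this feature flag doesn't always mean a webcam won't work at all, but its presence is a good sign the stock camera stack will expose USB cameras to apps.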