Distro
Posts: 7
Joined: Wed Apr 29, 2015 9:13 am

Programming a Touch Screen via Python and I2C port.

Fri May 29, 2015 7:58 am

Hi everyone,

I am currently building a simple project: connecting and setting up a capacitive multi-touch LCD assembly (display plus touch) on an RPi 2.

The first step, the display itself, is already solved: I used an LCD-to-HDMI converter, configured a couple of things on the RPi, and it's done.

Regarding the touch, I have connected the screen via the I2C port. Using i2c-tools I detected the peripheral on the bus; with Python I process the data it sends when a touch event is sensed on the digitizer and, finally, I generate a mouse click when and where the touch is detected. Here is the code:

Code: Select all

import smbus
import time
import subprocess

bus = smbus.SMBus(1)

# To find your screen's I2C address, type on the terminal: i2cdetect -y 1
addr = 0x38        # I2C address of the touch controller.
clicked = False    # State flag for the click algorithm.

while True:
    try:
        # Read the registers we want.
        # In this example only 1 touch point is implemented.
        rg3 = bus.read_byte_data(addr, 0x02)    # Number of active touch points.
        rg4 = bus.read_byte_data(addr, 0x03)    # Horizontal zone of point 1: 128-132
        rg5 = bus.read_byte_data(addr, 0x04)    # Horizontal value of point 1 within that zone: 0-255
        rg6 = bus.read_byte_data(addr, 0x05)    # Vertical zone of point 1: 0-3
        rg7 = bus.read_byte_data(addr, 0x06)    # Vertical value of point 1 within that zone: 0-255

        # NOT USED IN THIS PROGRAM (no multitouch):
        # rg10 = bus.read_byte_data(addr, 0x09)    # Horizontal zone of point 2: 128-132
        # rg11 = bus.read_byte_data(addr, 0x0A)    # Horizontal value of point 2 within that zone: 0-255
        # rg12 = bus.read_byte_data(addr, 0x0B)    # Vertical zone of point 2: 0-3
        # rg13 = bus.read_byte_data(addr, 0x0C)    # Vertical value of point 2 within that zone: 0-255

        if rg3 == 1 and rg4 >= 128 and not clicked:
            # Enter only if 1 touch point is detected inside the digitizer's active area.
            horiz = (rg4 - 128) * 255 + rg5    # Point 1 horizontal coordinate.
            vert = rg6 * 255 + rg7             # Point 1 vertical coordinate.

            subprocess.call(["xdotool", "mousemove", str(horiz), str(vert)])    # Move mouse to the touch point.
            subprocess.call(["xdotool", "click", "1"])    # Perform a mouse click where the mouse is placed.
            clicked = True
        elif rg3 == 0 and clicked:    # No touch points are detected any more.
            clicked = False

        time.sleep(0.01)    # Delay before repeating the main loop.

    except IOError:    # Leave the loop if the I2C read fails.
        break
I know it's not the best mouse simulation, but at least it fulfils the task :D .

Anyway, now I want to make this touch screen system more realistic, with 2 touch points. Why? For example, I have designed an on-screen keyboard in Java, and I would like to test whether the "Shift" button I programmed works properly. So, how could I do it?

Here are all the commands that can be used with the xdotool command-line tool:
http://www.semicomplete.com/projects/xd ... e_commands

I thought of using something like "mousedown" for the first point and "click" for the second one. Any good ideas?
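As a starting point, here is a minimal sketch of decoding both touch points from the raw registers, assuming the point-2 registers (0x09-0x0C) use the same zone/value scheme as point 1 (the helper names are hypothetical, not from any library):

```python
def decode_point(zone, value):
    """Convert a (zone, value) register pair into a pixel coordinate.
    Horizontal zones are offset by 128, vertical zones start at 0,
    following the same scheme as the single-touch code above."""
    if zone >= 128:          # horizontal register: remove the 128 offset
        zone -= 128
    return zone * 255 + value

def decode_touches(regs):
    """regs: mapping of register number -> byte value (0x02-0x0C).
    Returns a list of (x, y) tuples, one per active touch point."""
    n = regs[0x02]           # number of active touch points
    points = []
    if n >= 1:
        points.append((decode_point(regs[0x03], regs[0x04]),
                       decode_point(regs[0x05], regs[0x06])))
    if n >= 2:
        points.append((decode_point(regs[0x09], regs[0x0A]),
                       decode_point(regs[0x0B], regs[0x0C])))
    return points
```

One caveat: X has only a single pointer, so `xdotool mousedown` at the first point followed by a move and a `click` at the second becomes a drag, not two independent fingers. For testing the Shift key specifically, a possible workaround is holding the modifier with `xdotool keydown shift` / `keyup shift` while clicking the other key, but that bypasses your on-screen button rather than pressing it.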

Thanks to all in advance. Regards.

paddyg
Posts: 2555
Joined: Sat Jan 28, 2012 11:57 am
Location: UK

Re: Programming a Touch Screen via Python and I2C port.

Sun May 31, 2015 9:56 am

No idea on this specific setup, and the stuff I've done used kivy, so most details were already taken care of, but one approach would be to keep track of how many simultaneous touches there are by incrementing and decrementing a counter: mousedown increases the number of touches, mouseup decreases it. I tested for a 'click' by looking at the length of time between the mouseup and the last mousedown event. You can do a similar thing for double and triple clicks by keeping track of more previous events further back in time.
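To illustrate the counter idea (a sketch only, not tied to any particular toolkit; the class name and the 0.25 s threshold are my own assumptions):

```python
import time

CLICK_MAX_S = 0.25   # assumed maximum press-to-release gap for a "click"

class TouchTracker:
    """Count simultaneous touches and detect clicks by press/release timing."""

    def __init__(self):
        self.active = 0          # number of fingers currently down
        self.last_down = None    # timestamp of the most recent touch-down

    def down(self, now=None):
        """Register a touch-down event."""
        self.active += 1
        self.last_down = now if now is not None else time.time()

    def up(self, now=None):
        """Register a touch-up event; return True if it completes a click."""
        now = now if now is not None else time.time()
        self.active = max(0, self.active - 1)
        # A short press-to-release interval counts as a click.
        return self.last_down is not None and (now - self.last_down) <= CLICK_MAX_S
```

For double or triple clicks you would keep a short list of previous release timestamps and compare the gaps between them against a second threshold.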

If you need to do anything more complicated (e.g. zooming, rotating, etc.) you will have to keep track of which touch point is moving by looking at the locations of the mouse events, possibly with a push/pop list of touch point objects holding a bit more info.
also https://groups.google.com/forum/?hl=en-GB&fromgroups=#!forum/pi3d
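One way to sketch that tracking (a hypothetical helper, not from any library: each new position is matched to the nearest previously known touch, and unmatched positions become new fingers):

```python
def match_touches(known, new_positions, max_dist=80):
    """Match new (x, y) positions to previously known touch points.

    known: dict of touch id -> (x, y) from the previous frame.
    Returns a dict of touch id -> (x, y); positions farther than
    max_dist pixels from every known touch get fresh ids (new fingers).
    """
    updated = {}
    remaining = dict(known)              # candidates not yet matched
    next_id = max(known, default=-1) + 1
    for pos in new_positions:
        best = None                      # (touch id, squared distance)
        for tid, old in remaining.items():
            d2 = (pos[0] - old[0]) ** 2 + (pos[1] - old[1]) ** 2
            if d2 <= max_dist ** 2 and (best is None or d2 < best[1]):
                best = (tid, d2)
        if best is not None:
            updated[best[0]] = pos       # same finger moved
            del remaining[best[0]]
        else:
            updated[next_id] = pos       # a new finger appeared
            next_id += 1
    return updated
```

With stable ids per finger you can then compute gestures, e.g. the change in distance between two tracked points gives you a zoom factor.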
