10 May 2014

I’ve not been very good at keeping this up to date. However, CSP coursework is now finished, which I’ll talk more about later.

So recently I procured a fairly big touchscreen, and given that I now have two Raspberry Pis with absolutely no idea what to use them for, I decided I should make them do things together. Ideally I want this to be a standalone unit: just the touchscreen and the Raspberry Pi running Raspbmc.

Now that XBMC Gotham is out, I decided to take advantage of its touchscreen support. A fresh install from Raspbmc seemed to do something, as the screen comes out of standby when I touch it, but nothing happens in the UI. Checking from a terminal, xbmc.bin has /dev/input/event# open, so it at least has the device.
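
As a sanity check, it’s easy enough to watch the raw events directly with a little C program. The event node below is a placeholder (I’d check /proc/bus/input/devices for the real number), but it should show whether the kernel is actually delivering touches:

```c
/* Minimal sketch: dump raw events from an evdev node to confirm the kernel
 * is delivering touches. The device path is an assumption for illustration. */
#include <fcntl.h>
#include <stdio.h>
#include <unistd.h>
#include <linux/input.h>

int main(void)
{
    const char *dev = "/dev/input/event2";   /* hypothetical event node */
    int fd = open(dev, O_RDONLY);
    if (fd < 0) {
        perror("open");
        return 1;
    }

    struct input_event ev;
    while (read(fd, &ev, sizeof(ev)) == sizeof(ev)) {
        if (ev.type == EV_ABS)
            /* EV_ABS carries the absolute X/Y position of the touch */
            printf("axis %d value %d\n", ev.code, ev.value);
        else if (ev.type == EV_KEY && ev.code == BTN_TOUCH)
            printf("touch %s\n", ev.value ? "down" : "up");
    }
    close(fd);
    return 0;
}
```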

In previous adventures with the touchscreen I’ve noted that the calibration is completely off, and so far I haven’t found a way to calibrate the screen from XBMC. At this point I realised I had no idea how input devices really work in Linux, so I poked around in the XBMC source tree and in xinput_calibrator to see whether I could figure it out. I’d used that tool in the past to roughly calibrate the touchscreen, but it needs some sort of windowing system to run, whereas I’d prefer XBMC to stay running standalone as it does in Raspbmc.
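
From what I can tell, calibration mostly boils down to finding out what raw range the touch axes report and mapping that linearly onto screen pixels. A rough sketch of the idea (the panel resolution and event node are assumptions, and the raw reading at the end is made up):

```c
/* Sketch: query the raw axis ranges via EVIOCGABS, then map a raw reading
 * onto a screen coordinate with a simple linear scale. */
#include <fcntl.h>
#include <stdio.h>
#include <unistd.h>
#include <sys/ioctl.h>
#include <linux/input.h>

#define SCREEN_W 1280   /* assumed panel resolution */
#define SCREEN_H 1024

static int scale(int raw, const struct input_absinfo *info, int screen_max)
{
    /* Linear mapping from [minimum, maximum] onto [0, screen_max) */
    return (raw - info->minimum) * screen_max / (info->maximum - info->minimum);
}

int main(void)
{
    int fd = open("/dev/input/event2", O_RDONLY);   /* hypothetical node */
    if (fd < 0) { perror("open"); return 1; }

    struct input_absinfo x, y;
    ioctl(fd, EVIOCGABS(ABS_X), &x);
    ioctl(fd, EVIOCGABS(ABS_Y), &y);
    printf("X range %d..%d, Y range %d..%d\n",
           x.minimum, x.maximum, y.minimum, y.maximum);

    /* Example: where a made-up raw reading of (2000, 1500) would land */
    printf("raw (2000,1500) -> (%d,%d)\n",
           scale(2000, &x, SCREEN_W), scale(1500, &y, SCREEN_H));

    close(fd);
    return 0;
}
```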

It looks like once Linux decides it has a valid driver, a module called usbtouchscreen handles all the touch configuration, so I’m cloning the Linux kernel source to see what it does. xinput_calibrator seems to be poking values in there to configure the touchscreen parameters, and I’m hoping to get to the point where I can calibrate from XBMC if I rebuild it.
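
I haven’t confirmed exactly where xinput_calibrator writes its values, but evdev does expose an EVIOCSABS ioctl for writing axis parameters back to a device, which looks like one possible way to adjust calibration without a windowing system. A sketch with made-up numbers (whether this is what xinput_calibrator actually does is an assumption on my part):

```c
/* Sketch: write adjusted axis limits back to the evdev device with EVIOCSABS.
 * The min/max values are invented; real ones would come from touching the
 * corners of the glass. Likely needs root. */
#include <fcntl.h>
#include <stdio.h>
#include <unistd.h>
#include <sys/ioctl.h>
#include <linux/input.h>

int main(void)
{
    int fd = open("/dev/input/event2", O_RDWR);     /* hypothetical node */
    if (fd < 0) { perror("open"); return 1; }

    struct input_absinfo x;
    if (ioctl(fd, EVIOCGABS(ABS_X), &x) < 0) { perror("EVIOCGABS"); return 1; }

    /* Tighten the reported range so the edges of the glass line up with
     * the edges of the picture (example numbers only). */
    x.minimum = 150;
    x.maximum = 3900;
    if (ioctl(fd, EVIOCSABS(ABS_X), &x) < 0) { perror("EVIOCSABS"); return 1; }

    close(fd);
    return 0;
}
```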

However, another problem that has arisen is that the touchscreen periodically detaches from Linux… I’m hoping this is just down to the Raspberry Pi not supplying it with enough power.
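
To see whether it really is dropping off the bus, I could leave something watching udev for add/remove events. A rough libudev sketch (link with -ludev; matching on the “usb” subsystem is a guess, “input” would also show the event node disappearing):

```c
/* Sketch: watch udev events to confirm the touchscreen is detaching. */
#include <stdio.h>
#include <sys/select.h>
#include <libudev.h>

int main(void)
{
    struct udev *udev = udev_new();
    struct udev_monitor *mon = udev_monitor_new_from_netlink(udev, "udev");

    /* Only care about USB devices appearing/disappearing */
    udev_monitor_filter_add_match_subsystem_devtype(mon, "usb", NULL);
    udev_monitor_enable_receiving(mon);
    int fd = udev_monitor_get_fd(mon);

    for (;;) {
        fd_set fds;
        FD_ZERO(&fds);
        FD_SET(fd, &fds);
        /* The monitor socket is non-blocking, so wait on it with select() */
        if (select(fd + 1, &fds, NULL, NULL, NULL) <= 0)
            continue;

        struct udev_device *dev = udev_monitor_receive_device(mon);
        if (!dev)
            continue;
        printf("%s: %s\n",
               udev_device_get_action(dev),          /* "add" or "remove" */
               udev_device_get_devpath(dev));
        udev_device_unref(dev);
    }
}
```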

To be continued…