BBTouch is now xTouch

Hello!

So, it has been a very long time since the last big update to BBTouch. But thanks to Sandor yet again for finding some budget to do some cool projects and including me in those plans. The past year or so I have been immersed in multi-touch apps on a small scale (i.e. the iPhone). This has kept me quite busy, so I am excited to be able to do some big MT stuff again.

Anyway, enough boring crap about me, let’s talk about BBTouch. BBTouch is great, and it was excellent fun to write and design and work out all the issues. It is still quite a good little tracking app; however, there are some known issues. The big one is a crashing bug that crops up only after you have been running the tracker for a long time (over 20 hours). This has been an issue for a while and I have been meaning to fix it.
Some other things:

  • Multi-Cam support: BBTouch was designed with a single camera in mind, and that architecture is fairly deep. It would not be impossible to make BBTouch a multi-cam system, but it would be quicker to build another tracker.
  • Image Filtering: In the early days, before I realized anyone would actually use BBTouch for more than a lark, I wanted to build it so that it would be lightning fast and not require any complicated filtering. However, once people actually started using it for things like professional installs and conferences, it became apparent that I needed to step up a bit. I added OpenCV filters in the last big update, but they are very, very kludgy.
  • Fiducials: Similarly to multi-cam, fiducials would be a bit of an architectural modification, and now that I have this big hacky patch that is the filtering system, adding fiducials would mean even more work.

So! What is the solution? A new tracker. I have been slowly working on a bunch of changes to BBTouch over the past few months in my limited free time.

[xTouch logo]

The changes started to add up, and Sandor and I had been thinking about renaming BBTouch to xTouch for a while. The name ‘BBTouch’ was meant to be a temporary moniker at the outset (I name all my prototype stuff BBSomething; this encourages me to think of better names later). But instead of just a rename, I decided to make xTouch a new tracker.

What were my new design criteria?

  • First and foremost: reuse as much of the BBTouch code as possible. I am a lazy, lazy programmer, and I don’t like reinventing the wheel again.
  • GUI: cut it down. The original BBTouch is way too flashy. At the time I built BBTouch, I wanted to add stuff that made it all cool looking, so when people asked me what the hell the giant odd table in my living room was I could have something to show them. Now, however, I have lots of TUIO-enabled apps that are much much cooler, so I can just use those like everyone else. So I sat down and figured out the absolute minimum amount of functionality necessary for a tracker configuration and came up with this:
    [Screenshot: xTouch main window]
    That’s it.
  • Simplicity: Along with the GUI trim I also systematically went through all the code and refactored and removed all the extra crap. xTouch currently has about half as many lines of code as BBTouch. (Presumably once I add in multi-cam and fiducial support, it will be more on par with the current BBTouch.)
  • Modularity: The BBTouch architecture was basically grown from scratch. I didn’t really know what I was doing (and still don’t!), but I have a much better idea now of what needs to go where and how to push all that data around a bit more efficiently. All the new code is much better OO, and the various components are much more loosely coupled than they were in BBTouch. This should aid the process when I can get around to adding fiducial support and multi-cam stuff.
  • Multithread support: At the start of BBTouch I was developing mostly on my G4 PowerBook. Somewhere along the line I upgraded to the MacBook Pro, which is dual-core, but much of the single-core mentality is still in the original design of BBTouch. BBTouch is very linear in the way it processes each frame. xTouch has a separate thread for each camera and yet another thread for each blob detector, which makes it much speedier on multi-core machines (there is a rough sketch of this layout just after this list).
  • Non-obsolete camera support: This basically meant abandoning the old Sequence Grabber code, which has been deprecated for a while now, and going with the newer QTKit capture stuff. This makes the frame grabbing code about 500 lines shorter, but the downside is that you do lose the ability to futz with the various old camera settings like exposure and focus.
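
To make that thread layout a bit more concrete, here is a rough sketch of the idea in Python. xTouch itself is Cocoa/Objective-C, so none of these class or function names come from the actual code; this is just the shape of the design: each camera gets its own grabbing thread that pushes frames onto a queue, and each blob detector gets its own thread that drains that queue and hands the results off for TUIO distribution.

    # Rough sketch only: thread-per-camera and thread-per-blob-detector.
    # All names here are made up for illustration; the real xTouch is Objective-C.
    import queue
    import threading

    class CameraThread(threading.Thread):
        def __init__(self, grab_frame, frame_queue):
            super().__init__(daemon=True)
            self.grab_frame = grab_frame    # callable that blocks until a frame arrives
            self.frames = frame_queue

        def run(self):
            while True:
                self.frames.put(self.grab_frame())   # hand each frame off and keep grabbing

    class BlobDetectorThread(threading.Thread):
        def __init__(self, frame_queue, detect_blobs, on_blobs):
            super().__init__(daemon=True)
            self.frames = frame_queue
            self.detect_blobs = detect_blobs   # filter chain + blob detection for one camera
            self.on_blobs = on_blobs           # e.g. queue the blobs up for the TUIO sender

        def run(self):
            while True:
                frame = self.frames.get()          # wait for the camera thread
                self.on_blobs(self.detect_blobs(frame))

    # One queue per camera, so cameras never stall each other:
    #   q = queue.Queue(maxsize=2)
    #   CameraThread(my_grabber, q).start()
    #   BlobDetectorThread(q, my_detector, my_tuio_sender).start()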

So! Let’s have a look, shall we?

Main Interface

[Screenshot: the xTouch main interface at load]

We already saw the minimal main interface; here it is at load. If you are familiar with BBTouch, then this is a stark contrast. Not that BBTouch was ugly per se, but there was much more going on. Also, with BBTouch, in order to get the tracker going you had to load it up and then turn on blob detection. xTouch is a bit more clever: it loads up, grabs a few background shots, and starts detecting and sending TUIO commands right away. This means that you can open it remotely and restart it remotely if you need to.

When you first set up your surface with xTouch, you just click the buttons from left to right. Simple.

Camera Selection

[Screenshot: camera selection window]

Hitting the ‘Cam’ button brings up a little viewer window that allows you to select which camera you are configuring. For now, this is your only camera. Later this window will allow you to pick multiple cameras and configure them separately.

Filter Config

[Screenshot: filter configuration window]

Step 2 in the process is filter configuration. In the early beta version this is tailored to DI setups, since that is what I use and what Sandor uses, but it is very easy to add new filter modules, which is mostly just a matter of swapping in a few OpenCV filters here and there.

Here you have the choice to see the filtered image at various stages of the filter process and tweak the various filter parameters for your particular setup. The way this all works in the code is much improved over the old way, which was really a hack. Now the filter chain is a module that you can easily switch out, so making an FTIR-specific filter or whatever will be much easier than before.
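
To give a feel for what such a module boils down to, here is a rough Python/OpenCV sketch of a DI-style chain: subtract a stored background shot, smooth away camera noise, then threshold so only the bright touch blobs survive. The function name and parameter values are made up for illustration; they are not the actual xTouch filters, which live in Objective-C.

    # Illustrative DI-style filter chain using OpenCV from Python (not xTouch's actual code).
    import cv2

    def di_filter_chain(frame_gray, background_gray, blur_kernel=5, thresh=40):
        # 1. Background subtraction: keep only what changed since the reference shot.
        diff = cv2.absdiff(frame_gray, background_gray)
        # 2. Blur to knock down sensor noise before thresholding.
        smoothed = cv2.GaussianBlur(diff, (blur_kernel, blur_kernel), 0)
        # 3. Threshold: anything brighter than `thresh` becomes part of a white blob.
        _, binary = cv2.threshold(smoothed, thresh, 255, cv2.THRESH_BINARY)
        # Return the intermediate images too, like the per-stage views in the config window.
        return diff, smoothed, binary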

Some quick stats from the beta version:

On my MacBook Pro (2.33 GHz Core Duo) the filters run at about 120 fps and the blob detection (with ten blobs) runs at about 150 fps on average. So, all up from frame capture to TUIO distribution: about 66 fps. (And of course, the camera only runs at 30 fps, so there is lots of time to use your cores for 3D imaging, or whatever it is that you are doing with your touches.)

On my Mac Pro, the filters run slightly faster at 160 fps and the blob detection runs at a staggering 280 fps on average. This adds up to a bit over 100 fps. So, much better than BBTouch :-)

(Note: the above metrics were with 640×480 input images. At 320×240, holy shit is it fast; from capture to TUIO: 250 fps.)
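
(As a back-of-the-envelope check on those numbers: if you assume the per-frame cost is simply the filter time plus the blob detection time, the combined rates fall out like this.)

    # Back-of-the-envelope: two stages run back to back on each frame.
    def combined_fps(filter_fps, blob_fps):
        return 1.0 / (1.0 / filter_fps + 1.0 / blob_fps)

    print(round(combined_fps(120, 150), 1))  # 66.7  -> the MacBook Pro figure
    print(round(combined_fps(160, 280), 1))  # 101.8 -> the Mac Pro figure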

Mesh Config

Here is where it gets more interesting. Those of you familiar with the way BBTouch worked will either love this or hate it. For BBTouch I had sort of settled on a hybrid configuration style that was most closely patterned after the Reactable folks’ approach. It worked very well, and was quite accurate, but the downside was that it was a colossal pain in the ass: lots of dragging of vertexes around the screen, having to take a screenshot with the IR filter off the camera... ugh. So I finally took the hint and moved towards a more Touche/tBeta/touchlib approach where you just touch a bunch of points on the screen.

[Screenshot: mesh configuration]

Here is the process: First you define the projection bounds by using the keyboard to shift a big box around the screen.

[Screenshot: mesh vertexes]

Next you decide how many points you want; the more the merrier. (Actually, the wider the angle of your camera lens, the more vertexes you will want. I have a very wide-angle lens on my Unibrain Fire-i, and while I was building the config wizard I was only using 6 vertexes, because, really, when you have to go through the config wizard a zillion times during testing, let me tell you, you don’t want to have to hit 40 points each time. I was really surprised how far off the points were in the center of the mesh boxes, where there will be the most interpolation going on. So, the moral of this story: when you are configging for real, just add a few more vertexes and take the extra 30 seconds; it makes it all so much nicer in the end.)

After that you go into the touch-each-point mode where you put your finger on the point till it turns green, then move on. Easy-peasy.
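
To see why more vertexes help, here is roughly what happens to a touch once the mesh is calibrated: the touch lands in one of the mesh boxes, and its screen position is interpolated from that box’s four calibrated corners. The sketch below is a plain bilinear blend with made-up names, not the actual xTouch code, but it shows why a big box over a distorted part of the lens drifts the most in the middle.

    # Bilinear blend inside one mesh box (illustrative names, not actual xTouch code).
    # (u, v) is the touch's fractional position within the box, each in the range 0..1;
    # the four corners are that box's calibrated screen-space vertexes.
    def interpolate_in_box(u, v, top_left, top_right, bottom_left, bottom_right):
        def lerp(a, b, t):
            return (a[0] + (b[0] - a[0]) * t, a[1] + (b[1] - a[1]) * t)
        top = lerp(top_left, top_right, u)            # blend along the top edge
        bottom = lerp(bottom_left, bottom_right, u)   # blend along the bottom edge
        return lerp(top, bottom, v)                   # blend between the two edges

    # The bigger each box is, the more this straight-line blend has to paper over
    # the lens distortion inside it, hence the drift in the middle with only 6 vertexes.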

[Screenshot: touching the calibration points]

Next! Testing! Everyone’s favorite bit. Just hit the ‘t’ key once all the points are green and you can test out your new mesh. Exciting! (blue circles == touch points)

TUIO Config

[Screenshot: TUIO configuration]

Last but not least, TUIO. xTouch is designed for TUIO out of the gate. In BBTouch, to be quite honest, when I started I hadn’t really thought that far ahead. It wasn’t until later, when it sort of ‘became’ something, that I finally added TUIO support.

It is still only at TUIO 1.0 support (because it is mostly just copied over from the BBTouch code), but I plan to add TUIO 2.0 stuff at some point. Anyhow, unlike BBTouch, xTouch kicks off the TUIO right at startup. Again, no going into the app and turning everything on first.
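
For anyone who has not dug into TUIO 1.0: every frame, the tracker sends a small OSC bundle on the 2Dcur (cursor) profile, normally over UDP to port 3333. The sketch below just shows what goes into that bundle; the names are illustrative Python, not xTouch code, and the actual OSC encoding and sending are left out.

    # The three /tuio/2Dcur messages a TUIO 1.0 tracker sends each frame (contents only).
    def tuio_cursor_frame(cursors, frame_seq):
        """cursors: {session_id: (x, y, vel_x, vel_y, accel)}, positions normalized to 0..1."""
        messages = []
        # 'alive' lists the session ids of every cursor currently on the surface.
        messages.append(["/tuio/2Dcur", "alive"] + list(cursors.keys()))
        # One 'set' message per cursor: position, velocity and motion acceleration.
        for sid, (x, y, vx, vy, accel) in cursors.items():
            messages.append(["/tuio/2Dcur", "set", sid, x, y, vx, vy, accel])
        # 'fseq' carries the frame sequence number so clients can discard stale bundles.
        messages.append(["/tuio/2Dcur", "fseq", frame_seq])
        return messages   # these get packed into one OSC bundle and sent over UDP

    # Example: one touch sitting near the middle of the surface.
    # tuio_cursor_frame({12: (0.5, 0.5, 0.0, 0.0, 0.0)}, frame_seq=431)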

So! Anyway, this is just a teaser for xTouch really... no code yet; Sandor and I need to do some more testing before I unleash it on the world.

Cheers!
-B
