Sunday, February 15, 2009

Face Detection and iPhone Video Streaming

I recently purchased a Linksys WVC54GCA WiFi camera. It's a wonderful little camera, and my primary reason for purchasing it was to be able to stream video to my iPhone: it outputs Motion JPEG, which is the only video option available in Mobile Safari. It actually works very well, though my intention is to eventually attach it to my iRobot Create to give it "vision." More on that later.
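For anyone curious what consuming an MJPEG feed looks like in code: the stream is just a sequence of JPEG images, so individual frames can be pulled out by scanning for the JPEG start-of-image (0xFFD8) and end-of-image (0xFFD9) markers. This is a minimal sketch, not the camera's exact wire format; real MJPEG-over-HTTP also interleaves multipart boundaries and headers, which this scanner simply skips over.

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;

public class MjpegFrameReader {

    /**
     * Returns the next complete JPEG frame from the stream, or null at
     * end of stream. Works because JPEG entropy-coded data escapes 0xFF
     * bytes, so the 0xFFD9 end marker only appears at the frame boundary.
     */
    public static byte[] nextFrame(InputStream in) throws IOException {
        ByteArrayOutputStream frame = new ByteArrayOutputStream();
        int prev = -1;
        int cur;
        boolean inFrame = false;
        while ((cur = in.read()) != -1) {
            if (!inFrame) {
                if (prev == 0xFF && cur == 0xD8) { // start-of-image marker
                    inFrame = true;
                    frame.write(0xFF);
                    frame.write(0xD8);
                }
            } else {
                frame.write(cur);
                if (prev == 0xFF && cur == 0xD9) { // end-of-image marker
                    return frame.toByteArray();
                }
            }
            prev = cur;
        }
        return null;
    }
}
```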


Anyway, despite the camera's strengths, it has a few limitations. First, it can only support four simultaneous clients, and performance degrades linearly with each additional client (from my anecdotal experience). Second, the only access control it offers is HTTP Basic Authentication backed by a four-user list configurable from its web interface, which doesn't integrate with any other application or security system. I figured the best and most direct fix was to proxy the feed through my Mac Pro and manage the connections and user access there instead of on the camera.
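The core of that proxy idea can be sketched in a few lines: a single reader thread holds the one connection to the camera and keeps the most recent frame in memory, while any number of downstream clients read that shared copy and never touch the camera. The class and method names below are illustrative, not the actual webapp's API.

```java
import java.util.concurrent.atomic.AtomicReference;

public class FrameCache {

    // Latest JPEG frame from the camera; AtomicReference gives us a
    // lock-free handoff between the one reader and many client threads.
    private final AtomicReference<byte[]> latest = new AtomicReference<>();

    /** Called by the single camera-reader thread for each new frame. */
    public void publish(byte[] jpegFrame) {
        latest.set(jpegFrame);
    }

    /** Called by each client connection; never contacts the camera. */
    public byte[] latestFrame() {
        return latest.get();
    }
}
```

With this shape, adding a fifth, tenth, or fiftieth viewer costs the camera nothing; only the proxy's outbound bandwidth grows.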


As I was imagineering this system, I also got the bright idea to go ahead and do face detection (not recognition, yet) on the stream. After doing some research on the technology, I decided to use the OpenCV libraries, developed by Intel and subsequently open sourced. My initial prototypes were extremely slow (1-2 FPS) because the Java bindings depended on JNI calls into a non-thread-safe C library. More digging turned up the Faint (Face Annotation Interface) library, which implements Haar detection in pure, multithread-able Java. (I had to take the beta code from SVN since it hadn't been released yet.) That finally got me a much more acceptable 10+ FPS.
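A quick aside on why Haar detection can run this fast at all: Viola-Jones-style detectors lean on the integral image (summed-area table), which lets the sum of any rectangle of pixels be computed with four table lookups regardless of the rectangle's size. This is a minimal sketch of just that core trick; the actual cascade evaluation in Faint or OpenCV is far more involved.

```java
public class IntegralImage {

    // sat[y][x] = sum of all pixels above and to the left, i.e. in [0,y) x [0,x)
    private final long[][] sat;

    public IntegralImage(int[][] gray) {
        int h = gray.length;
        int w = gray[0].length;
        sat = new long[h + 1][w + 1];
        for (int y = 0; y < h; y++) {
            long rowSum = 0;
            for (int x = 0; x < w; x++) {
                rowSum += gray[y][x];
                sat[y + 1][x + 1] = sat[y][x + 1] + rowSum;
            }
        }
    }

    /** Sum of pixels in the rectangle with top-left (x,y) and size w x h. */
    public long rectSum(int x, int y, int w, int h) {
        return sat[y + h][x + w] - sat[y][x + w] - sat[y + h][x] + sat[y][x];
    }
}
```

Each Haar feature is just a difference of two or three such rectangle sums, which is why thousands of features per window are still cheap.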


Now I have the camera stream cached within a custom-built Tomcat webapp that does the detection and also provides security for the stream. It can support many more than the four clients the camera allows, and without an FPS hit. It's pretty cool. Right now it just draws a red rectangle around each detected face, but obviously more triggers and actions are possible and desirable. It should definitely be noted that the stream (with face detection) is viewable from the iPhone! Now to just get the damned thing attached to my iRobot, and my little mobile sentry will be complete. :)
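The red-rectangle overlay step is plain java.awt work: decode the frame, draw on it, re-encode as JPEG, and push it back into the outgoing MJPEG stream. A minimal sketch, with illustrative coordinates standing in for the detector's output:

```java
import java.awt.BasicStroke;
import java.awt.Color;
import java.awt.Graphics2D;
import java.awt.image.BufferedImage;

public class FaceOverlay {

    /**
     * Draws a red rectangle around a detected face region directly on
     * the decoded frame. In the real webapp the (x, y, w, h) box would
     * come from the Haar detector for each frame.
     */
    public static void drawFaceBox(BufferedImage frame,
                                   int x, int y, int w, int h) {
        Graphics2D g = frame.createGraphics();
        try {
            g.setColor(Color.RED);
            g.setStroke(new BasicStroke(2f));
            g.drawRect(x, y, w, h);
        } finally {
            g.dispose(); // always release the graphics context
        }
    }
}
```

Since BufferedImage rendering works headless, this runs fine inside a server-side Tomcat webapp with no display attached.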




4 comments:

Student and Researcher said...

That sounds awesome. I had one question, though: you aren't doing the actual face detection on the iPhone, are you?

rammic said...

No, the face detection occurs on the Mac. The overlay rectangles are drawn there and included in the MJPEG stream sent to the iPhone.

Temitope said...

Well, my question is: how does one get an API for the faint.jar library? That's been my most difficult task.

topriddy@gmail.com

Eric Newcomer said...

So far as I can tell, the faint project only does Haar via JNI. I don't see an unreleased version like you mention on SourceForge.

They do have Eigenface in pure Java, though, but that is not what I'm after.