This may or may not apply. Years ago I built a demo app that captured handwritten or printed information on the fly to categorize images. While I can't reveal everything, I did use Tesseract and Netpbm to dig through images for the information I was looking for. It was a fun project, since we used a simple camera with an Eye-Fi card that made it easy to take snaps of the product and automate the whole thing.

Netpbm let me do some greyscale preprocessing; that image then went to Tesseract, which spat out whatever text it could see. My app orchestrated the whole pipeline and then sifted through the results for the information I used to file the images from the camera. Again, I can't reveal much more than this, but I can share the gist of how the app worked and the tools it used. An article that popped up during my research is also worth reading.
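The gist above can be sketched in a few lines of Python. This is purely illustrative, not the original app's code: the Netpbm tool names (`jpegtopnm`, `ppmtopgm`) and the Tesseract CLI usage are real, but the product-code format (`AB-1234`) and all function names are assumptions I've made up for the example.

```python
import re
import shutil
import subprocess
from pathlib import Path


def ocr_image(image_path: Path) -> str:
    """Greyscale a JPEG with Netpbm, then OCR it with Tesseract.

    Requires the netpbm and tesseract command-line tools on PATH.
    """
    pgm_path = image_path.with_suffix(".pgm")
    # jpegtopnm decodes the JPEG; ppmtopgm reduces it to greyscale.
    with open(pgm_path, "wb") as pgm:
        decode = subprocess.Popen(
            ["jpegtopnm", str(image_path)], stdout=subprocess.PIPE
        )
        subprocess.run(
            ["ppmtopgm"], stdin=decode.stdout, stdout=pgm, check=True
        )
        decode.wait()
    # Tesseract writes its text output to <base>.txt.
    base = image_path.with_suffix("")
    subprocess.run(["tesseract", str(pgm_path), str(base)], check=True)
    return base.with_suffix(".txt").read_text()


def extract_product_code(ocr_text: str):
    """Sift OCR output for a product code; 'AB-1234' is an assumed format."""
    match = re.search(r"\b[A-Z]{2}-\d{4}\b", ocr_text)
    return match.group(0) if match else None


def file_image(image_path: Path, ocr_text: str, library_root: Path) -> Path:
    """File a snapshot into a folder named after the code found in it."""
    code = extract_product_code(ocr_text) or "unsorted"
    dest_dir = library_root / code
    dest_dir.mkdir(parents=True, exist_ok=True)
    return Path(shutil.copy2(image_path, dest_dir / image_path.name))
```

A watcher over the Eye-Fi upload folder would call `ocr_image` on each new snap, then `file_image` to sort it; images with no recognizable code land in an `unsorted` folder for manual review.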

Interesting. I've just read the abstract; I'll print it and follow up.
At a quick glance, it looks as if they built their own data set. The article also talks about Tesseract.
It also seems this was from 2009, so I'll need to look up more current articles.
I'll also keep searching for databases.
Thanks!
