Teat detection and tracking from Kinect depth data
Hi everyone, I am designing a milking robot prototype for educational purposes and I need a Kinect expert. I want to use the Kinect to detect the udder and teats of a cow. I need the coordinates of the four teats in real time for my Arduino robot. The software should:
1-Work in any light condition (depth mode)
2-Detect the udder and teats from any position and orientation
3-Visualize the colour image, infrared and depth streams
4-Provide an API for sending data to the robot; the robot can request the 3D coordinates of any teat so it can position itself
5-Offer a teat-selection and learning capability in the application's image view
6-Detect any dropped liner
7-Be delivered as full source code and a functional, stable version
8-Support Kinect near mode
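To give a sense of what "3D coordinates" from a depth frame means: a depth pixel can be back-projected to camera-space X, Y, Z with the pinhole model. This is a minimal sketch; the intrinsics below are approximate Kinect v1 values I am assuming for illustration (a real build would calibrate the sensor or use the SDK's coordinate mapper instead).

```python
import numpy as np

# Approximate Kinect v1 depth-camera intrinsics (assumed values;
# calibrate your own sensor for real use).
FX, FY = 585.6, 585.6   # focal lengths in pixels
CX, CY = 316.0, 247.6   # principal point

def depth_pixel_to_3d(u, v, depth_mm):
    """Back-project depth pixel (u, v) with depth in mm to camera-space
    (X, Y, Z) in metres using the pinhole model."""
    z = depth_mm / 1000.0
    x = (u - CX) * z / FX
    y = (v - CY) * z / FY
    return np.array([x, y, z])

# A pixel at the principal point maps straight down the optical axis.
print(depth_pixel_to_3d(316.0, 247.6, 800))  # → [0.  0.  0.8]
```

These camera-space coordinates are what the API in requirement 4 would hand to the Arduino robot.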
The udder and teats of a real cow can have various shapes and sizes: similar, but not identical. And the animal can move constantly. You can use the Kinect SDK, OpenCV, OpenNI, etc.
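As a starting point for the detection itself, here is a deliberately simplified sketch of one possible approach: if the sensor faces the udder so that teat tips are among the closest valid surface points, candidates can be found as suppressed minima in the depth frame. This is my own toy illustration on a synthetic frame, not a production detector; real code would read the Kinect depth stream and likely use OpenCV blob/contour analysis instead.

```python
import numpy as np

def find_teat_tip_candidates(depth_mm, max_range_mm=1200, n_tips=4):
    """Toy teat-tip search on one depth frame.

    Assumes teat tips are among the closest valid pixels. Returns up to
    n_tips (row, col, depth_mm) candidates, suppressing a neighbourhood
    around each pick so the same tip is not reported twice.
    """
    d = depth_mm.astype(float).copy()
    d[(d <= 0) | (d > max_range_mm)] = np.inf   # mask invalid / far pixels
    tips = []
    for _ in range(n_tips):
        idx = np.unravel_index(np.argmin(d), d.shape)
        if not np.isfinite(d[idx]):
            break                                # nothing close enough left
        tips.append((int(idx[0]), int(idx[1]), int(depth_mm[idx])))
        r, c = idx
        d[max(0, r - 10):r + 11, max(0, c - 10):c + 11] = np.inf
    return tips

# Synthetic frame: background at 1500 mm with two "teats" at 700/750 mm.
frame = np.full((120, 160), 1500, dtype=np.uint16)
frame[60, 40] = 700
frame[60, 100] = 750
print(find_teat_tip_candidates(frame))  # → [(60, 40, 700), (60, 100, 750)]
```

In practice the viewpoint assumption will often not hold, which is exactly why the varied real captures mentioned below matter for testing.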
I can provide real Kinect depth captures of the teats and udders of different animals for testing. As each teat is different, you could implement a learning function for each teat. In the future I also want to implement a function that identifies the animal from its udder and teat shape.
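One way the per-teat learning function could be structured: store one descriptor per teat (for example a flattened depth patch around the tip) and label new observations by nearest stored descriptor. The class and descriptor format below are hypothetical, just to make the idea concrete; a serious version would use a proper feature representation and could extend to whole-udder animal identification.

```python
import numpy as np

class TeatLibrary:
    """Toy per-teat learning store (hypothetical design, not from the post):
    keep one reference descriptor per named teat and identify new
    observations by nearest-neighbour distance."""

    def __init__(self):
        self.names, self.descs = [], []

    def learn(self, name, descriptor):
        # Register a reference descriptor for one teat.
        self.names.append(name)
        self.descs.append(np.asarray(descriptor, dtype=float))

    def identify(self, descriptor):
        # Return the name of the closest stored descriptor.
        d = np.asarray(descriptor, dtype=float)
        dists = [np.linalg.norm(d - ref) for ref in self.descs]
        return self.names[int(np.argmin(dists))]

lib = TeatLibrary()
lib.learn("front-left", [0.9, 0.2, 0.1])
lib.learn("front-right", [0.1, 0.8, 0.7])
print(lib.identify([0.85, 0.25, 0.15]))  # → front-left
```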
I honestly don't know how much time or money a project of this kind would take. I would like to hear your opinion; I'm open to suggestions.
Additionally, the software should also detect the teat cups that I am using in the milking robot. These always have the same shape and size. I need the software to provide their 3D coordinates so they can be guided to the teats.
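Because the cups have a fixed shape and size, template matching is a natural fit for this part. Below is a minimal sum-of-squared-differences matcher as a stand-in for `cv2.matchTemplate` (which is what I would expect real code to use), shown on a synthetic scene; the shapes and values are made up for illustration.

```python
import numpy as np

def match_cup_template(depth, template):
    """Locate a fixed-shape object in a depth frame by exhaustive
    sum-of-squared-differences template matching. Returns the (row, col)
    of the best-matching top-left corner."""
    th, tw = template.shape
    h, w = depth.shape
    best, best_pos = np.inf, (0, 0)
    for r in range(h - th + 1):
        for c in range(w - tw + 1):
            ssd = np.sum((depth[r:r + th, c:c + tw] - template) ** 2)
            if ssd < best:
                best, best_pos = ssd, (r, c)
    return best_pos

# Synthetic example: plant the "cup" template into a flat background.
template = np.arange(9.0).reshape(3, 3)
scene = np.full((10, 12), 50.0)
scene[4:7, 5:8] = template
print(match_cup_template(scene, template))  # → (4, 5)
```

The matched pixel position can then be back-projected to a 3D coordinate the same way as the teat positions, so the robot can guide each cup onto its teat.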