Let’s say you have a row of grapes that is representative of average yields across your vineyard. By counting clusters in that row, you get a quick measure of cluster density over that row’s length. That measure can then be scaled up to estimate cluster counts across the entire vineyard area.
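To make that scaling step concrete, here is a back-of-the-envelope calculation with made-up numbers; your row length, cluster count, and total row length will differ:

```python
# Hypothetical numbers for illustration only -- substitute your own measurements.
row_length_m = 100          # length of the row you counted, in meters
clusters_in_row = 420       # cluster count from the video analysis
total_row_length_m = 12000  # combined length of all rows in the vineyard

clusters_per_meter = clusters_in_row / row_length_m
vineyard_estimate = clusters_per_meter * total_row_length_m
print(f"Estimated clusters in vineyard: {vineyard_estimate:,.0f}")  # ~50,400
```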
To help with this process, we have developed open resources that take a cellphone video as input and return a cluster count as output. We have written a public Google Colaboratory (Colab) notebook that can detect, track, and count flower clusters in video files, making it possible to get a rough count of clusters over a given distance in a vineyard. Here we’ll walk you through the steps to use our Colab notebook, a free tool hosted by Google, to upload and analyze your own cluster videos using one of our pre-trained models. The notebook also works with custom object detection models.
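To give a sense of what such a pipeline involves, here is a simplified detect-and-count sketch, assuming the Roboflow Python SDK and OpenCV (`pip install roboflow opencv-python`). The API key, workspace name, and filenames are placeholders, and this is not the notebook’s actual code; in particular, the notebook adds tracking so a cluster that appears in many frames is only counted once.

```python
# A simplified sketch of a per-frame detection loop (placeholder identifiers).
import cv2
from roboflow import Roboflow

rf = Roboflow(api_key="YOUR_API_KEY")                       # your Roboflow API key
project = rf.workspace("your-workspace").project("nighttime-grape-flower-clusters")
model = project.version(1).model

cap = cv2.VideoCapture("vineyard_row.mp4")                  # your cellphone video
frame_step = 30                                             # sample ~1 frame/second at 30 fps
frame_idx, total_detections = 0, 0

while True:
    ok, frame = cap.read()
    if not ok:
        break
    if frame_idx % frame_step == 0:
        cv2.imwrite("frame.jpg", frame)                     # hosted inference takes an image path
        result = model.predict("frame.jpg", confidence=40, overlap=30).json()
        total_detections += len(result["predictions"])
    frame_idx += 1
cap.release()

# Summing per-frame detections double-counts clusters seen in multiple frames;
# tracking (as in the Colab notebook) is what turns detections into a true count.
print(f"Raw per-frame detections (not deduplicated): {total_detections}")
```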
Getting Started
To use our video cluster counter, you will need a few things:
A Google Account
Colab notebooks are stored in and opened from your Google Drive, so a Google account is required.
A video file
This is the actual video of your grapes. We will cover how to capture it in the next section; a minimal example of getting a video into the Colab notebook follows this list.
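As a general reference, the two standard ways to get a file into any Colab session are uploading it from your computer or mounting Google Drive; the filename and Drive path below are examples only:

```python
# Run inside a Colab cell.
from google.colab import drive, files

# Option 1: upload directly from your computer (fine for short clips).
uploaded = files.upload()          # opens a file picker in the browser

# Option 2: mount Google Drive and point at a video stored there.
drive.mount('/content/drive')
video_path = '/content/drive/MyDrive/vineyard_row.mp4'  # example path
```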
Capturing Video
In this tutorial we will be using ‘nighttime-grape-flower-clusters/1’, a nighttime cluster counting model. You can explore our other cluster counting models, use any Roboflow Universe model, or even train your own.
Credit: The data used to train nighttime-grape-flower-clusters/1 was captured by Jonathan Jaramillo, who trained the original model as part of his PhD research; that work motivated the creation of this guide. Read the original paper >>
Note: ‘nighttime-grape-flower-clusters/1’ is a nighttime-only model. This means you’ll need to capture your video at night using a supplemental light source. We recommend placing light sources a few feet above and/or below your camera, as in the figure below:
You will want to experiment with different frame rates and video lengths to see how well they work with this process. Our test clip was filmed at 250 fps and is 17 seconds long. We are actively testing much lower frame rates (around 60 fps) and longer videos, and we will update this document as we arrive at better recommendations.
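If you’re unsure what your phone recorded, a quick way to check a clip’s frame rate and length before uploading is a few lines of OpenCV (the filename is a placeholder):

```python
# Quick sanity check of a captured clip's frame rate and duration.
import cv2

cap = cv2.VideoCapture("vineyard_row.mp4")
fps = cap.get(cv2.CAP_PROP_FPS)
frame_count = cap.get(cv2.CAP_PROP_FRAME_COUNT)
cap.release()

duration_s = frame_count / fps if fps else 0
print(f"{fps:.0f} fps, {frame_count:.0f} frames, {duration_s:.1f} seconds")
```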