day using MobileNet as its backbone, with no significant quality loss. Any real-time sampling using this technology must take these constraints into account.
The final observations on this set of tests provide initial evidence that a system using this technique is feasible for the counting and density prediction tasks. Both the model evaluation and the final counting show promising outcomes, supporting the further development of this technology. The same methods can be employed in future applications to perform counting tasks in dense and sparse scenes in other contexts.
6 CONCLUSIONS
In this work, we proposed and validated a CNN-based method to count ants and predict their spatial distribution. We built the complete set of tools required for this solution, including a system to annotate the dataset as a dot map. Our results provide promising evidence of the feasibility of the designed approach.
Our proposed method standardizes the image dimensions and evaluates each section individually using a convolutional neural network backbone. It then assembles the results into a density map and uses this map to estimate the number of ants in the image. We evaluated the proposed solution in terms of its ability to qualitatively predict the density of each section and to quantitatively predict the number of ants per image.
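A minimal sketch of this pipeline is given below. It is illustrative rather than the exact implementation: the tile size of 224, the edge-padding strategy, and the density_model callable (standing in for the CNN backbone and its regression head) are assumptions made for the example.

# Minimal sketch of the tiling and counting pipeline (illustrative only).
# density_model is assumed to map a TILE x TILE x 3 section to a
# TILE x TILE density map whose integral is the predicted count for it.
import numpy as np

TILE = 224  # assumed backbone input size

def count_ants(image, density_model):
    """Tile the image, predict a density map per section, stitch the maps
    together, and integrate the result to estimate the ant count."""
    h, w = image.shape[:2]
    # Pad so both dimensions become multiples of the tile size.
    ph, pw = (-h) % TILE, (-w) % TILE
    padded = np.pad(image, ((0, ph), (0, pw), (0, 0)), mode="edge")
    density = np.zeros(padded.shape[:2], dtype=np.float32)
    for y in range(0, padded.shape[0], TILE):
        for x in range(0, padded.shape[1], TILE):
            tile = padded[y:y + TILE, x:x + TILE]
            density[y:y + TILE, x:x + TILE] = density_model(tile)
    density = density[:h, :w]              # discard the padded border
    return density, float(density.sum())   # integral of the map = count

The sketch only conveys how the individual sections are recombined and integrated; the per-section predictions in our experiments come from the trained CNN model, whose training code is available in the repository listed under Data Availability.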
Our results indicate that the system can predict the distribution with promising quality. It predicted the density with good approximation, and the counting achieved a coefficient of determination close to the ideal value. Therefore, the experiments validate the feasibility of this approach, encouraging future developments.
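As an illustration of this metric, the coefficient of determination over the per-image counts can be computed as in the snippet below; the count values shown are placeholders and are not our experimental data.

# Illustrative computation of the coefficient of determination (R^2) over
# per-image counts; the values below are placeholders, not experimental data.
import numpy as np
from sklearn.metrics import r2_score

true_counts = np.array([12, 35, 7, 58, 21])   # annotated ants per image
pred_counts = np.array([11, 33, 9, 60, 19])   # sums of the predicted density maps

print(f"R^2 = {r2_score(true_counts, pred_counts):.3f}")  # 1.0 is the ideal value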
ACKNOWLEDGEMENTS
The authors would like to thank FAPEMIG, CAPES,
CNPq, and the Federal University of Ouro Preto
for supporting this work. This work was partially
funded by CAPES (Finance Code 001) and CNPq
(306572/2019-2).
DATA AVAILABILITY
The training code and the dataset are available at https://github.com/matcoelhos/Ant-CNN.
REFERENCES
Bjerge, K., Mann, H. M., and Høye, T. T. (2022). Real-time insect tracking and monitoring with computer vision and deep learning. Remote Sensing in Ecology and Conservation, 8(3):315–327.
Eliopoulos, P., Tatlas, N.-A., Rigakis, I., and Potamitis, I. (2018). A “smart” trap device for detection of crawling insects and other arthropods in urban environments. Electronics, 7(9):161.
Hakala, S. M., Seppä, P., and Helanterä, H. (2019). Evolution of dispersal in ants (Hymenoptera: Formicidae): A review on the dispersal strategies of sessile superorganisms. Myrmecological News, 29.
Helanterä, H., Strassmann, J. E., Carrillo, J., and Queller, D. C. (2009). Unicolonial ants: where do they come from, what are they and where are they going? Trends in Ecology & Evolution, 24(6):341–349.
Howard, A. G., Zhu, M., Chen, B., Kalenichenko, D., Wang, W., Weyand, T., Andreetto, M., and Adam, H. (2017). MobileNets: Efficient convolutional neural networks for mobile vision applications. arXiv preprint arXiv:1704.04861.
Khan, S. D. and Basalamah, S. (2021). Sparse to dense scale prediction for crowd counting in high density crowds. Arabian Journal for Science and Engineering, 46(4):3051–3065.
Majer, J. and Heterick, B. (2018). Planning for long-term invertebrate studies–problems, pitfalls and possibilities. Australian Zoologist, 39(4):617–626.
McGlynn, T. P. (2012). The ecology of nest movement in social insects. Annual Review of Entomology, 57:291–308.
Schneider, S., Taylor, G. W., Kremer, S. C., Burgess, P., McGroarty, J., Mitsui, K., Zhuang, A., deWaard, J. R., and Fryxell, J. M. (2022). Bulk arthropod abundance, biomass and diversity estimation using deep learning for computer vision. Methods in Ecology and Evolution, 13(2):346–357.
Sindagi, V. A. and Patel, V. M. (2018). A survey of recent advances in CNN-based single image crowd counting and density estimation. Pattern Recognition Letters, 107:3–16.
Tan, M. and Le, Q. (2021). EfficientNetV2: Smaller models and faster training. In International Conference on Machine Learning, pages 10096–10106. PMLR.
Tresson, P., Carval, D., Tixier, P., and Puech, W. (2021). Hierarchical classification of very small objects: Application to the detection of arthropod species. IEEE Access, 9:63925–63932.
Wan, J., Wang, Q., and Chan, A. B. (2020). Kernel-based density map generation for dense object counting. IEEE Transactions on Pattern Analysis and Machine Intelligence.