
REFERENCES
Achanta, R., Hemami, S., Estrada, F., and Süsstrunk, S. (2009). Frequency-tuned salient region detection. In Conf. on Computer Vision and Pattern Recognition, pages 1597–1604. IEEE.
Agarwal, R., Hariharan, S., Rao, M. N., and Agarwal, A. (2021). Weed identification using k-means clustering with color spaces features in multi-spectral images taken by UAV. In Int. Geoscience and Remote Sensing Symposium, pages 7047–7050. IEEE.
Balabantaray, A., Behera, S., Liew, C., Chamara, N., Singh, M., Jhala, A. J., and Pitla, S. (2024). Targeted weed management of Palmer amaranth using robotics and deep learning (YOLOv7). Frontiers in Robotics and AI, 11:1441371.
Chauhan, B. S., Matloob, A., Mahajan, G., Aslam, F., Florentine, S. K., and Jha, P. (2017). Emerging challenges and opportunities for education and research in weed science. Frontiers in Plant Science, 8:1537.
Chen, G., Liu, S.-J., Sun, Y.-J., Ji, G.-P., Wu, Y.-F., and Zhou, T. (2022a). Camouflaged object detection via context-aware cross-level fusion. Transactions on Circuits and Systems for Video Technology, 32(10):6981–6993.
Chen, T., Xiao, J., Hu, X., Zhang, G., and Wang, S. (2022b). Boundary-guided network for camouflaged object detection. Knowledge-Based Systems, 248:108901.
Coleman, G. R., Bender, A., Walsh, M. J., and Neve, P.
(2023). Image-based weed recognition and control:
Can it select for crop mimicry? Weed Research,
63(2):77–82.
Fan, D.-P., Cheng, M.-M., Liu, Y., Li, T., and Borji, A.
(2017). Structure-measure: A new way to evaluate
foreground maps. In Int. Conf. on Computer Vision,
pages 4548–4557.
Fan, D.-P., Gong, C., Cao, Y., Ren, B., Cheng, M.-M., and Borji, A. (2018). Enhanced-alignment measure for binary foreground map evaluation. arXiv preprint.
Fan, D.-P., Ji, G.-P., Cheng, M.-M., and Shao, L. (2021). Concealed object detection. Transactions on Pattern Analysis and Machine Intelligence, 44(10):6024–6042.
Future (2024a). Machine segmentation dataset. https://universe.roboflow.com/future-cn6m4/machine-segmentation. Visited on 2024-12-27.
Future (2024b). Weed segmentation dataset. https://universe.roboflow.com/future-cn6m4/weed-segmentation-mjudm. Visited on 2024-12-27.
Gao, S.-H., Cheng, M.-M., Zhao, K., Zhang, X.-Y., Yang, M.-H., and Torr, P. (2019). Res2Net: A new multi-scale backbone architecture. Transactions on Pattern Analysis and Machine Intelligence, 43(2):652–662.
Hu, X., Wang, S., Qin, X., Dai, H., Ren, W., Luo, D.,
Tai, Y., and Shao, L. (2023). High-resolution iterative
feedback network for camouflaged object detection.
In AAAI Conf. on Artificial Intelligence, volume 37,
pages 881–889.
Ji, G.-P., Fan, D.-P., Chou, Y.-C., Dai, D., Liniger, A., and Van Gool, L. (2023). Deep gradient learning for efficient camouflaged object detection. Machine Intelligence Research, 20(1):92–108.
López-Granados, F., Jurado-Expósito, M., Peña-Barragán, J. M., and García-Torres, L. (2006). Using remote sensing for identification of late-season grass weed patches in wheat. Weed Science, 54(2):346–353.
Margolin, R., Zelnik-Manor, L., and Tal, A. (2014). How to evaluate foreground maps? In Conf. on Computer Vision and Pattern Recognition, pages 248–255.
Moldvai, L., Ambrus, B., Teschner, G., and Nyéki, A. (2024). Weed detection in agricultural fields using machine vision. In BIO Web of Conferences, volume 125. EDP Sciences.
Parra, L., Marin, J., Yousfi, S., Rincón, G., Mauri, P. V., and Lloret, J. (2020). Edge detection for weed recognition in lawns. Computers and Electronics in Agriculture, 176:105684.
Perazzi, F., Krähenbühl, P., Pritch, Y., and Hornung, A. (2012). Saliency filters: Contrast based filtering for salient region detection. In Conf. on Computer Vision and Pattern Recognition, pages 733–740. IEEE.
Rehman, M. U., Eesaar, H., Abbas, Z., Seneviratne, L.,
Hussain, I., and Chong, K. T. (2024). Advanced
drone-based weed detection using feature-enriched
deep learning approach. Knowledge-Based Systems,
305:112655.
Singh, V., Singh, D., and Kumar, H. (2024). Efficient application of deep neural networks for identifying small and multiple weed patches using drone images. IEEE Access.
Tan, M. and Le, Q. (2019). EfficientNet: Rethinking model scaling for convolutional neural networks. In Int. Conf. on Machine Learning, pages 6105–6114. PMLR.
Tang, J., Wang, D., Zhang, Z., He, L., Xin, J., and Xu, Y. (2017). Weed identification based on k-means feature learning combined with convolutional neural network. Computers and Electronics in Agriculture, 135:63–70.
Wang, W., Xie, E., Li, X., Fan, D.-P., Song, K., Liang, D., Lu, T., Luo, P., and Shao, L. (2022). PVT v2: Improved baselines with pyramid vision transformer. Computational Visual Media, 8(3):415–424.
Westwood, J. H., Charudattan, R., Duke, S. O., Fennimore, S. A., Marrone, P., Slaughter, D. C., Swanton, C., and Zollinger, R. (2018). Weed management in 2050: Perspectives on the future of weed science. Weed Science, 66(3):275–285.
Wu, Z., Chen, Y., Zhao, B., Kang, X., and Ding, Y. (2021).
Review of weed detection methods based on computer
vision. Sensors, 21(11):3647.
Yang, J., Wang, Q., Zheng, F., Chen, P., Leonardis, A., and Fan, D.-P. (2024). PlantCamo: Plant camouflage detection. arXiv preprint.