
Both MC dropout and conformal prediction proved to be effective and easily implementable methods for uncertainty estimation. The MC dropout
method requires the inclusion of dropout layers in the
network, whereas conformal prediction does not im-
pose such a requirement. In this regard, conformal
prediction offers a more flexible and straightforward
implementation for any model.
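To make the distinction concrete, the following is a minimal sketch of MC dropout in Keras; the architecture, dropout rate, input shape, and number of samples are illustrative, not those of our model. Dropout is kept active at inference, and the spread of the stochastic forward passes is taken as the uncertainty estimate.

import numpy as np
import keras

# Illustrative regression model with a dropout layer (not the exact
# architecture used in this work).
model = keras.Sequential([
    keras.layers.Input(shape=(4,)),
    keras.layers.Dense(64, activation="relu"),
    keras.layers.Dropout(0.2),
    keras.layers.Dense(1),
])

def mc_dropout_predict(model, x, n_samples=50):
    # training=True keeps dropout active at inference time, so each
    # forward pass is a stochastic sample from the approximate posterior.
    preds = np.stack([
        np.asarray(model(x, training=True)).ravel()
        for _ in range(n_samples)
    ])
    return preds.mean(axis=0), preds.std(axis=0)  # predictive mean, uncertainty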
As observed in Figure 7, the uncertainty estimated
by MC dropout remained relatively consistent across
all test dataset predictions, indicating that the model
maintains a uniform level of uncertainty across its
predictions. In contrast, the uncertainty estimated by
the "Naive" CP method results in a fixed-width interval, as shown in Figure 8. There remains potential to explore variants of conformal prediction beyond the "Naive" approach, such as split conformal prediction, to enhance uncertainty estimation.
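A minimal sketch of such a split conformal variant is given below, assuming a fitted regression model and a held-out calibration set (the helper name is ours for illustration). Unlike the "Naive" approach, the residual quantile is computed on data the model was not fitted on, which restores the finite-sample marginal coverage guarantee.

import numpy as np

def split_conformal_interval(model, x_cal, y_cal, x_test, alpha=0.1):
    # Nonconformity scores: absolute residuals on a held-out
    # calibration set, not on the training data.
    scores = np.abs(y_cal - model.predict(x_cal).ravel())
    n = len(scores)
    # Finite-sample-corrected quantile level ceil((n + 1)(1 - alpha)) / n.
    level = min(1.0, np.ceil((n + 1) * (1 - alpha)) / n)
    q = np.quantile(scores, level)
    preds = model.predict(x_test).ravel()
    # Fixed-width intervals with >= (1 - alpha) marginal coverage.
    return preds - q, preds + q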
The calibration plots for both methods, given in Figures 9 and 10, indicate that the miscalibration area is relatively low, suggesting that the model exhibits some degree of calibration, although there remains room for improvement. The miscalibration area of the Monte Carlo (MC) dropout method (0.19) was slightly smaller than that of the conformal prediction method (0.21). The MC dropout method achieved an empirical coverage of 73% at both the nominal 90% and 95% confidence levels, whereas the conformal prediction method achieved empirical coverages of 47% and 80%, respectively. This further indicates that the MC dropout method yielded better-calibrated uncertainty estimates than the conformal prediction method in this task.
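For reference, the coverage figures above can be computed with a short sketch like the following, assuming interval bounds from either method; for MC dropout, a Gaussian predictive distribution with the sampled mean and standard deviation is assumed.

import numpy as np
from scipy.stats import norm

def gaussian_interval(mu, sigma, level=0.90):
    # Symmetric interval from the MC dropout mean and standard deviation,
    # assuming a Gaussian predictive distribution.
    z = norm.ppf(0.5 + level / 2.0)
    return mu - z * sigma, mu + z * sigma

def empirical_coverage(y_true, lower, upper):
    # Fraction of test targets falling inside their intervals, to be
    # compared against the nominal level (e.g., 0.90 or 0.95).
    return float(np.mean((y_true >= lower) & (y_true <= upper)))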
To further improve the calibration of the uncertainty
estimates produced by the MC dropout method, iso-
tonic regression (Jiang et al., 2011) was employed us-
ing the validation set. However, this resulted in no
improvement, which could be attributed to the limited
amount of data.
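The recalibration step can be sketched as follows, assuming Gaussian predictive distributions parameterized by the MC dropout mean and standard deviation on the validation set; the helper is illustrative, not our exact implementation.

import numpy as np
from scipy.stats import norm
from sklearn.isotonic import IsotonicRegression

def fit_recalibrator(y_val, mu_val, sigma_val):
    # Predicted CDF value of each validation target under its Gaussian
    # predictive distribution.
    p_pred = norm.cdf(y_val, loc=mu_val, scale=sigma_val)
    # Empirical frequency with which the predicted probabilities are attained.
    p_emp = np.array([(p_pred <= p).mean() for p in p_pred])
    # Monotone (isotonic) map from predicted to empirical probabilities;
    # applying it to new predicted quantiles recalibrates them.
    return IsotonicRegression(out_of_bounds="clip").fit(p_pred, p_emp)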
6 CONCLUSION
In conclusion, this study demonstrates that our pipeline, few-shot transfer learning combined with techniques such as MC dropout and conformal prediction, can be effectively applied to agricultural tasks such as lettuce growth monitoring and can quantify the associated uncertainties, even when trained on limited data. While the model performed promisingly
with the limited data, there are still opportunities to
improve accuracy, calibration, and uncertainty esti-
mation. Incorporating uncertainty quantification not
only improves confidence in predictions but also sup-
ports safer deployment in agricultural environments,
where decisions informed by reliable model outputs
are critical.
ACKNOWLEDGEMENTS
We would like to acknowledge the financial support
from the Research Council of Norway (project num-
ber 354125) and Photosynthetic AS. We also wish to
thank the Norwegian University of Life Sciences for
granting admission.
REFERENCES
Antoniou, A., Edwards, H., and Storkey, A. (2018). How to train your MAML. arXiv preprint arXiv:1810.09502.
Belissent, N., Peña, J. M., Mesías-Ruiz, G. A., Shawe-Taylor, J., and Pérez-Ortiz, M. (2024). Transfer and zero-shot learning for scalable weed detection and classification in UAV images. Knowledge-Based Systems, 292:111586.
Chang, L. and Lin, Y.-H. (2025). Few-shot remaining useful
life prediction based on bayesian meta-learning with
predictive uncertainty calibration. Engineering Appli-
cations of Artificial Intelligence, 142:109980.
Chollet, F. et al. (2015). Keras. https://keras.io.
Deng, J., Dong, W., Socher, R., Li, L.-J., Li, K., and Fei-
Fei, L. (2009). Imagenet: A large-scale hierarchical
image database. In 2009 IEEE Conference on Com-
puter Vision and Pattern Recognition, pages 248–255.
Ding, P., Jia, M., Ding, Y., Cao, Y., Zhuang, J., and Zhao,
X. (2023). Machinery probabilistic few-shot prognos-
tics considering prediction uncertainty. IEEE/ASME
Transactions on Mechatronics, 29(1):106–118.
Dutta, A. and Zisserman, A. (2019). The VIA annotation
software for images, audio and video. In Proceedings
of the 27th ACM International Conference on Multi-
media, MM ’19, New York, NY, USA. ACM.
Finn, C., Abbeel, P., and Levine, S. (2017). Model-agnostic meta-learning for fast adaptation of deep networks. In International Conference on Machine Learning, pages 1126–1135. PMLR.
Gal, Y. and Ghahramani, Z. (2016). Dropout as a bayesian
approximation: Representing model uncertainty in
deep learning. In Balcan, M. F. and Weinberger,
K. Q., editors, Proceedings of The 33rd International
Conference on Machine Learning, volume 48 of Pro-
ceedings of Machine Learning Research, pages 1050–
1059, New York, New York, USA. PMLR.
He, J., Zhang, X., Lei, S., Alhamadani, A., Chen, F., Xiao,
B., and Lu, C.-T. (2023). CLUR: uncertainty estima-
tion for few-shot text classification with contrastive
learning. In Proceedings of the 29th ACM SIGKDD
Conference on Knowledge Discovery and Data Min-
ing, pages 698–710.