Abstract:
Leaf and fruit infections are a primary cause of damage to crops, reducing both the quality and quantity of produce. Timely identification of infection is vital for improving plant productivity, yet it remains a highly challenging task. Deep learning (DL) combined with image processing allows farmers to distinguish between healthy and infected crops. This work identifies healthy and diseased citrus leaf images using a convolutional neural network (CNN) deployed on a Platform as a Service (PaaS) cloud. A dataset of citrus leaf images covering five classes was used: black spot, melanose, canker, greening, and healthy. In addition, four transfer learning (TL) pre-trained deep CNN (DCNN) models, namely ResNet152V2, InceptionResNetV2, DenseNet121, and DenseNet201, were used to classify the leaf type. The performance of the CNN and the four DCNNs was assessed using confusion-matrix metrics (accuracy, precision, recall, and F1-score) and the receiver operating characteristic-area under the curve (ROC-AUC). An augmentation technique was utilised to enlarge the dataset, which improved the model's performance: it achieved an accuracy of 98 percent, precision, recall, and F1-score of 99 percent, and an ROC-AUC score of 0.99. Moreover, the proposed CNN has only 15 layers, 427,317 parameters, and a size of 1.68 MB, whereas the DCNN models have many more layers and parameters and a larger size. The compact CNN was deployed to the PaaS cloud. The deployed model is accessible from a smartphone via a link; a citrus leaf image is uploaded to the cloud, and the classification result is returned instantly on the mobile screen.
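As a rough illustration of the compact architecture described above, the following is a minimal Keras sketch of a small CNN for five-class citrus leaf classification. The input resolution, filter widths, and training configuration here are assumptions for illustration only and do not reproduce the paper's exact 15-layer, 427,317-parameter model.

```python
# Minimal sketch (not the authors' exact architecture): a compact CNN for
# 5-class citrus leaf classification. Input size and layer widths are
# illustrative assumptions, not the values reported in the paper.
import tensorflow as tf
from tensorflow.keras import layers, models

NUM_CLASSES = 5  # black spot, melanose, canker, greening, healthy


def build_compact_cnn(input_shape=(128, 128, 3)):
    model = models.Sequential([
        layers.Input(shape=input_shape),
        layers.Conv2D(16, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(32, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(64, activation="relu"),
        layers.Dropout(0.3),
        layers.Dense(NUM_CLASSES, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="categorical_crossentropy",
                  metrics=["accuracy"])
    return model


if __name__ == "__main__":
    model = build_compact_cnn()
    model.summary()  # prints layer count and total parameter count
```

A TL baseline along the lines of those compared in the paper could be sketched analogously by swapping the convolutional stack for a frozen pre-trained backbone (e.g. tf.keras.applications.DenseNet121) topped with a small classification head; the exact fine-tuning setup used by the authors is not specified here.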