
Artificial Intelligence and Automation in Sustainable Smart Farming

A special issue of Remote Sensing (ISSN 2072-4292).

Deadline for manuscript submissions: closed (28 February 2022) | Viewed by 20657

Special Issue Editors

School of Engineering and Technology, Melbourne Campus, Central Queensland University, Rockhampton, Australia
Interests: artificial intelligence (AI) for autonomous decision making in IoT-based applications for smart farming, smart cities, etc.; precision livestock; remote sensing; application of IoT and UAVs for smart farming; UAV image processing; deep learning algorithms; AI for facial recognition; AI for fraud detection; AI for drowsiness detection for safe driving
School of Information Technology and Engineering, Melbourne Institute of Technology, 288 Latrobe Street, Melbourne, VIC 3000, Australia
Interests: remote sensing; sensors; smart environments; deep learning; IoT; radar (lidar); wireless power transfer

Special Issue Information

Dear Colleagues,

In "The 2030 Agenda for Sustainable Development", the United Nations (UN) and the international community set a target to eliminate hunger worldwide by 2030. Additionally, according to a 2018 report by the World Resources Institute (WRI), the world population is anticipated to reach 10 billion by 2050. Hence, to meet the resulting increase in food demand, artificial intelligence (AI)-based sustainable smart farming and precision livestock farming are an inevitable approach. The aim of AI-based smart farming and precision livestock farming is to improve productivity, increase yields and profitability, and reduce the environmental footprint by utilizing techniques such as efficient irrigation; targeted, precise use of pesticides and fertilizers for crops; and vaccination scheduling and tracking of livestock. The implementation of AI and automation techniques can bring promising developments and innovations to the agricultural sector by utilizing data science, computer vision, and deep learning-based algorithms.

The main purpose of this Special Issue is to identify and report innovative and novel research outcomes on applications of AI, machine learning, deep learning, remote sensing, and autonomous systems in smart farming and precision livestock farming. Contributions may include, but are not limited to, the use of autonomous tractors, sprinklers, and other instruments; infestation detection and removal using UAV images; crop health monitoring and yield prediction; smart and autonomous irrigation; soil mapping and fertilizer advisories; vegetation stress identification; livestock monitoring, tracking, and control; vaccination scheduling of livestock; and the use of big data and high-performance computing for agriculture and livestock.

Dr. Nahina Islam
Dr. Santoso Wibowo
Prof. Dr. Johnson Ihyeh Agbinya
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Remote Sensing is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2700 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • Artificial intelligence in smart farming
  • Automation in smart farming
  • Remote sensing
  • Machine learning
  • Deep learning
  • Sustainable smart farming
  • Sensors
  • Image processing

Published Papers (4 papers)


Research

Jump to: Review

18 pages, 5868 KiB  
Article
Common Latent Space Exploration for Calibration Transfer across Hyperspectral Imaging-Based Phenotyping Systems
by Tanzeel U. Rehman, Libo Zhang, Dongdong Ma and Jian Jin
Remote Sens. 2022, 14(2), 319; https://doi.org/10.3390/rs14020319 - 11 Jan 2022
Cited by 2 | Viewed by 1696
Abstract
Hyperspectral imaging has increasingly been used in high-throughput plant phenotyping systems. Rapid advancement in the field of phenotyping has resulted in a wide array of hyperspectral imaging systems. However, sharing plant feature prediction models between different phenotyping facilities is challenging due to differences in imaging environments and imaging sensors. Calibration transfer between imaging facilities is therefore crucial to cope with such changes. Spectral space adjustment methods, including direct standardization (DS), its variants (PDS, DPDS), and spectral scale transformation (SST), require standard samples to be imaged in the different facilities. However, in real-world scenarios, imaging the standard samples is practically unattractive. Therefore, in this study, we present three methods (TCA, c-PCA, and di-PLSR) that transfer calibration models without requiring standard samples. To compare the performance of the proposed approaches, maize plants were imaged in two greenhouse-based HTPP systems using two pushbroom-style hyperspectral cameras covering the visible near-infrared range. We tested the proposed methods by transferring nitrogen content (N) and relative water content (RWC) calibration models. The results showed that prediction R2 increased by up to 14.50% and 42.20%, while the reduction in RMSEv was up to 74.49% and 76.72%, for RWC and N, respectively. The di-PLSR achieved the best results for almost all the datasets included in this study, with TCA second; the performance of c-PCA was not on par with di-PLSR and TCA. Our results showed that di-PLSR helped to recover the performance of RWC and N models that plummeted due to differences originating from new imaging systems (sensor type, spectrograph, lens system, spatial resolution, spectral resolution, field of view, bit-depth, frame rate, and exposure time) or lighting conditions. The proposed approaches can alleviate the requirement of developing a new calibration model for a new phenotyping facility or of resorting to spectral space adjustment using standard samples. Full article
(This article belongs to the Special Issue Artificial Intelligence and Automation in Sustainable Smart Farming)
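As context for the standard-sample requirement this paper removes, the direct standardization (DS) baseline named in the abstract can be sketched in a few lines. Everything below is synthetic and illustrative (20 standards, 50 bands, a random sensor distortion); it is the textbook DS idea, not the paper's di-PLSR/TCA/c-PCA methods:

```python
import numpy as np

# Toy data: the SAME 20 standard samples imaged on both systems, 50 bands each.
rng = np.random.default_rng(0)
master = rng.random((20, 50))                        # spectra from the master system
distortion = np.eye(50) + rng.normal(0, 0.05, (50, 50))
slave = master @ distortion                          # same samples on the slave system

# DS solves slave @ F ≈ master in the least-squares sense, giving a
# band-to-band transfer matrix that maps slave spectra into master space.
F = np.linalg.pinv(slave) @ master
corrected = slave @ F
```

The proposed transfer methods exist precisely to avoid needing `master` and `slave` measurements of shared standard samples, which is impractical across facilities.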

12 pages, 6910 KiB  
Article
Human vs. Machine, the Eyes Have It. Assessment of Stemphylium Leaf Blight on Onion Using Aerial Photographs from an NIR Camera
by Mary Ruth McDonald, Cyril Selasi Tayviah and Bruce D. Gossen
Remote Sens. 2022, 14(2), 293; https://doi.org/10.3390/rs14020293 - 09 Jan 2022
Cited by 2 | Viewed by 1502
Abstract
Aerial surveillance could be a useful tool for early detection and quantification of plant diseases; however, there are often confounding effects from other types of plant stress. Stemphylium leaf blight (SLB), caused by the fungus Stemphylium vesicarium, is a damaging foliar disease of onion. Studies were conducted to determine whether near-infrared photographic images could be used to accurately assess SLB severity in onion research trials in the Holland Marsh in Ontario, Canada. The site was selected for its uniform soil and level topography. Aerial photographs were taken in 2015 and 2016 using an Xnite-Canon SX230NDVI camera with a near-infrared filter, mounted on a modified Cine Star-8 MK Heavy Lift RTF octocopter UAV. Images were taken at 15-20 m above the ground, providing an average resolution of 0.5 cm/pixel and a field of view of 15 × 20 m. Photography and ground assessments of disease were carried out on the same day. NDVI (normalized difference vegetation index), green NDVI, chlorophyll index, and plant senescence reflectance index (PSRI) were calculated from the images. There were differences in SLB incidence and severity in the field plots and differences in the vegetative indices among the treatments, but there were no correlations between the disease assessments and any of the indices. Full article
(This article belongs to the Special Issue Artificial Intelligence and Automation in Sustainable Smart Farming)
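The indices this study computed from the aerial images are simple per-pixel band arithmetic. A minimal sketch, with made-up reflectance values; the band assignments (e.g., using the green band where PSRI's standard definition uses ~500 nm) are common conventions assumed here, not taken from the paper:

```python
import numpy as np

# Toy co-registered band images (reflectance in 0-1).
nir = np.array([[0.60, 0.70], [0.50, 0.80]])
red = np.array([[0.10, 0.20], [0.30, 0.10]])
green = np.array([[0.20, 0.30], [0.20, 0.40]])

ndvi = (nir - red) / (nir + red)        # normalized difference vegetation index
gndvi = (nir - green) / (nir + green)   # green NDVI
psri = (red - green) / nir              # plant senescence reflectance index
```

Each index is a per-pixel map; in the study these maps were then compared against ground disease ratings per plot.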

15 pages, 2643 KiB  
Article
Using UAV and Multispectral Images to Estimate Peanut Maturity Variability on Irrigated and Rainfed Fields Applying Linear Models and Artificial Neural Networks
by Adão F. Santos, Lorena N. Lacerda, Chiara Rossi, Leticia de A. Moreno, Mailson F. Oliveira, Cristiane Pilon, Rouverson P. Silva and George Vellidis
Remote Sens. 2022, 14(1), 93; https://doi.org/10.3390/rs14010093 - 25 Dec 2021
Cited by 11 | Viewed by 3671
Abstract
Using UAV and multispectral images has contributed to identifying field variability and improving crop management through different data modeling methods. However, knowledge on the application of these tools to manage peanut maturity variability is still lacking. Therefore, the objective of this study was to compare and validate linear and multiple linear regression models against artificial neural network (ANN) models for estimating peanut maturity under irrigated and rainfed conditions. The models were trained (80% of the dataset) and tested (20% of the dataset) using results from the 2018 and 2019 growing seasons from irrigated and rainfed fields. In each field, plant reflectance was collected weekly from 90 days after planting using a UAV-mounted multispectral camera. Images were used to develop vegetation indices (VIs). Peanut pods were collected on the same dates as the UAV flights for maturity assessment using the peanut maturity index (PMI). The precision and accuracy of the linear models for estimating PMI from VIs were, in general, greater in irrigated fields, with R2 > 0.40, than in rainfed areas, which had a maximum R2 value of 0.21. Multiple linear regressions combining adjusted growing degree days (aGDD) and VIs reduced RMSE under both irrigated and rainfed conditions and increased R2 in irrigated areas; however, these models did not perform successfully in the test process. On the other hand, ANN models that included VIs and aGDD achieved an accuracy of R2 = 0.91 in irrigated areas, regardless of whether a Multilayer Perceptron (MLP; RMSE = 0.062) or a Radial Basis Function (RBF; RMSE = 0.065) network was used, as well as low bias (close to the 1:1 line). These results indicate that, regardless of the ANN architecture used to predict this complex and non-linear variable, peanut maturity can be estimated accurately by models with multiple inputs using VIs and aGDD. Although the accuracy of the MLP and RBF models for irrigated and rainfed areas separately was high, the overall ANN models using both irrigated and rainfed areas can be used to predict peanut maturity with the same precision. Full article
(This article belongs to the Special Issue Artificial Intelligence and Automation in Sustainable Smart Farming)
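The gain the abstract reports from adding aGDD to VI-based regression can be illustrated with a toy ordinary-least-squares comparison. The data-generating model below is invented for illustration (PMI driven mostly by thermal time plus a smaller VI effect); it is not the study's data or its exact modeling pipeline:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
vi = rng.uniform(0.2, 0.9, n)        # hypothetical vegetation index per plot
agdd = rng.uniform(1500, 2600, n)    # hypothetical adjusted growing degree days
# Assumed ground truth: maturity index depends on both predictors + noise.
pmi = 0.0003 * agdd + 0.2 * vi + rng.normal(0, 0.02, n)

def fit_r2(X, y):
    """Ordinary least squares with intercept; returns the training R^2."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    return 1.0 - resid.var() / y.var()

r2_vi = fit_r2(vi[:, None], pmi)                     # VI alone
r2_both = fit_r2(np.column_stack([vi, agdd]), pmi)   # VI + aGDD
```

Under this toy setup, the two-predictor model explains far more variance than the VI-only model, mirroring the direction (though not the magnitude) of the study's finding.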

Review

Jump to: Research

21 pages, 1622 KiB  
Review
A Systematic Literature Review on Crop Yield Prediction with Deep Learning and Remote Sensing
by Priyanga Muruganantham, Santoso Wibowo, Srimannarayana Grandhi, Nahidul Hoque Samrat and Nahina Islam
Remote Sens. 2022, 14(9), 1990; https://doi.org/10.3390/rs14091990 - 21 Apr 2022
Cited by 74 | Viewed by 11662
Abstract
Deep learning has emerged as a potential tool for crop yield prediction, allowing models to automatically extract features and learn from datasets. Meanwhile, smart farming technology enables farmers to achieve maximum crop yield by extracting essential parameters of crop growth. This systematic literature review highlights existing research gaps in deep learning methodologies for crop yield prediction and analyzes the impact of vegetation indices and environmental factors on crop yield. To achieve the aims of this study, prior studies from 2012 to 2022 were collected from various databases and analyzed. The study focuses on the advantages of using deep learning in crop yield prediction, the suitable remote sensing technology based on data acquisition requirements, and the various features that influence crop yield prediction. This study finds that Long Short-Term Memory (LSTM) networks and Convolutional Neural Networks (CNN) are the most widely used deep learning approaches for crop yield prediction. The most commonly used remote sensing technology is satellite remote sensing, in particular the Moderate Resolution Imaging Spectroradiometer (MODIS). Findings show that vegetation indices are the most used feature for crop yield prediction; however, the most used features in the literature do not always work for all approaches. The main challenges of using deep learning approaches and remote sensing for crop yield prediction are improving the models for better accuracy, the practical implications of providing accurate crop yield information to agriculturalists, growers, and policymakers, and the black-box property of deep learning models. Full article
(This article belongs to the Special Issue Artificial Intelligence and Automation in Sustainable Smart Farming)
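The CNN approach the review identifies boils down to convolving learned filters over remote sensing inputs, for example over a per-field vegetation-index time series. A minimal sketch with a hand-picked kernel (a real CNN would learn many kernels from data; the "green-up rate" filter and the toy NDVI series are assumptions here):

```python
import numpy as np

# Toy weekly NDVI time series for one field over a 16-week season.
ndvi_series = np.linspace(0.2, 0.8, 16)

# One 1-D filter that responds to the NDVI change over a 2-week window;
# sliding it across the season yields a temporal feature map that a yield
# model could consume downstream.
kernel = np.array([-1.0, 0.0, 1.0])
features = np.correlate(ndvi_series, kernel, mode="valid")
```

On this linear toy ramp (step 0.04/week), every window sees the same 2-week rise of 0.08, so the feature map is constant; real series would produce a varying response.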
