
Agricultural Crop Classification with Synthetic Aperture Radar and Optical Remote Sensing

 October 5, 2021 - October 19, 2021

For years, mapping of crop types and assessment of their characteristics have been carried out to monitor food security, inform optimal use of the landscape, and contribute to agricultural policy. High-quality crop mapping has become a requirement for most nations given its importance to national and international economics, trade, and food security, and it is a major topic of interest in policy, economics, and land management. Many countries and economic regions increasingly rely on freely available satellite imagery for crop type classification and biophysical variable assessment, as it provides a synoptic view and multi-temporal coverage and is cost-effective. Remote sensing methods based on optical and/or microwave sensors have become an important means of extracting crop information because they capture vegetation structure and biochemical properties.

This five-part, intermediate webinar series focused on the use of synthetic aperture radar (SAR) from Sentinel-1 and/or optical imagery from Sentinel-2 to map crop types and assess their biophysical characteristics. The webinar covered a SAR and optical refresher along with pre-processing and analysis of Sentinel-1 and Sentinel-2 data using the Sentinel Application Platform (SNAP) and Python code written in JupyterLab, a web-based interactive development environment for scientific computing and machine learning. The webinar also covered an operational roadmap for mapping crop type, including best practices for collecting field data to train and validate models for classifying crops at a national level. The final session of the series covered crop biophysical variable retrievals using optical data.
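As an illustration only, and not the course material itself, the sketch below shows how a supervised crop-type classification of the kind discussed in the series might look in Python with scikit-learn's Random Forest. The feature and label files are hypothetical placeholders standing in for a pre-processed Sentinel-1/Sentinel-2 feature stack and field-collected training labels.

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import classification_report

    # Hypothetical inputs: per-pixel features stacked from pre-processed
    # Sentinel-1 backscatter (e.g., VV/VH) and Sentinel-2 reflectance bands,
    # with integer crop-type labels digitized from field data.
    features = np.load("s1_s2_features.npy")   # placeholder file, shape (n_pixels, n_bands)
    labels = np.load("crop_labels.npy")        # placeholder file, shape (n_pixels,)

    # Hold out part of the labelled samples to validate the classifier
    X_train, X_test, y_train, y_test = train_test_split(
        features, labels, test_size=0.3, stratify=labels, random_state=42
    )

    # Random Forest is one of the supervised classifiers covered in the series
    clf = RandomForestClassifier(n_estimators=200, random_state=42)
    clf.fit(X_train, y_train)

    # Report per-class precision and recall on the held-out samples
    print(classification_report(y_test, clf.predict(X_test)))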

This webinar series was part of the UNOOSA-ESA-ISRO-NASA Earth Observation Trainings for Agriculture, a partnership between ESA, the United Nations Office for Outer Space Affairs (UNOOSA), the Indian Space Research Organisation (ISRO), and the National Aeronautics and Space Administration (NASA). Organised in the context of the UN/Austria Symposium 2021, this partnership aimed to raise awareness and build capacity regarding the use of EO technologies for agriculture. For more information about the other trainings of this partnership, visit the UNOOSA website.

 

OBJECTIVE:

By the end of this training, attendees were able to identify:

  • The information content in passive (optical) and active (microwave) remote sensing data and how it relates to agricultural parameters
  • The characteristics of passive and/or active sensors used in operational crop mapping and biophysical retrievals
  • Where to acquire satellite data for conducting agricultural analysis
  • The steps for pre-processing optical and radar imagery
  • The steps to classify crop types using supervised and unsupervised techniques, and the differences among classifiers (e.g., Decision Tree and Random Forest)
  • Best practices for collecting field-based training data
  • Retrieval of crop biophysical variables

 

AUDIENCE:

This webinar series was intended for users of Earth Observation data from organisations related to agriculture and food security who were interested in deepening their knowledge of crop type mapping.

 

COURSE FORMAT:

  • Five 2.5-hour sessions
  • Training sessions were held on Tuesdays and Thursdays: October 5, 7, 12, 14, and 19
  • The afternoon session was presented in English from 16:00 to 18:30 CEST (10:00 AM – 12:30 PM EDT), and the late afternoon session was presented in Spanish from 19:00 to 21:30 CEST (1:00 – 3:30 PM EDT)
  • Those who attended all five sessions and completed the homework were awarded a certificate of attendance.

Access below the full programme and the training materials of past webinars from the series (video recordings, theory presentations, and instructions and datasets to repeat the exercises). The materials are available in both English and Spanish.
