L-band and S-band frequencies assisted NISAR: A state-of-the-art technology for sustainable agrarian monitoring
DOI: https://doi.org/10.69968/ijisem.2024v3si2309-314
Keywords: NISAR, L-band, S-band, Agrarian monitoring, Synthetic Aperture Radar (SAR), dual-frequency, Artificial Intelligence
Abstract
Agrarian monitoring plays a vital role in promoting sustainable development by providing detailed insights into the distribution, health, and types of agricultural practices within a region. This information is crucial for managing natural resources, planning land use, and preserving biodiversity. Current monitoring techniques, such as multi-spectral imaging, LiDAR, the Normalized Difference Vegetation Index (NDVI), thermal infrared imaging, unmanned aerial vehicles (UAVs), photogrammetry, vegetation indices, and machine learning, make routine monitoring considerably easier. However, they are limited in their ability to deliver all-weather, day-and-night imaging, wide-area coverage, large data volumes, and high-resolution imagery. NISAR (NASA-ISRO Synthetic Aperture Radar) has emerged as a cutting-edge technology for sustainable agrarian monitoring. It provides precise datasets for applications such as land-subsidence monitoring, cryosphere studies, deforestation tracking, flood prediction, forest canopy analysis, biomass estimation, and crop growth and health assessment. Because its L-band and S-band radars operate in all weather conditions and at any time of day, NISAR is especially valuable for monitoring cloudy or densely vegetated regions. Integrating AI techniques with NISAR's advanced capabilities enhances the analysis of the vast datasets it generates, enabling more accurate predictions and better decision-making in agricultural management. These features collectively position NISAR as a critical tool for advancing our understanding of Earth's dynamic systems and supporting the sustainable management of natural resources.
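To illustrate the kind of AI-assisted analysis the abstract describes (and for context, the optical NDVI it mentions is computed as NDVI = (NIR − Red) / (NIR + Red)), the sketch below trains a simple classifier on dual-frequency backscatter features. It is a minimal, hypothetical example: the feature layout (L-band and S-band HH/HV backscatter in dB) and the synthetic training data are assumptions chosen for illustration, not NISAR products or the authors' method.

```python
# Minimal sketch: machine-learning classification of crop condition from
# dual-frequency SAR backscatter. All data here are synthetic; in practice,
# calibrated NISAR L-band and S-band products would supply the features.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(42)
n = 1000  # simulated pixel/field samples

# Hypothetical features: L-band HH, L-band HV, S-band HH, S-band HV (in dB).
# Healthy vegetation is simulated with stronger returns, loosely mimicking
# the sensitivity of radar backscatter to canopy structure and moisture.
healthy = rng.normal(loc=[-8, -14, -7, -13], scale=2.0, size=(n // 2, 4))
stressed = rng.normal(loc=[-11, -19, -10, -18], scale=2.0, size=(n // 2, 4))

X = np.vstack([healthy, stressed])
y = np.array([1] * (n // 2) + [0] * (n // 2))  # 1 = healthy, 0 = stressed

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y
)

# A random forest is a common baseline for SAR-based crop classification.
model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(X_train, y_train)

print(f"Held-out accuracy: {accuracy_score(y_test, model.predict(X_test)):.2f}")
```

In a real workflow, the synthetic arrays would be replaced by terrain-corrected, radiometrically calibrated backscatter extracted per field or pixel, and the binary labels by ground-truth crop condition surveys.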
License
Copyright (c) 2024 Shashank Mohan, Brajesh Kumar

This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.