class: center, middle, inverse, title-slide

.title[
# Sentinel-2
]
.subtitle[
## The Remote Sensing Sensor
]
.author[
### Rahmadita Listianingrum
]
.date[
### 2023/01/27
]

---
class: center, middle

<center><img src="https://www.earthdata.nasa.gov/s3fs-public/2022-02/Air-Quality-Transparent-Blue.gif?VersionId=z3k6nWjVZXyNXN30iMu4GZnVDBVqHYZ0" alt="Sensor" height="300px" /></center>

.b--yellow.ba.bw2.br2.shadow-2.ph3.mt3[
## "Remote sensing is the acquiring of information from a distance."
.tr[
**— NASA**
]]

---
class: inverse, center, middle

# Summary

Earth Observation mission from the Copernicus Programme

---

## Sentinel-2

.panelset[
.panel[.panel-name[**Sentinel-2**]

<img src="img/sentinel-2.png" width="50%" style="display: block; margin: auto;" />

Source: [ESA](https://sentinels.copernicus.eu/web/sentinel/missions/sentinel-2) <a name=cite-esaSentinel2MissionsSentinel></a>([ESA, 2023](#bib-esaSentinel2MissionsSentinel))
]

.panel[.panel-name[**Overview**]
.pull-left[
Sentinel-2 is a high-resolution, multi-spectral imaging mission consisting of twin satellites in the same orbit, with a high revisit frequency of 5 days at the Equator. The satellites carry an optical instrument payload that samples 13 spectral bands at different spatial resolutions, providing an orbital swath width of 290 km.

The Sentinel-2 mission continues the legacy of SPOT and Landsat and supports services and applications offered by Copernicus, including land management, agriculture, forestry, disaster control, humanitarian relief operations, risk mapping, and security.
]
.pull-right[
<img src="img/sentinel-2-sat.png" width="100%" style="display: block; margin: auto;" />

Source: [Astrium GmbH, Germany](https://artes.esa.int/contractors/eads-astrium-gmbh)<a name=cite-eadsEADSAstriumGmbH></a>([EADS, 2023](#bib-eadsEADSAstriumGmbH))
]
]

.panel[.panel-name[**Key-Features**]
.pull-left[
Sentinel-2 is an [Earth Observation](https://joint-research-centre.ec.europa.eu/scientific-activities-z/earth-observation_en) mission from the [Copernicus Programme](https://www.copernicus.eu/en/about-copernicus).

The Sentinel-2 satellites aim to provide high-resolution, multispectral images with a high revisit frequency on a global scale. The mission's objectives, outlined in its Mission Requirements Document, include providing continuity for multi-spectral imagery from the SPOT and Landsat satellites, generating data for operational products such as land-cover maps, and contributing to Copernicus themes such as climate change and land monitoring.

With its 13 spectral bands, 290 km swath width, and high revisit frequency, Sentinel-2's MSI instrument is well suited to a range of land studies and programmes, including land cover/change classification, atmospheric correction, and cloud/snow masking.
]
.pull-right[
<img src="img/key-features.png" width="100%" style="display: block; margin: auto;" />

Source: [Astrium GmbH, Germany](https://artes.esa.int/contractors/eads-astrium-gmbh)([EADS, 2023](#bib-eadsEADSAstriumGmbH))
]
]

.panel[.panel-name[**Comparison**]

<img src="img/comparison.png" width="55%" style="display: block; margin: auto;" />

Comparison of the capabilities of Landsat, SPOT, and Sentinel-2.<br>
Source: [Astrium GmbH, Germany](https://artes.esa.int/contractors/eads-astrium-gmbh)([EADS, 2023](#bib-eadsEADSAstriumGmbH))
]

.panel[.panel-name[**Bands**]

<style>
div.remark-slide-content {
  padding: 2em; /* default is 1em 4em */
  font-size: .6em;
}
</style>
Source: [EOS Data Analytics](https://eos.com/find-satellite/sentinel-2/)<a name=cite-eosSentinel2SatelliteImagery2021></a>([EOS, 2023](#bib-eosSentinel2SatelliteImagery2021))
]
]

---
class: inverse, center, middle

# Application

---

## Marine/Coastal Monitoring

The study by Yustisi Lumban-Gaol, Ken Arroyo Ohori, and Ravi Peters, published in *Marine Geodesy* in 2022, extracted water depth information in coastal areas from multi-temporal Sentinel-2 satellite images using convolutional neural networks (CNNs). The study is motivated by the importance of water depth information for applications such as navigation safety, coastal zone management, and ecosystem conservation.

<a name=cite-lumban-gaolExtractingCoastalWater2022></a>([Lumban-Gaol, Ohori, and Peters, 2022](#bib-lumban-gaolExtractingCoastalWater2022)) used a CNN architecture consisting of two convolutional layers followed by two fully connected layers. The input to the CNN is a multi-temporal Sentinel-2 image, and the output is a water depth map. The authors also experimented with different combinations of Sentinel-2 bands, temporal differences, and spatial filters to optimise the model's performance.

The accuracy of the resulting depth estimates is promising for applications such as navigation safety and coastal zone management. The authors suggest that further research could apply the CNN models to other coastal areas with different water properties and bathymetric characteristics.
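The paper's actual hyperparameters are not given here, so purely as an illustration, the described two-convolution, two-fully-connected architecture can be sketched as a minimal NumPy forward pass. The band count, patch size, kernel sizes, and channel widths below are all assumptions, not the authors' settings:

```python
import numpy as np

rng = np.random.default_rng(0)

def conv2d(x, w):
    """'Valid' 2-D cross-correlation over a multi-band input, with ReLU.
    x: (bands, H, W); w: (out_ch, bands, k, k) -> (out_ch, H-k+1, W-k+1)."""
    out_ch, _, k, _ = w.shape
    H, W = x.shape[1] - k + 1, x.shape[2] - k + 1
    out = np.zeros((out_ch, H, W))
    for o in range(out_ch):
        for i in range(H):
            for j in range(W):
                out[o, i, j] = np.sum(x[:, i:i + k, j:j + k] * w[o])
    return np.maximum(out, 0)  # ReLU activation

def forward(patch, w1, w2, fc1, fc2):
    """Two convolutional layers followed by two fully connected layers,
    regressing a single depth estimate for one image patch."""
    h = conv2d(conv2d(patch, w1), w2)         # conv 1 -> conv 2
    h = np.maximum(h.ravel() @ fc1, 0)        # fully connected 1 + ReLU
    return float(h @ fc2)                     # fully connected 2 -> depth

# Hypothetical sizes: 4 Sentinel-2 bands, a 9x9 pixel patch, 3x3 kernels.
patch = rng.standard_normal((4, 9, 9))            # stand-in for real imagery
w1 = rng.standard_normal((8, 4, 3, 3)) * 0.1      # conv 1: 4 bands -> 8 channels
w2 = rng.standard_normal((16, 8, 3, 3)) * 0.1     # conv 2: 8 -> 16 channels
fc1 = rng.standard_normal((16 * 5 * 5, 32)) * 0.1 # 5x5 feature maps flattened
fc2 = rng.standard_normal(32) * 0.1
depth = forward(patch, w1, w2, fc1, fc2)
print(f"estimated depth for this patch: {depth:.2f} m")
```

In practice the weights would be learned from sonar or lidar reference depths rather than drawn at random; the sketch only shows how the two-conv, two-FC pipeline maps a multi-band patch to one depth value.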
.pull-left[
<img src="img/result1.png" width="55%" style="display: block; margin: auto;" />

Source: ([Lumban-Gaol, Ohori, and Peters, 2022](#bib-lumban-gaolExtractingCoastalWater2022))
]
.pull-right[
<img src="img/error1.png" width="75%" style="display: block; margin: auto;" />

Source: ([Lumban-Gaol, Ohori, and Peters, 2022](#bib-lumban-gaolExtractingCoastalWater2022))
]

---
class: inverse, center, middle

# Reflection

---

## Reflection

It is clear that Sentinel-2 is a powerful tool for applications such as land cover mapping, vegetation monitoring, and water quality assessment. Its multi-spectral imaging capability and high spatial resolution provide valuable data for research and analysis. The study on extracting coastal water depths from Sentinel-2 images is a strong example of the satellite's potential for oceanographic applications.

That study demonstrates how convolutional neural networks (CNNs) can extract coastal water depths from Sentinel-2 images with high accuracy. This approach offers a cost-effective and efficient way of collecting coastal bathymetry data, which is essential for marine applications such as coastal zone management, fisheries management, and environmental monitoring.

The capabilities of Sentinel-2, combined with advances in image processing techniques such as CNNs, have opened up new insights and opportunities in remote sensing. The increasing availability of high-quality satellite data has the potential to transform our understanding of the natural world and to inform policy and decision-making.

---

# References

<a name=bib-eadsEADSAstriumGmbH></a>[EADS, A. G.](#cite-eadsEADSAstriumGmbH) (2023). _EADS Astrium GmbH_. URL: [https://artes.esa.int/contractors/eads-astrium-gmbh](https://artes.esa.int/contractors/eads-astrium-gmbh) (visited on Mar. 24, 2023).
<a name=bib-eosSentinel2SatelliteImagery2021></a>[EOS, E.](#cite-eosSentinel2SatelliteImagery2021) (2023). _Sentinel-2: Satellite Imagery, Overview, And Characteristics_. URL: [https://eos.com/find-satellite/sentinel-2/](https://eos.com/find-satellite/sentinel-2/) (visited on Mar. 24, 2023).

<a name=bib-esaSentinel2MissionsSentinel></a>[ESA, E.](#cite-esaSentinel2MissionsSentinel) (2023). _Sentinel-2 - Missions - Sentinel Online_. URL: [https://copernicus.eu/missions/sentinel-2](https://copernicus.eu/missions/sentinel-2) (visited on Mar. 24, 2023).

<a name=bib-lumban-gaolExtractingCoastalWater2022></a>[Lumban-Gaol, Y., K. A. Ohori, and R. Peters](#cite-lumban-gaolExtractingCoastalWater2022) (2022). "Extracting Coastal Water Depths from Multi-Temporal Sentinel-2 Images Using Convolutional Neural Networks". In: _Marine Geodesy_ 45.6, pp. 615-644. ISSN: 0149-0419. DOI: [10.1080/01490419.2022.2091696](https://doi.org/10.1080%2F01490419.2022.2091696). URL: [https://doi.org/10.1080/01490419.2022.2091696](https://doi.org/10.1080/01490419.2022.2091696) (visited on Mar. 24, 2023).

---
class: inverse, center, middle

# Thanks!

Slides created via the R packages:

[**xaringan**](https://github.com/yihui/xaringan) by<br>[**Yihui Xie**](https://yihui.org)<br>
[**xaringanthemer**](https://github.com/gadenbuie/xaringanthemer) and [**xaringanExtra**](https://github.com/gadenbuie/xaringanExtra) by<br>[**Garrick Aden-Buie**](https://www.garrickadenbuie.com)

These slides are inspired by and adapted from<br>[**Dr Andrew Maclachlan's lecture**](https://andrewmaclachlan.github.io/CASA0023-lecture-2/#1)