How to produce prediction maps of natural hazards in GIS (with videos)
Landslides bury Malaysian orphanage, killing at least 16, May 2011
Introduction
Natural hazard assessment is generally performed by combining two main independent components: the spatial probability and the temporal probability of the occurrence of the triggering conditions, as in landslides (Guzzetti et al., 2005).
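A minimal sketch of this combination, assuming the two components are independent; the probability values below are invented for illustration, not measured:

```python
def hazard_probability(p_spatial, p_temporal):
    """Joint probability that a hazard occurs at a location within a
    given period, assuming the spatial and temporal components are
    independent (after the framework of Guzzetti et al., 2005)."""
    return p_spatial * p_temporal

# Example: a cell with 0.6 spatial susceptibility and a 0.2 chance of
# the triggering conditions occurring in the reference period.
print(hazard_probability(0.6, 0.2))  # joint probability of about 0.12
```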
Many studies have addressed the relationship between these two components across many areas. Literature reviews describe the challenges faced by scientists, planners, and land developers in deriving and applying these probabilities, and they highlight the uncertainties involved in data acquisition and preparation as well as in model selection and calibration.
Several studies have investigated the susceptibility of prone areas using statistical and data-mining approaches with promising prediction accuracy. However, the uncertainty errors that accumulate during data collection, analysis, and mapping have largely been ignored.
Guzzetti et al. (2006) observed that most published studies discuss only the susceptibility model itself and provide no information about the quality of the proposed modeling process. They also stressed that quantifying the errors associated with a hazard susceptibility assessment is vital for reducing prediction uncertainty. The following aspects in particular deserve attention:
1. Understanding the natural behavior, distribution, and frequency of slope failures, which widens the range of applicability of susceptibility models (Lin et al., 2010; Melchiorre et al., 2008; Van Westen et al., 2008). In particular, the distribution pattern of the hazard inventory is often overlooked.
2. Determining the proper mapping unit based on a sound conceptual formula.
3. Testing the significance of the relationship between hazard occurrence (the dependent factor) and the conditioning factors (the independent factors), e.g., the tolerance and collinearity indices between thematic maps (Van Den Eeckhaut et al., 2012).
4. Investigating the role of the thematic information through statistical tests: the difference in the −2 log-likelihood (−2LL) is an effective indicator of the improvement of a model over the null model, while Cox & Snell's and Nagelkerke's R-squared measure the usefulness of the model.
5. Determining the hazard process and its relationship with the temporal probability evaluation (Guzzetti et al., 2006).
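Point 4 above can be sketched in a few lines. Given the log-likelihoods of the null and fitted models (the values below are hypothetical, not from a real model), Cox & Snell's and Nagelkerke's R-squared follow directly from their standard formulas:

```python
import math

def pseudo_r2(ll_null, ll_model, n):
    """Cox & Snell and Nagelkerke pseudo R-squared from the
    log-likelihoods of the null and fitted models (n = sample size)."""
    cox_snell = 1.0 - math.exp(2.0 * (ll_null - ll_model) / n)
    max_cs = 1.0 - math.exp(2.0 * ll_null / n)  # upper bound of Cox & Snell
    nagelkerke = cox_snell / max_cs             # rescaled to reach 1.0
    return cox_snell, nagelkerke

# Hypothetical log-likelihoods for a 1000-sample fit; the −2LL
# improvement over the null model is 2 * (ll_model - ll_null).
cs, nk = pseudo_r2(ll_null=-693.1, ll_model=-520.4, n=1000)
print(round(cs, 3), round(nk, 3))
```

Nagelkerke's statistic is simply Cox & Snell's divided by its maximum attainable value, so it always lies between 0 and 1.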
If the above-mentioned errors are not discussed and treated scientifically within the susceptibility modeling, the following consequences arise:
1. The rationale for increasing or decreasing the number of thematic maps remains vague.
2. The possibility of successfully replicating those models elsewhere, or on another dataset, becomes questionable.
Hazard susceptibility analysis and its prediction methods
The three main methods of susceptibility analysis are the deterministic, qualitative (heuristic), and statistical approaches. (1) Deterministic approaches mainly consider slope geometry, the characteristics of slope materials, the relative homogeneity of the surface of the study area, and the surface and subsurface water level (Armaş et al., 2014; Dugonjić Jovančević et al., 2013; Gökceoglu and Aksoy, 1996; Wu and Sidle, 1995). (2) Qualitative or heuristic approaches mainly rely on the opinions and experience of experts in evaluating susceptible areas (Ives and Messerli, 1981; Regmi et al., 2010; Rupke et al., 1988). (3) Statistical approaches assume that future hazards will occur under environmental conditions similar to those of historical events. Probabilistic analysis thus considers the statistical relationships between historical hazard locations and the conditioning factors (Althuwaynee et al., 2014).
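One common bivariate statistical technique of this kind is the frequency ratio, which compares the share of hazard cells in each factor class with the share of total area in that class. A minimal sketch, using invented raster values for a single conditioning factor:

```python
import numpy as np

# Hypothetical raster data: a slope class (0-3) per cell and a binary
# landslide inventory (1 = cell with a historical landslide).
slope_class = np.array([0, 0, 1, 1, 1, 2, 2, 2, 2, 3, 3, 3])
landslide   = np.array([0, 0, 0, 1, 0, 1, 1, 0, 1, 1, 0, 1])

def frequency_ratio(factor, inventory):
    """Frequency ratio per class: (% of hazard cells in the class) /
    (% of all cells in the class). FR > 1 means a positive association
    between the class and hazard occurrence."""
    fr = {}
    for c in np.unique(factor):
        in_class = factor == c
        pct_hazard = inventory[in_class].sum() / inventory.sum()
        pct_area = in_class.sum() / factor.size
        fr[int(c)] = pct_hazard / pct_area
    return fr

print(frequency_ratio(slope_class, landslide))
```

Summing the per-class ratios of all conditioning factors for each cell gives a simple relative susceptibility index, which is essentially the workflow the videos implement in Excel.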
Since the late 1980s, the widely popular and efficient geographic information system (GIS) has facilitated the development of new machine learning, data-driven, and empirical methods that reduce generalization errors. Empirical ensemble techniques are based on valid integrations between statistically based algorithms combined through the outputs of multiple classifiers. Ensemble classification methodologies are popular empirical techniques that provide more accurate and reliable estimates than individual models do. Such methodologies are powerful tools for improving and increasing the reliability of the prediction results of hazard maps (Hu, 2001; Rokach, 2010).
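The simplest form of such an ensemble is soft voting: averaging the susceptibility scores of several base classifiers. The sketch below uses hypothetical probabilities for five mapping units, not outputs of real trained models:

```python
import numpy as np

# Hypothetical susceptibility scores from three base classifiers.
scores_model_a = np.array([0.9, 0.2, 0.7, 0.4, 0.1])
scores_model_b = np.array([0.8, 0.3, 0.6, 0.5, 0.2])
scores_model_c = np.array([0.7, 0.1, 0.8, 0.6, 0.1])

# Averaging (soft voting) tends to reduce the variance of any single
# model and often yields a more reliable susceptibility estimate.
ensemble = np.mean([scores_model_a, scores_model_b, scores_model_c], axis=0)
print(ensemble)
```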
In the following videos, I share basic techniques to help you produce your first prediction map using ArcMap and Excel only.
You will learn how to:
1. Calculate the spatial correlation between the prediction factors and the dependent factor.
2. Calculate the autocorrelation between the prediction factors, considering their prediction importance or contribution.
3. Produce a susceptibility map using Microsoft Excel and ESRI ArcMap only.
4. Validate the prediction accuracy using the most common statistical method, the area under the curve (AUC).
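Step 4 can be sketched as follows: rank the mapping units by predicted susceptibility, build an ROC-style success-rate curve against the observed inventory, and integrate it. The scores and labels below are invented for illustration:

```python
import numpy as np

# Hypothetical predicted susceptibility scores and observed inventory.
scores = np.array([0.9, 0.8, 0.7, 0.6, 0.5, 0.4, 0.3, 0.2])
labels = np.array([1, 1, 0, 1, 0, 0, 1, 0])  # 1 = observed hazard cell

order = np.argsort(-scores)  # most susceptible first
hits = labels[order]
tpr = np.concatenate(([0.0], np.cumsum(hits) / hits.sum()))
fpr = np.concatenate(([0.0], np.cumsum(1 - hits) / (1 - hits).sum()))

# Trapezoidal integration of the curve gives the AUC; 0.5 is random,
# 1.0 is a perfect ranking of hazard cells above non-hazard cells.
auc = np.sum(np.diff(fpr) * (tpr[1:] + tpr[:-1]) / 2)
print(round(float(auc), 3))
```

The same cumulative-percentage calculation can be reproduced in an Excel sheet, which is how the course builds the validation curve.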
To watch the entire course in video format with a 90% discount, please visit:
https://www.udemy.com/spatial-prediction-in-gis-using-arcgis-and-excel-only/?couponCode=90_OFF_ALTHUWAYNEE
If you have any questions about data-mining approaches or spatial analysis in GIS, please post a comment below.
Happy learning!