PlumX Metrics

A Comprehensive Experimental and Computational Investigation on Estimation of Scour Depth at Bridge Abutment: Emerging Ensemble Intelligent Systems

Water Resources Management, ISSN: 1573-1650, Vol: 37, Issue: 9, Page: 3745-3767
2023
  • Citations: 22
  • Usage: 0
  • Captures: 33
  • Mentions: 1
  • Social Media: 0

Metrics Details

  • Citations: 22
  • Captures: 33
  • Mentions: 1
    • News Mentions: 1

Most Recent News

Study Findings from University of Zanjan Provide New Insights into Technology (A Comprehensive Experimental and Computational Investigation On Estimation of Scour Depth At Bridge Abutment: Emerging Ensemble Intelligent Systems)

2023 JUN 22 (NewsRx) -- By a News Reporter-Staff News Editor at Middle East Daily -- Current study results on Technology have been published. According ...

Article Description

Several bridges have failed because of scouring and erosion around bridge elements; hence, precise prediction of abutment scour is necessary for the safe design of bridges. In this research, experimental and computational investigations were carried out based on 45 flume experiments conducted at NIT Warangal, India. Three innovative ensemble-based data intelligence paradigms, namely categorical boosting (CatBoost), extra trees regression (ETR), and K-nearest neighbors (KNN), are used to predict the scour depth around the bridge abutment. A total of 308 series of laboratory data (263 datasets from a wide range of existing abutment scour depth studies plus the 45 flume experiments) covering various sediment and hydraulic conditions were used to develop the models. Four dimensionless variables were used to estimate scour depth: the approach densimetric Froude number (Fd), the ratio of upstream depth to abutment transverse length (y/L), the ratio of abutment transverse length to sediment mean diameter (L/d50), and the ratio of mean velocity to critical velocity (V/Vc). A gradient boosting decision tree (GBDT) method was used to select the features with the highest importance. Based on the feature selection results, two combinations of input variables were used: comb1 (all variables as model input) and comb2 (all variables except Fd). The CatBoost model with the comb1 input (RMSE = 0.1784, R = 0.9685, MAPE = 10.4724) provided better accuracy than the other machine learning models.
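To make the workflow described above concrete, the sketch below shows one way such a comparison could be set up with standard Python libraries (catboost and scikit-learn): GBDT feature importances for input selection, followed by CatBoost, ETR, and KNN regressors evaluated with RMSE, R, and MAPE. The data file, column names (Fd, y_L, L_d50, V_Vc, scour), and split settings are hypothetical placeholders, not the authors' actual pipeline.

```python
# Hypothetical sketch of the ensemble comparison described in the abstract.
# Assumes the catboost and scikit-learn packages; the CSV file and column
# names are illustrative placeholders, not the authors' dataset.
import numpy as np
import pandas as pd
from catboost import CatBoostRegressor
from sklearn.ensemble import ExtraTreesRegressor, GradientBoostingRegressor
from sklearn.neighbors import KNeighborsRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error, mean_absolute_percentage_error

# Dimensionless inputs: Fd, y/L, L/d50, V/Vc; target: relative scour depth.
data = pd.read_csv("abutment_scour.csv")           # hypothetical file
features = ["Fd", "y_L", "L_d50", "V_Vc"]           # comb1: all four variables
X_train, X_test, y_train, y_test = train_test_split(
    data[features], data["scour"], test_size=0.3, random_state=42
)

# GBDT feature importances, used to rank the candidate inputs.
gbdt = GradientBoostingRegressor(random_state=42).fit(X_train, y_train)
print(dict(zip(features, gbdt.feature_importances_.round(3))))

# The three ensemble / instance-based models compared in the study.
models = {
    "CatBoost": CatBoostRegressor(verbose=0, random_state=42),
    "ETR": ExtraTreesRegressor(random_state=42),
    "KNN": KNeighborsRegressor(n_neighbors=5),
}

for name, model in models.items():
    model.fit(X_train, y_train)
    pred = model.predict(X_test)
    rmse = mean_squared_error(y_test, pred) ** 0.5   # root mean squared error
    r = np.corrcoef(y_test, pred)[0, 1]              # correlation coefficient
    mape = mean_absolute_percentage_error(y_test, pred) * 100
    print(f"{name}: RMSE={rmse:.4f}, R={r:.4f}, MAPE={mape:.2f}%")
```

A comb2 run would simply drop "Fd" from the feature list before splitting; the reported figures (RMSE = 0.1784, R = 0.9685, MAPE = 10.4724 for CatBoost with comb1) come from the paper itself and are not reproduced by this placeholder script.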
