PlumX Metrics

Optimizing bearing health condition monitoring: exploring correlation feature selection algorithm

Engineering Research Express, ISSN: 2631-8695, Vol: 6, Issue: 2
2024
  • Citations: 0
  • Usage: 0
  • Captures: 4
  • Mentions: 0
  • Social Media: 0

Metrics Details

Article Description

Vibration signals are a critical source of information for detecting and diagnosing bearing faults, making this research directly relevant to the condition monitoring of industrial machinery, particularly bearings monitored through vibration signals. This study examines how feature selection can be performed using Pearson's Correlation Coefficient in the context of bearing health condition monitoring, using two distinct approaches. Approach-1 selects features without considering labels, while Approach-2 incorporates labels into feature selection. A comparative analysis is conducted against the outcomes obtained when all features are selected. The research scrutinizes the impact of feature selection on classifier performance, accuracy, and execution times, using several machine learning algorithms: Decision Tree (DT), K Nearest Neighbor (KNN), Support Vector Machine (SVM), and Naïve Bayes (NB). The findings show that feature selection significantly enhances classifier accuracy while reducing execution times. Specifically, only DT and KNN with 50 neighbors achieved 100% accuracy when all features were considered. With feature selection using Approach-1 (without labels), DT, KNN, SVM (excluding 100 neighbors), and NB (with a Normal/Gaussian kernel) attained 100% accuracy. With Approach-2 (labeled features), DT with 0.7 and 0.9 thresholds, SVM-G with all thresholds (0.6, 0.7, and 0.9), KNN with all thresholds (except 100 neighbors), and NB-n with all thresholds achieved 100% accuracy. The study emphasizes the pivotal role of feature selection using Pearson's Correlation Coefficient in enhancing machine learning classifier performance, offering promising avenues for future research and practical applications across diverse domains.
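The two approaches described in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation: the function names, the exact rule for dropping one feature from a correlated pair in Approach-1, and the default thresholds are illustrative assumptions; only the use of Pearson's Correlation Coefficient, the unlabeled/labeled distinction, and the 0.6/0.7/0.9 threshold values are taken from the abstract.

```python
# Illustrative sketch of correlation-based feature selection, assuming a pandas
# DataFrame `X` of vibration-signal features and a numerically encoded label
# Series `y` (e.g., fault classes mapped to integers).
import numpy as np
import pandas as pd


def select_unlabeled(X: pd.DataFrame, threshold: float = 0.9) -> list[str]:
    """Approach-1 (no labels): drop one feature from each highly correlated pair."""
    corr = X.corr(method="pearson").abs()
    # Keep only the upper triangle so each feature pair is inspected once.
    upper = corr.where(np.triu(np.ones(corr.shape, dtype=bool), k=1))
    to_drop = [col for col in upper.columns if (upper[col] > threshold).any()]
    return [col for col in X.columns if col not in to_drop]


def select_labeled(X: pd.DataFrame, y: pd.Series, threshold: float = 0.7) -> list[str]:
    """Approach-2 (with labels): keep features strongly correlated with the label."""
    corr_with_label = X.apply(lambda col: col.corr(y, method="pearson")).abs()
    return corr_with_label[corr_with_label > threshold].index.tolist()
```

Under this reading, the 0.6, 0.7, and 0.9 values reported in the abstract correspond to the `threshold` parameter, and the retained feature lists would then be passed to the DT, KNN, SVM, and NB classifiers for comparison against the all-features baseline.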
