PlumX Metrics

Therapeutic peptides identification via kernel risk sensitive loss-based k-nearest neighbor model and multi-Laplacian regularization

Briefings in Bioinformatics, ISSN: 1477-4054, Vol: 25, Issue: 6
2024
  • Citations: 0
  • Usage: 0
  • Captures: 2
  • Mentions: 1
  • Social Media: 0

Metrics Details

  • Captures: 2
  • Mentions: 1
    • News Mentions: 1
      • News: 1

Most Recent News

New Peptides Study Findings Has Been Reported by a Researcher at University of Electronic Science and Technology of China (Therapeutic peptides identification via kernel risk sensitive loss-based k-nearest neighbor model and multi-Laplacian ...)

2024 NOV 04 (NewsRx) -- By a News Reporter-Staff News Editor at NewsRx Drug Daily -- A new study on peptides is now available. According

Article Description

Therapeutic peptides are therapeutic agents synthesized from natural amino acids; they can serve as carriers for the precise delivery of drugs and can activate the immune system to prevent and treat various diseases. However, screening therapeutic peptides with biochemical assays is expensive, time-consuming, and constrained by experimental conditions and biological samples, and may raise ethical concerns at the clinical stage. In contrast, screening therapeutic peptides with machine learning and computational methods is efficient, automated, and can accurately predict potential therapeutic peptides. In this study, a k-nearest neighbor model based on multi-Laplacian regularization and kernel risk-sensitive loss was proposed: it introduces a kernel risk-sensitive loss function derived from the K-local hyperplane distance nearest neighbor model and combines it with Laplacian regularization to predict therapeutic peptides. The findings indicated that the proposed approach achieved satisfactory results and could effectively predict therapeutic peptide sequences.
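To make the core idea concrete, the following is a minimal sketch of a kernel risk-sensitive loss (a bounded, Gaussian-kernel-based robust loss from the correntropy literature) used to weight neighbor distances in a plain k-nearest neighbor vote. This is not the paper's actual model: the function names, parameters (`sigma`, `lam`), and the simple majority-vote classifier are illustrative assumptions, and the paper's K-local hyperplane distances and multi-Laplacian regularization are omitted.

```python
import numpy as np

def krs_loss(e, sigma=1.0, lam=1.0):
    """Kernel risk-sensitive loss of an error/distance e (illustrative form).

    kappa is the Gaussian kernel similarity (1 when e == 0, -> 0 for large e),
    so the loss is 0 at e == 0, increases with |e|, and saturates at
    (exp(lam) - 1) / lam, which is what makes it robust to outliers.
    """
    kappa = np.exp(-np.asarray(e, dtype=float) ** 2 / (2.0 * sigma ** 2))
    return (np.exp(lam * (1.0 - kappa)) - 1.0) / lam

def knn_predict(X_train, y_train, x, k=3, sigma=1.0, lam=1.0):
    """Classify query x by majority vote of the k neighbors with lowest KRSL."""
    # Euclidean distance from the query to every training point
    d = np.linalg.norm(X_train - x, axis=1)
    # Re-score distances through the bounded robust loss
    losses = krs_loss(d, sigma=sigma, lam=lam)
    nearest = np.argsort(losses)[:k]
    # Majority vote among the k lowest-loss neighbors
    labels, counts = np.unique(y_train[nearest], return_counts=True)
    return labels[np.argmax(counts)]
```

For example, with two well-separated toy clusters labeled 0 and 1, a query near the first cluster is voted into class 0; because the loss saturates for large distances, far-away points contribute little to the neighbor ranking.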

Bibliographic Details
