Image synthesis of apparel stitching defects using deep convolutional generative adversarial networks
Heliyon, ISSN: 2405-8440, Vol: 10, Issue: 4, Page: e26466
2024
- 4 Citations
- 28 Captures
- 1 Mention
Metric Options: Counts | 1 Year | 3 Year. Selecting the 1-year or 3-year option changes the metrics count to percentiles, illustrating how an article or review compares to other articles or reviews published in the same journal within the selected time period. The 1-year option compares the metrics against other articles/reviews published in the same calendar year; the 3-year option compares them against articles/reviews published in the same calendar year plus the two years prior.
Example: if you select the 1-year option for an article published in 2019 and a metric category shows 90%, that means that the article or review is performing better than 90% of the other articles/reviews published in that journal in 2019. If you select the 3-year option for the same article published in 2019 and the metric category shows 90%, that means that the article or review is performing better than 90% of the other articles/reviews published in that journal in 2019, 2018 and 2017.
Citation Benchmarking is provided by Scopus and SciVal and is different from the metrics context provided by PlumX Metrics.
Metrics Details
- Citations: 4
  - Citation Indexes: 4
  - CrossRef: 3
- Captures: 28
  - Readers: 28
- Mentions: 1
  - News Mentions: 1
Most Recent News
New Science and Technology Research from National Textile University Outlined (Image synthesis of apparel stitching defects using deep convolutional generative adversarial networks)
2024 MAR 13 (NewsRx) -- By a News Reporter-Staff News Editor at NewsRx Science Daily -- Research findings on science and technology are discussed in
Article Description
In industrial manufacturing, detecting stitching defects in fabric has become a pivotal stage in ensuring product quality. Deep learning-based fabric defect detection models have demonstrated remarkable accuracy, but they typically require large amounts of training data. Unfortunately, practical production lines rarely provide a sufficient quantity of apparel stitching defect images, owing to limited research-industry collaboration and privacy concerns. To address this challenge, this study introduces an approach based on a DCGAN (Deep Convolutional Generative Adversarial Network) that automatically generates images of stitching defects in fabric. The evaluation encompasses both quantitative and qualitative assessments, supported by extensive comparative experiments. For validation, ten industrial experts assessed the generated images and marked them 80% accurate. The Fréchet Inception Distance likewise indicated promising results. These outcomes, marked by a high accuracy rate, underscore the effectiveness of the proposed defect generation model: it can produce realistic images of defective stitching, bridging the gap caused by data scarcity in practical industrial settings.
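The abstract cites the Fréchet Inception Distance (FID) as its quantitative metric for generated-image quality. The paper does not give implementation details, but FID has a standard closed form: fit a Gaussian (mean and covariance) to feature vectors of real and generated images, then compute ‖μ₁−μ₂‖² + Tr(Σ₁+Σ₂−2(Σ₁Σ₂)^½). A minimal sketch under that assumption, with toy random vectors standing in for real Inception-network activations (the helper names and data here are illustrative, not from the paper):

```python
import numpy as np
from scipy.linalg import sqrtm

def frechet_inception_distance(mu1, sigma1, mu2, sigma2):
    """FID between two Gaussians fitted to image feature vectors."""
    diff = mu1 - mu2
    covmean = sqrtm(sigma1 @ sigma2)          # matrix square root of the covariance product
    if np.iscomplexobj(covmean):              # discard tiny imaginary parts from numerical error
        covmean = covmean.real
    return float(diff @ diff + np.trace(sigma1 + sigma2 - 2.0 * covmean))

def fit_gaussian(features):
    """Mean and covariance of a (n_samples, n_features) feature matrix."""
    return features.mean(axis=0), np.cov(features, rowvar=False)

# Toy stand-ins for Inception features of real vs. DCGAN-generated defect images.
rng = np.random.default_rng(0)
real_feats = rng.normal(0.0, 1.0, size=(500, 8))
fake_feats = rng.normal(0.1, 1.0, size=(500, 8))

fid = frechet_inception_distance(*fit_gaussian(real_feats), *fit_gaussian(fake_feats))
```

Lower FID means the generated distribution sits closer to the real one; identical distributions give an FID of zero. In practice the feature vectors would come from a pretrained Inception network rather than random draws.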
Bibliographic Details
- http://www.sciencedirect.com/science/article/pii/S2405844024024976
- http://dx.doi.org/10.1016/j.heliyon.2024.e26466
- http://www.scopus.com/inward/record.url?partnerID=HzOxMe3b&scp=85185804951&origin=inward
- http://www.ncbi.nlm.nih.gov/pubmed/38420437
- https://linkinghub.elsevier.com/retrieve/pii/S2405844024024976
- https://dx.doi.org/10.1016/j.heliyon.2024.e26466
Elsevier BV