Distinguishing Similar Design Pattern Instances through Temporal Behavior Analysis
SANER 2020 - Proceedings of the 2020 IEEE 27th International Conference on Software Analysis, Evolution, and Reengineering, pp. 296-307
2020
- Citations: 3
- Usage: 140
- Captures: 17
Metric Options: Counts. Selecting the 1-year or 3-year option will change the metrics count to percentiles, illustrating how an article or review compares to other articles or reviews within the selected time period in the same journal. Selecting the 1-year option compares the metrics against other articles/reviews that were also published in the same calendar year. Selecting the 3-year option compares the metrics against other articles/reviews that were also published in the same calendar year plus the two years prior.
Example: if you select the 1-year option for an article published in 2019 and a metric category shows 90%, that means that the article or review is performing better than 90% of the other articles/reviews published in that journal in 2019. If you select the 3-year option for the same article published in 2019 and the metric category shows 90%, that means that the article or review is performing better than 90% of the other articles/reviews published in that journal in 2019, 2018 and 2017.
Citation Benchmarking is provided by Scopus and SciVal and is different from the metrics context provided by PlumX Metrics.
Metrics Details
- Citations: 3
- Citation Indexes: 3
- Usage: 140
- Downloads: 112
- Abstract Views: 28
- Captures: 17
- Readers: 17
Conference Paper Description
Design patterns (DPs) encapsulate valuable design knowledge of object-oriented systems. Detecting DP instances helps to reveal the underlying rationale and thus facilitates the maintenance of legacy code. Owing to the internal similarity of DPs, implementation variants, and missing roles, approaches based on static analysis cannot reliably identify structurally similar instances. Existing approaches further employ dynamic techniques to test the runtime behaviors of candidate instances, yet automatically verifying the runtime behaviors of DP instances is challenging in multiple respects. This paper presents an approach to improve the verification process of existing approaches. Because test cases for legacy systems are often unavailable, we propose a markup language, TSML (Test Script Markup Language), to direct the generation of test cases that exercise the runtime behaviors of DP instances by putting them into use. The execution of test cases is monitored with a trace method that lets us specify runtime events of interest using regular expressions. To characterize runtime behaviors, we introduce a modeling and specification method employing Allen's interval-based temporal relations, which supports variant behaviors in a flexible way without hard-coded algorithms. A prototype tool has been implemented and evaluated on six open-source systems to verify 466 instances reported by five existing approaches with respect to five DPs. The results show that the dynamic analysis increases the F-score by 53.6% in distinguishing similar DP instances.
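To illustrate the kind of check the abstract describes, the following is a minimal sketch (not the paper's actual tool or notation) of how traced method activations, represented as time intervals, could be tested against Allen's interval relations to characterize a pattern instance's runtime behavior. All names here (Interval, the relation functions, the sample Observer-style trace) are hypothetical illustrations.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Interval:
    """A method activation observed in a trace: entry and exit timestamps."""
    label: str
    start: int
    end: int

def before(a: Interval, b: Interval) -> bool:
    """Allen's 'before': a finishes strictly before b starts."""
    return a.end < b.start

def during(a: Interval, b: Interval) -> bool:
    """Allen's 'during': a lies strictly inside b."""
    return b.start < a.start and a.end < b.end

def overlaps(a: Interval, b: Interval) -> bool:
    """Allen's 'overlaps': a starts first and ends inside b."""
    return a.start < b.start < a.end < b.end

# Hypothetical trace of an Observer candidate: the observers' update()
# activations are expected to occur 'during' the subject's notify() call.
notify = Interval("Subject.notify", start=10, end=50)
updates = [
    Interval("ObserverA.update", 15, 20),
    Interval("ObserverB.update", 25, 30),
]

is_observer_like = all(during(u, notify) for u in updates)
print(is_observer_like)  # True for this sample trace
```

A specification written this way can tolerate behavioral variants (for example, any number of update activations) by constraining only the temporal relations between events, rather than hard-coding a fixed call sequence.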
Bibliographic Details
- http://www.scopus.com/inward/record.url?partnerID=HzOxMe3b&scp=85083582823&origin=inward
- http://dx.doi.org/10.1109/saner48275.2020.9054804
- https://ieeexplore.ieee.org/document/9054804/
- https://ink.library.smu.edu.sg/sis_research/5614
- https://ink.library.smu.edu.sg/cgi/viewcontent.cgi?article=6617&context=sis_research
Institute of Electrical and Electronics Engineers (IEEE)