An Empirical Study of Bad Smells during Software Evolution Using Designite Tool

Mamdouh Alenezi*, Mohammad Zarour**
* Chief Information & Technology Officer (CITO), Prince Sultan University, Riyadh, Saudi Arabia.
** Faculty Member, Computer and Information Sciences, Prince Sultan University, Riyadh, Saudi Arabia.
Periodicity: April - June 2018
DOI : https://doi.org/10.26634/jse.12.4.14958

Abstract

Bad smells are not uncommon in software systems. They often arise from incomplete, inconsistent, or incorrect requirements, which lead to poor design decisions that propagate into the construction phase and end in malfunctioning software. Such problems are expected to be handled and resolved as the software evolves; if they are not, the system grows more complicated and harder to maintain, and the software starts aging. Various tools are available to help uncover, analyze, and visualize bad smells. Once bad smells are uncovered, a remedial action such as refactoring should be taken. Designite is a recent tool that detects and measures a large number of bad smells. In this paper, we use Designite to analyze six open source systems and investigate whether bad smells are resolved as the software evolves or whether the systems keep stinking. We found that software quality, in terms of resolving bad smells, receives less attention as the software evolves, with the focus shifting instead to adaptive and corrective actions, and that keeps the software stinking. We also discuss recommendations for reducing bad smells during the software process, along with recommendations for enhancing the Designite tool.
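To make the notion of an implementation smell concrete, the following is an illustrative sketch (not taken from the paper or from Designite's output): a classic Fowler-style "magic number" smell and the refactoring that removes it by naming the literal.

```java
// Illustrative example of an implementation-level bad smell and its fix.
// The class name and methods are hypothetical, chosen only for this sketch.
public class SmellExample {

    // Smelly version: the unexplained literal 0.15 hides intent and
    // would have to be hunted down everywhere if the rate changed.
    static double priceWithTaxSmelly(double base) {
        return base + base * 0.15;
    }

    // Refactored version: the literal is extracted into a named constant,
    // documenting intent and giving the rate a single point of change.
    static final double TAX_RATE = 0.15;

    static double priceWithTax(double base) {
        return base + base * TAX_RATE;
    }

    public static void main(String[] args) {
        System.out.println(priceWithTax(100.0)); // prints 115.0
    }
}
```

Tools such as Designite detect this kind of smell statically; the remedial action, as the abstract notes, is a refactoring like the constant extraction shown here.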

Keywords

Software Evolution, Bad Smell, Architecture Smell, Design Smell, Implementation Smell, Designite.

How to Cite this Article?

Alenezi, M., Zarour, M.(2018). An Empirical Study of Bad Smells during Software Evolution Using Designite Tool. i-manager's Journal on Software Engineering, 12(4), 12-27. https://doi.org/10.26634/jse.12.4.14958
