
Decomposing Constraints for Better Coverage in Test Data Generation

Volume 14, Number 6, June 2018, pp. 1251-1262
DOI: 10.23940/ijpe.18.06.p16.12511262

Ju Qian, Kun Liu, Hao Chen, Zhiyi Zhang, and Zhe Chen

College of Computer Science and Technology, Nanjing University of Aeronautics and Astronautics, Nanjing, 210016, China

(Submitted on March 6, 2018; Revised on April 21, 2018; Accepted on May 19, 2018)

Abstract:

In black-box testing, one way to generate test data is to derive it from interface constraints using constraint solving techniques. However, solving the whole constraint formula directly may not fully exploit the information embodied in the constraint, making it difficult to obtain a test set with high coverage. For example, when solving the constraint (a > 0 or b < 0) as a whole, there is no guarantee that the test set will contain data covering the sub-constraint b < 0. To address this problem, this paper first defines a hierarchy of coverage criteria at the specification constraint level. Algorithms are then designed to decompose constraints according to these coverage criteria and to generate test input sets. Experiments on a set of benchmark programs show that decomposing constraints according to constraint-level coverage criteria effectively leads to better coverage in test data generation.
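The idea behind the abstract's example can be sketched in code. This is an illustrative toy, not the paper's actual algorithm: the constraint (a > 0 or b < 0) is decomposed into its two disjuncts, and a simple random search (standing in for a real constraint solver) is run on each sub-constraint separately, so that both branches are guaranteed to be represented in the generated test set. The `find_input` helper and the search bounds are assumptions made for this sketch.

```python
import random

def find_input(predicate, trials=10000, lo=-100, hi=100):
    """Randomly search for an (a, b) pair satisfying the predicate.

    A stand-in for a real constraint solver, used only to illustrate
    the decomposition idea.
    """
    rng = random.Random(0)
    for _ in range(trials):
        a, b = rng.randint(lo, hi), rng.randint(lo, hi)
        if predicate(a, b):
            return (a, b)
    return None

# Whole constraint from the abstract: (a > 0) or (b < 0).
whole = lambda a, b: a > 0 or b < 0

# Decomposed sub-constraints: solve each disjunct on its own, so data
# covering each branch is guaranteed to appear in the test set.
sub_constraints = [
    lambda a, b: a > 0,   # covers the first disjunct
    lambda a, b: b < 0,   # covers the second disjunct
]

# Solving only `whole` might never yield an input with b < 0;
# solving the decomposed sub-constraints yields one input per branch.
test_set = [find_input(p) for p in sub_constraints]
```

Every input produced this way still satisfies the whole constraint, but unlike a single call on the whole formula, the decomposed runs cannot silently skip a disjunct.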

 


               
