DOI: 10.5555/2819009.2819101 (ICSE conference proceedings, research article)

How (much) do developers test?

Published: 16 May 2015

Abstract

What do we know about software testing in the real world? It seems we know from Fred Brooks' seminal work "The Mythical Man-Month" that 50% of project effort is spent on testing. However, given the enormous advances in software engineering over the past 40 years, the question stands: Is this observation still true? In fact, was it ever true? The vision for our research is to settle the discussion about Brooks' estimation once and for all: How much do developers test? Do developers' estimates of how much they test match reality? How frequently do they execute their tests, and is there a relationship between test runtime and execution frequency? What are the typical reactions to failing tests? Do developers solve actual defects in the production code, or do they merely relax their test assertions? Emerging results from 40 software engineering students show that students overestimate their testing time threefold, and 50% of them test as little as 4% of their time, or less. Having proven the scalability of our infrastructure, we are now extending our case study with professional software engineers from open-source and industrial organizations.

References

[1] F. Brooks, The Mythical Man-Month. Addison-Wesley, 1975.
[2] G. Myers, C. Sandler, and T. Badgett, The Art of Software Testing. John Wiley & Sons, 2011.
[3] D. Janzen and H. Saiedian, "Does test-driven development really improve software design quality?" IEEE Software, vol. 25, no. 2, 2008.
[4] B. Wang, F. Madani, X. Wang, L. Wang, and C. White, "Design structure matrix," in Planning and Roadmapping Technological Innovations. Springer, 2014.
[5] P. Runeson, C. Andersson, and M. Höst, "Test processes in software product evolution: a qualitative survey on the state of practice," Journal of Software Maintenance and Evolution, vol. 15, no. 1, 2003.
[6] A. Begel and T. Zimmermann, "Analyze this! 145 questions for data scientists in software engineering," in Proceedings of the International Conference on Software Engineering (ICSE), 2014.
[7] G. Meszaros, xUnit Test Patterns: Refactoring Test Code. Prentice Hall PTR, 2006.
[8] T. Xie, J. de Halleux, N. Tillmann, and W. Schulte, "Teaching and training developer-testing techniques and tool support," in Proceedings of the International Conference on Object-Oriented Programming, Systems, Languages, and Applications (OOPSLA Companion). ACM, 2010.
[9] P. Muntean, C. Eckert, and A. Ibing, "Context-sensitive detection of information exposure bugs with symbolic execution," in Proceedings of the International Workshop on Innovative Software Development Methodologies and Practices. ACM, 2014.
[10] J. Adair, "The Hawthorne effect: A reconsideration of the methodological artifact," Journal of Applied Psychology, vol. 69, no. 2, 1984.
[11] D. Athanasiou, A. Nugroho, J. Visser, and A. Zaidman, "Test code quality and its relation to issue handling performance," IEEE Transactions on Software Engineering, vol. 40, no. 11, 2014.
[12] L. Hattori and M. Lanza, "Syde: a tool for collaborative software development," in Proceedings of the International Conference on Software Engineering - Volume 2 (ICSE), 2010.
[13] R. Robbes and M. Lanza, "Spyware: a change-aware development toolset," in Proceedings of the International Conference on Software Engineering (ICSE), 2008.
[14] L. Pinto, S. Sinha, and A. Orso, "Understanding myths and realities of test-suite evolution," in Proceedings of the Symposium on the Foundations of Software Engineering (FSE). ACM, 2012.



Published In

ICSE '15: Proceedings of the 37th International Conference on Software Engineering - Volume 2, May 2015, 1058 pages.

Publisher

IEEE Press



Acceptance Rate

Overall acceptance rate: 276 of 1,856 submissions (15%).



Cited By

  • (2019) Assessing Incremental Testing Practices and Their Impact on Project Outcomes. Proceedings of the 50th ACM Technical Symposium on Computer Science Education, pp. 407-413. DOI: 10.1145/3287324.3287366. Online publication date: 22-Feb-2019.
  • (2018) Toward an empirical theory of feedback-driven development. Proceedings of the 40th International Conference on Software Engineering: Companion Proceedings, pp. 503-505. DOI: 10.1145/3183440.3190332. Online publication date: 27-May-2018.
  • (2018) On the dichotomy of debugging behavior among programmers. Proceedings of the 40th International Conference on Software Engineering, pp. 572-583. DOI: 10.1145/3180155.3180175. Online publication date: 27-May-2018.
  • (2018) Analyzing the effects of test driven development in GitHub. Empirical Software Engineering, 23(4), pp. 1931-1958. DOI: 10.1007/s10664-017-9576-3. Online publication date: 1-Aug-2018.
  • (2018) The impact of rapid release cycles on the integration delay of fixed issues. Empirical Software Engineering, 23(2), pp. 835-904. DOI: 10.1007/s10664-017-9548-7. Online publication date: 1-Apr-2018.
  • (2017) Oops, my tests broke the build. Proceedings of the 14th International Conference on Mining Software Repositories, pp. 356-367. DOI: 10.1109/MSR.2017.62. Online publication date: 20-May-2017.
  • (2016) Usage, costs, and benefits of continuous integration in open-source projects. Proceedings of the 31st IEEE/ACM International Conference on Automated Software Engineering, pp. 426-437. DOI: 10.1145/2970276.2970358. Online publication date: 25-Aug-2016.
  • (2016) How to catch 'em all. Proceedings of the 3rd International Workshop on Software Engineering Research and Industrial Practice, pp. 53-56. DOI: 10.1145/2897022.2897027. Online publication date: 14-May-2016.