Call for Papers

Software systems (e.g., smartphone apps, desktop applications, telecommunication infrastructure, and enterprise systems) have strict performance requirements. Failing to meet these requirements may cause business losses, customer defection, brand damage, and other serious consequences. In addition to conventional functional testing, the performance of these systems must be verified through load testing or benchmarking to ensure quality of service.

Load testing examines the behavior of a system by simulating hundreds or thousands of users performing tasks at the same time. Benchmarking compares a system's performance against that of similar systems in the same domain. The workshop is not limited to traditional load testing; it is open to ideas for reinventing and extending load testing, as well as any other way to ensure system performance and resilience under load, including performance testing of any kind, resilience/reliability/high-availability/stability testing, operational profile testing, stress testing, A/B and canary testing, volume testing, and chaos engineering.

Load testing and benchmarking software systems are difficult tasks that require a deep understanding of the system under test and of customer behavior. Practitioners face many challenges, such as tooling (choosing and implementing the testing tools), environments (software and hardware setup), and time (limited time to design, run, and analyze tests). Yet little research in the software engineering domain addresses this topic.

Adapting load testing to recent industry trends, such as cloud computing, agile and iterative development, continuous integration and delivery, microservices, serverless computing, AI/ML services, and containers, poses major challenges that are not yet fully addressed.

This one-day workshop brings together software testing and software performance researchers, practitioners, and tool developers to discuss the challenges and opportunities of conducting research on load testing and benchmarking software systems. Our ultimate goal is to grow an active community around this important and practical research topic.

We solicit two tracks of submissions:

  1. Research or industry papers
  2. Presentation track for industry or research talks

Research and industry papers should follow the standard ACM SIG proceedings format and must be submitted electronically via HotCRP. Extended abstracts for the presentation track must also be submitted via HotCRP, as "abstract only" submissions. Accepted papers will be published in the ICPE 2026 Companion Proceedings. Submissions can be research papers, position papers, case studies, or experience reports addressing issues including but not limited to the following:


Instructions for Authors from ACM

By submitting your article to an ACM Publication, you are hereby acknowledging that you and your co-authors are subject to all ACM Publications Policies, including ACM's new Publications Policy on Research Involving Human Participants and Subjects. Alleged violations of this policy or any ACM Publications Policy will be investigated by ACM and may result in a full retraction of your paper, in addition to other potential penalties, as per ACM Publications Policy.

Authors of accepted papers are required to obtain an ORCID iD to complete the publication process. ACM uses ORCID iDs to ensure proper attribution, improve author discoverability, and support accurate name normalization across its Digital Library.

ORCID iDs are not required for the presentation-only track.

Important Dates

Paper Track (research and industry papers):

[Submission Link]
Abstract submission (optional): January 19, 2026, AoE;
Paper submission: January 23, 2026, AoE;
Author notification: February 27, 2026;
Camera-ready version: March 4, 2026

Presentation Track:

Extended abstract submission: February 16, 2026, AoE;
Author notification: February 27, 2026;
Workshop date: May 4, 2026


Organization:

Chairs:

Alexander Podelko, Amazon, USA
Alireza Pourali, York University, Canada
Kundi Yao, Ontario Tech University, Canada


Program Committee:

Daniel Seybold, benchANT GmbH, Germany
Daniele Di Pompeo, University of L'Aquila, Italy
Filipe Costa Oliveira, Redis, Portugal
Gerson Sunyé, LS2N - Nantes Université, France
Junjielong Xu, The Chinese University of Hong Kong, Shenzhen, China
Lizhi Liao, Memorial University of Newfoundland, Canada
Qiaolin Qin, Polytechnique Montreal, Canada
Sai Sindhur Malleni, Red Hat, USA
Yiming Tang, Rochester Institute of Technology, USA
Zhenhao Li, York University, Canada


Past LTB Workshops: