
IISWC 2025 : IEEE International Symposium on Workload Characterization



 
Link: https://iiswc.org/iiswc2025/
 
When: Oct 12, 2025 - Oct 14, 2025
Where: Irvine, CA
Submission Deadline: Jun 21, 2025
Notification Due: Aug 12, 2025
Final Version Due: Sep 1, 2025
Categories: workload characterization, microarchitecture, performance evaluation, scientific computing
 

Call For Papers

IISWC invites manuscripts that present original unpublished research in all areas related to the characterization and analysis of computing system workloads, including translational research related to production-oriented commercial systems. Work focusing on emerging technologies and interdisciplinary work are especially welcome. Topics of interest include (but are not limited to) characterization of applications in traditional and emerging domains, characterization of system software and middleware, implications of workloads in system design, benchmarking methodologies and suites, and tools for computer systems. A detailed list of the topics can be found at the end of this CFP.
Submission Guidelines

Submissions to IISWC can be made in one of the following two categories: (1) regular papers and (2) tool and benchmark papers. Authors are expected to use the IISWC 2024 submission template. The primary focus of regular papers (submission length: 10 pages, excluding references) should be to describe new research ideas supported by experimental implementation and evaluation of the proposed ideas. The primary focus of tool and benchmark papers should be to describe the design, development, and evaluation of new open-source tools and benchmark suites. Submissions in the regular papers category are also encouraged to open-source their software or hardware artifacts.

Authors are required to indicate the paper’s category as part of the submitted manuscript’s title. In the submission system, authors should add a prefix to the title indicating the submission type, as follows: (1) regular papers: “Regular-TITLE” and (2) tool and benchmark papers: “Tools-TITLE”.

Shorter papers (6 pages) in the tool and benchmark category are welcome if their contributions can be well articulated and substantiated. However, all submissions in this category have the flexibility to use the full 10 pages (excluding references).

Submissions in both categories will be evaluated to the same standards of novelty, scientific value, demonstrated usefulness, and potential impact on the field. The nature of the contribution differs between the two categories (new research idea vs. new open-source benchmark suite / tool), and papers will be evaluated based on the nature of the contribution declared by the paper category chosen at submission time. The chosen category cannot be changed after the submission deadline.

Double-blind submission guidelines apply to the submissions in both categories.

Open-source benchmarks and tools that have not been previously published (but may have been open-sourced) are eligible for submission in the tool and benchmark papers category. When including source code links in a submission, authors are required to use new or anonymized code repositories to preserve the integrity of the double-blind review process. All submitted papers should have obtained legal permission (if applicable) to open-source the benchmark suite / tool at the time of submission.
Artifact Evaluation

This year, IISWC will continue to include an artifact evaluation process to promote the reproducibility of experimental results. Authors of accepted IISWC papers will be invited to submit their supporting materials to the Artifact Evaluation process, which assesses how well the artifacts support the work described in the papers. This submission is voluntary and will not influence the final decision regarding acceptance of the paper. The description of the artifact will not count toward the page limit. The artifact submission deadline will be shortly after the notification of acceptance, so authors should prepare in advance to ensure sufficient time for artifact assembly and documentation. More details of the artifact evaluation will be made available to the authors of accepted papers.
Topics of Interest

Characterization of applications in domains including

Life sciences, bioinformatics, scientific computing, finance, forecasting
Machine learning, deep learning, generative AI and LLMs, data analytics, data mining
Cyber-physical systems, pervasive computation, and Internet of Things (IoT)
Security and privacy-preserving computing
High performance computing
Cloud and edge computing
Mobile computing
Human-computer interaction (HCI)
Search engines, e-commerce, web services, and databases
Embedded, multimedia, real-time, 3D-graphics, gaming
Blockchain services
Augmented reality and virtual reality

Characterization of emerging workloads and architectures, such as

Accelerator-based computing
Quantum computation and communication
Serverless computing
Near-threshold computing
Near data processing architectures
Neuromorphic and brain-inspired computing
Transactional memory systems
Biology (e.g., DNA sequencing) and chemistry workloads

Characterization of OS, Virtual Machine, middleware and library behavior, including

Virtual machines, .NET, Java VM, databases
Graphics libraries, scientific libraries
Operating system and hypervisor effects and overheads

Implications of workloads in system design, such as

Power-aware computing and carbon footprinting
Dependable system and software architectures
Security, privacy, performance
Processors, memory hierarchy, I/O, and networks
Design of accelerators, FPGAs, GPUs, CGRAs, etc.
Large-scale computing infrastructures and facilities

Benchmark methodologies and suites, including

Representative benchmarks for emerging workloads
Benchmark cloning methods
Profiling, trace collection, synthetic traces
Validation of benchmarks

Measurement tools and techniques, including

Measurement tools and software for carbon footprinting
Instrumentation methodologies for workload verification and characterization
Techniques for accurate analysis/measurement of production systems
Analytical and abstract modeling of program behavior and systems
