posted by user: gannimo

CSET 2017 : 10th USENIX Workshop on Cyber Security Experimentation and Test


Link: https://www.usenix.org/conference/cset17
 
When Aug 14, 2017 - Aug 14, 2017
Where Vancouver, BC
Submission Deadline May 2, 2017
Notification Due Jun 13, 2017
Final Version Due Jul 11, 2017
Categories: security, cyber security, information security, testing
 

Call For Papers

Important Dates
Submissions due: Tuesday, May 2, 2017, 11:59 pm PDT (no extensions)
Notification to authors: Tuesday, June 13, 2017
Final papers due: Tuesday, July 11, 2017

Workshop Organizers

Program Co-Chairs
Mathias Payer, Purdue University
José M. Fernandez, École Polytechnique de Montréal

Program Committee
Anil Somayaji, Carleton University
Anna Shubina, Dartmouth College
Antoine Lemay, École Polytechnique de Montréal
Aravind Prakash, Binghamton University
Brendan Dolan-Gavitt, New York University
Chao Zhang, Tsinghua University
Cristina Nita-Rotaru, Northeastern University
Dave Levin, University of Maryland
Erik van der Kouwe, Vrije Universiteit Amsterdam
Fanny Lalonde Lévesque, École Polytechnique de Montréal
Gianluca Stringhini, University College London
Jelena Mirkovic, USC Information Sciences Institute (ISI)
John Aycock, University of Calgary
Kevin Borgolte, UC Santa Barbara
Laura S. Tinnel, SRI International
Lucas Davi, University of Duisburg-Essen
Peter Stelzhammer, AV-Comparatives
Ryan Gerdes, Virginia Tech
Saurabh Bagchi, Purdue University
Sergey Bratus, Dartmouth College
Simon Edwards, SE Labs
Sonia Fahmy, Purdue University
Stefan Mangard, TU Graz
Sven Dietrich, CUNY John Jay College & The Graduate Center

Steering Committee
Terry V. Benzel, USC Information Sciences Institute (ISI)
Sean Peisert, University of California, Davis, and Lawrence Berkeley National Laboratory
Stephen Schwab, USC Information Sciences Institute (ISI)

Overview
The CSET workshop invites submissions on cyber security evaluation, experimentation, measurement, metrics, data, simulations, and testbeds for software, hardware, or malware.
The science of cyber security poses significant challenges. For example, experiments must recreate relevant, realistic features in order to be meaningful, yet identifying those features and modeling them is very difficult. Repeatability and measurement accuracy are essential in any scientific experiment, yet hard to achieve in practice. Few security-relevant datasets are publicly available for research use and little is understood about what "good datasets" look like. Finally, cyber security experiments and performance evaluations carry significant risks if not properly contained and controlled, yet often require some degree of interaction with the larger world in order to be useful.
Addressing these challenges is fundamental not only for scientific advancement in the field of computer security but also for enabling evidence-based decision making on security products and policies by industry, government, and individual users. Meeting these challenges requires transformational advances, including understanding the relationship between the scientific method and cyber security evaluation, advancing the capabilities of underlying experimental infrastructure, and improving data usability.

Topics
Topics of interest include but are not limited to:
Benchmarks for security: e.g., development and evaluation of benchmark suites that measure specific security metrics.
Research methods for cyber security experiments: e.g., experiences with and discussions of experimental methodologies; experiment design and conduct addressing cybersecurity challenges for software, hardware, and malware.
Measurement and metrics: e.g., what are useful or valid metrics, test cases, and benchmarks? How do we know? How does measurement interact with (or interfere with) evaluation?
Data sets: e.g., what makes good data sets? How do we know? How do we compare data sets? How do we collect new ones or generate derived ones? How do they hold up over time?
Security product evaluation methodologies: e.g., which product evaluation methodologies provide more accurate predictions of real-world performance? How should user-related characteristics (behavior, demographics) be accounted for in security product performance evaluations?
Simulations and emulations: e.g., what makes good ones? How do they scale (up or down)?
Design and planning of cyber security studies: e.g., hypothesis and research question, study design, data (collection, analysis, and interpretation), accuracy (validity, precision).
Ethics of cyber security research: e.g., experiences balancing stakeholder considerations; frameworks for evaluating the ethics of cyber security experiments.
Testbeds and experimental infrastructure: e.g., tools for improving speed and fidelity of testbed configuration; sensors for robust data collection with minimal testbed artifacts; support for interconnected non-IT systems such as telecommunications or industrial control.

Special note: Papers that primarily focus on computer security education are likely a better fit for the 2017 USENIX Advances in Security Education Workshop (ASE '17), also co-located with the USENIX Security Symposium. Authors of education-centered papers should strongly consider submitting their work to ASE.

Workshop Format
Because of the complex and open nature of the subject matter, CSET '17 is designed to be a workshop in the traditional sense. Presentations are expected to be interactive, and presenters should ensure that sufficient time is reserved for questions and audience discussion. Audience participation is encouraged. To ensure a productive workshop environment, attendance will be limited to 80 participants.

Submission Instructions
Research papers and position papers are welcome as submissions. Research papers should have a clearly stated methodology including a hypothesis and experiments designed to prove or disprove the hypothesis. Position papers, particularly those that critique past work, should present detailed solutions, either proposed or implemented. Submissions that recount experiences (e.g., from experiments or deployments) are especially desired; these should highlight takeaways and lessons learned that might help researchers in the future. For all submissions, the program committee will give greater weight to papers that lend themselves to interactive discussion among attendees.
Submissions must be no longer than 8 pages including all tables, figures, and references. Text should be formatted in two columns on 8.5"x11" paper using 10-point type on 12-point leading ("single-spaced"), with the text block being no more than 6.5"x9". Text outside the 6.5"x9" block will be ignored. Authors are encouraged to use the LaTeX and Word guides from the USENIX paper templates page. The review process will be single-blind; submissions do not need to be anonymized.
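The page geometry above can be approximated in a minimal LaTeX preamble. This is an illustrative sketch only, assuming a standard article class; authors should use the official templates from the USENIX paper templates page for actual submissions:

```latex
\documentclass[twocolumn,10pt]{article}
% Letter paper (8.5"x11") with a 6.5"x9" text block, as required.
\usepackage[letterpaper,textwidth=6.5in,textheight=9in]{geometry}
% 10-point type on 12-point leading is the default for the 10pt article class.
\usepackage{times}

\begin{document}
\title{Your CSET '17 Submission}
\author{Author names (single-blind review; no anonymization needed)}
\maketitle
% Body text, tables, figures, and references: at most 8 pages total.
\end{document}
```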
All papers must be submitted in PDF format via the Web submission form. Please do not email submissions.
All papers will be available online to registered attendees before the workshop. If your accepted paper should not be published prior to the event, please notify production@usenix.org. The papers will be available online to everyone beginning on the day of the workshop. At least one author from every accepted paper must attend the workshop and present the paper.
Simultaneous submission of the same work to multiple venues, submission of previously published work, or plagiarism constitutes dishonesty or fraud. USENIX, like other scientific and technical conferences and journals, prohibits these practices and may take action against authors who have committed them. See the USENIX Conference Submissions Policy for details. Questions? Contact your program co-chairs, cset17chairs@usenix.org, or the USENIX office, submissions-policy@usenix.org.
Papers accompanied by nondisclosure agreement forms will not be considered. Accepted submissions will be treated as confidential prior to publication on the USENIX CSET '17 Web site; rejected submissions will be permanently treated as confidential.
