Badged Papers
- A feasibility study of usability and UX evaluation technologies in multi-touch context: A quantitative and qualitative analysis [Artifacts Available, Artifacts Functional]. Guilherme E. Konopatzki Filho (Federal University of Paraná), Guilherme Corredato Guerino (State University of Maringá), and Natasha M. C. Valentim (Federal University of Paraná)
- A Thematic Synthesis on Empathy in Software Engineering based on the Practitioners' Perspective [Artifacts Available]. Lidiany Cerqueira (Federal University of Bahia and Federal University of Sergipe), Sávio Freire (Federal Institute of Ceará), João Pedro Silva Bastos (State University of Feira de Santana), Rodrigo Spínola (Virginia Commonwealth University), Manoel Mendonça (Federal University of Bahia), and José Amancio Macedo Santos (State University of Feira de Santana)
- Analyzing the Impact of CI Sub-practices on Continuous Code Quality in Open-Source Projects: An Empirical Study [Artifacts Available]. Guilherme Freitas (Federal University of Rio Grande do Norte), João Helis Bernardo (Federal University of Rio Grande do Norte and Federal Institute of Rio Grande do Norte), Gustavo Sizílio (Federal Institute of Rio Grande do Norte), Daniel Alencar da Costa (University of Otago), and Uirá Kulesza (Federal University of Rio Grande do Norte)
- Architectural Technical Debt - A Systematic Mapping Study [Artifacts Available]. Armando Sousa (Federal University of Piauí), Lincoln Rocha (Federal University of Ceará), and Ricardo Britto (Ericsson / BTH)
- Do you see what happens around you? Men's Perceptions of Gender Inequality in Software Engineering [Artifacts Available]. Edna Dias Canedo (University of Brasília), Larissa Rocha (State University of Feira de Santana), Geovana Ramos Sousa Silva (University of Brasília), Verônica Souza dos Santos (University of Brasília), and Fabiana Freitas Mendes (University of Brasília)
- How The Retry Pattern Impacts Application Performance: A Controlled Experiment [Artifacts Available]. Carlos M. Aderaldo (University of Fortaleza) and Nabor C. Mendonça (University of Fortaleza)
- On the Experiences of Practitioners with Requirements Elicitation Techniques [Artifacts Available]. Rodrigo Pereira de Mesquita (University of Brasília), Geovana Ramos Sousa Silva (University of Brasília), and Edna Dias Canedo (University of Brasília)
- Similar Bug Reports Recommendation System using BERT [Artifacts Available]. Guilherme Carneiro (Universidade Federal de Campina Grande), José Manoel Ferreira (Universidade Federal de Campina Grande), Franklin Ramalho (Universidade Federal de Campina Grande), and Tiago Massoni (Universidade Federal de Campina Grande)
Call for Artifacts
Introduction
OpenScienSE 2023 introduces an Artifact Track and artifact badging in the context of CBSoft! International conferences already run artifact evaluation tracks with badging, and this movement inspired OpenScienSE to bring the practice to CBSoft.
According to the ESEC/FSE 2023 webpage, papers associated with badges contain reusable products that other researchers can use to bootstrap their own research, and experience shows that such papers earn increased citations and greater prestige in the research community.
Artifact Badging
Authors of papers accepted to the SBES 2023 Research Track are invited to submit artifacts associated with those papers to the OpenScienSE 2023 Artifact Track for evaluation. Artifacts may be evaluated as Available or Functional, earning the corresponding artifact badges (see below). The badges are independent, and a paper may receive both.
If a submitted artifact is accepted in the Artifact Track:
- The paper will be marked with the badge(s) in the list of accepted papers on the webpage of the SBES 2023 Research Track;
- The artifact will compete for a Best Artifact Award, which will be given by OpenScienSE to recognize the effort of authors in creating and sharing outstanding research artifacts.
- Artifacts Available: Author-created artifacts relevant to the paper have been placed on a publicly accessible archival repository. A DOI or link to this repository, along with a unique identifier for the object, is provided.
- Artifacts Functional: The artifacts associated with the research are found to be documented, consistent, complete, exercisable, and include appropriate evidence of verification and validation.
Artifact Kinds
"Artifact" is an umbrella term for several kinds of materials and products. It includes simpler materials such as interview questionnaires applied in surveys and more complex products such as fully-automated tools. All artifacts that might be useful for future research projects are welcome!
Artifacts of interest include, but are not limited to:
- Tools and frameworks, which are implementations of systems and services that can be used and potentially extended.
- Data repositories, which are data used for or produced by a study.
- Replication packages, which are a mix of the above, depending on the type of the research paper.
Important Dates
All deadlines are at 23:59:59 AoE (Anywhere on Earth).
- Artifact Submission Deadline: August 7, 2023 (firm)
- Artifact Notification: September 4, 2023
Submission Instructions
Only authors of papers accepted to the SBES 2023 Research Track can submit candidate Available and Functional artifacts. Artifacts must be submitted electronically through the JEMS system.
Two files are required for submission:
- A PDF file (one page max), containing (1) the link(s) to the artifact repository(ies) and (2) the badge(s) the authors are applying for, along with the reasons why the authors believe the artifact(s) deserve(s) them.
- A copy of the accepted paper in PDF format.
The research artifacts themselves (i.e., the content of repositories) should be self-contained. This means that all instructions about the artifacts (how they are organized, how they can be used, etc.) should be in the repositories. This is because people other than the OpenScienSE committee must be able to access and use the artifacts. That is the whole reason for sharing artifacts, after all.
Authors must perform the following steps to prepare an artifact for submission.
Preparing and Documenting the Artifact
A submitted artifact's repository must contain the artifact files themselves and documentation about them. The expected repository structure is:
├── <artifact folder and files>
├── README
├── LICENSE
├── REQUIREMENTS (for source-code-related artifacts)
└── INSTALL (for source-code-related artifacts)
A README file is the document that anyone will want to read when accessing the artifact's repository. It should describe what the artifact is, cite the paper associated with the artifact, and explain how the repository is organized. This is the bare minimum for other people to become interested in using the artifact. Artifacts that focus on data should, in principle, cover aspects relevant to understanding the context, data provenance, ethical and legal statements (where relevant), and storage requirements. Artifacts that focus on software should, in principle, cover how to install and use the software (and be accompanied by a small example). For the latter, the README file should link to the REQUIREMENTS and INSTALL files (see below).
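As an illustration, a minimal README skeleton for a hypothetical data-focused artifact might look as follows (all titles, folder names, and file names are placeholders, not a required template):

```markdown
# Artifact for "Paper Title" (SBES 2023)

This repository contains the interview questionnaire and anonymized
survey responses used in the paper "Paper Title" (SBES 2023).

## Citation
Author, A.; Author, B. "Paper Title". In: Proceedings of SBES 2023.

## Repository structure
- data/: anonymized survey responses (CSV, UTF-8)
- scripts/: analysis scripts (see REQUIREMENTS and INSTALL)
- docs/: interview questionnaire and consent form

## Context, provenance, and ethics
Describe here how and when the data were collected, any ethical or
legal statements (e.g., informed consent), and storage requirements.
```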
A LICENSE file describing the distribution rights. The license must be some form of open-source license.
A REQUIREMENTS file for artifacts that focus on software. This file must describe the requirements for executing the software system. Basic requirements, e.g., the Java version, should be described. If relevant (e.g., for Python-only environments), a requirements.txt with explicit version information should be provided. For complete, fully functional software systems, this file should also cover hardware requirements (e.g., performance, storage, or non-commodity peripherals) and the software environment (e.g., Docker, VM, and operating system).
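For a Python-only artifact, for example, the REQUIREMENTS file could simply point to a requirements.txt with pinned versions; the package names and versions below are illustrative only:

```text
# requirements.txt (illustrative; pin the versions you actually used)
pandas==1.5.3
scikit-learn==1.2.2
matplotlib==3.7.1
```

Pinning exact versions makes the environment reproducible rather than dependent on whatever versions are current when the artifact is evaluated.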
An INSTALL file with installation instructions. These instructions should include notes illustrating a very basic usage example or a method to test the installation, for instance, a description of the output that confirms the code is installed and working.
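A minimal INSTALL file for a hypothetical Python-based artifact might look like the following sketch (the script and data file names are placeholders):

```markdown
# INSTALL

Tested with Python 3.10 on Ubuntu 22.04.

1. Create a virtual environment and install the dependencies:

       python3 -m venv .venv
       source .venv/bin/activate
       pip install -r requirements.txt

2. Smoke test: run the analysis on the bundled sample data:

       python scripts/analyze.py data/sample.csv

   The script should print a summary table and exit with status 0,
   confirming that the installation works.
```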
As an overall rule, the authors should provide enough associated instructions, code, and data such that a computer scientist with a reasonable knowledge of scripting, build tools, etc., could install, build, and run the code.
Making the Artifact Available
The authors must make the artifact publicly available so that the Evaluation Committee (and potential users of the artifact) can access it. We suggest a link to a public repository (e.g., GitHub) or to a single archive file in a widely available archive format. We encourage authors to use permanent repositories dedicated to data sharing where no registration is necessary for those accessing the artifacts (e.g., please avoid services such as Google Drive).
This means that the JEMS submission should include only the research abstract and the links to the repositories where the artifact is permanently stored and available. Submitting the artifacts themselves through JEMS without making them publicly accessible (through a repository or an archival service) will not be sufficient for any badge.
Submitting the Artifact
Artifacts must be submitted electronically through the JEMS system.
Review Process
The OpenScienSE Evaluation Committee may contact the authors within the reviewing period to request clarifications on the basic installation and start-up procedures or to resolve simple installation problems. Further instructions will be sent to the authors (and reviewers) during the reviewing process.
Given the short review time available, the authors are expected to respond within 48 hours. Authors may update their research artifacts during the reviewing period only to address changes requested by reviewers.
In case of questions, please do not hesitate to contact the chairs.
Full disclosure: this call for artifacts was inspired by the ESEC/FSE 2023's call for artifacts.