Badged Papers

SBES 2024 Research Track


  • Categorizing IoT Software Systems Security Vulnerabilities Through Literature Studies Artifacts Available
    Clinton Hudson Moreira Pessoa (COPPE/UFRJ), Guilherme Horta Travassos (COPPE/UFRJ)

  • Explorando a detecção de conflitos semânticos nas integrações de código em múltiplos métodos Artifacts Available
    Toni Maciel (UFPE), Paulo Borba (UFPE), Leuson Da Silva (Polytechnique Montreal), Thaís Burity (UFAPE)

  • Eyes on Code Smells: Analyzing Developers’ Responses During Code Snippet Analysis Artifacts Available
    Vinícius Martins (PUC-Rio), Pedro Lopes Verardo Ramos (PUC-Rio), Breno Braga Neves (PUC-Rio), Maria Vitoria Lima (PUC-Rio), Johny Arriel (PUC-Rio), João Victor Godinho (PUC-Rio), Joanne Ribeiro (PUC-Rio), Alessandro Garcia (PUC-Rio), Juliana Alves Pereira (PUC-Rio)

  • Promise+: expandindo a base de dados de requisitos de software Promise_exp Artifacts Available Artifacts Functional
    Bruno Silva (UFMA), Rodrigo Nascimento (UFMA), Luis Rivero (UFMA), Geraldo Braz (UFMA), Rodrigo Pereira dos Santos (UNIRIO), Luiz Eduardo Martins (UNIFESP), Davi Viana (UFMA)

  • Revisiting Aristotle vs. Ringelmann: The influence of biases on measuring productivity in Open Source software development Artifacts Available Artifacts Functional
    Christian Gut (IME-USP), Alfredo Goldman (IME-USP)

  • Robotic-supported Data Loss Detection in Android Applications Artifacts Available
    Davi Freitas (UFPE), Breno Miranda (UFPE), Juliano Iyoda (UFPE)

  • Unveiling the Landscape of System Thinking Modeling Tools Use in Software Engineering Artifacts Available
    Júlia de Souza Borges (UFES), Thiago Felippe Neitzke Lahass (UFES), Amanda Brito Apolinário (UFES), Paulo Sérgio dos Santos Júnior (IFES), Monalessa Barcellos (UFES)


SBES 2024 Education Track


  • Assisting Novice Developers Learning in Flutter Through Cognitive-Driven Development Artifacts Available
    Ronivaldo Ferreira (UFPA), Victor Hugo Santiago C. Pinto (UFPA), Cleidson R. B. de Souza (UFPA), Gustavo Pinto (UFPA & Zup Innovation)

  • It's not all about gender: A Multi-dimensional Course Perspective on Diversity and Inclusion in Software Engineering Education Artifacts Available
    Kiev Gama (UFPE), Reydne Santos (UFPE)

  • Teaching Software Engineering: An Overview of Current Approaches and Practices in the Last Decade of SBES Artifacts Available
    Ana Clementino (UFERSA), Erick Lima (UFCA), Luann Lima (UFCA), André Guedes (IFMT), Dorgival Netto (UFCA), Jarbele Coutinho (UFERSA)


SBES 2024 Insightful Ideas and Emerging Results Track


  • A New Integration Approach to support the Development of Build-time Micro Frontend Architecture Applications Artifacts Available Artifacts Functional
    Fernando Moraes (UNESP), Frank José Affonso (UNESP)

  • Multilingual Crowd-Based Requirements Engineering Using Large Language Models Artifacts Available
    Arthur Pilone da Silva (USP), Paulo Meirelles (USP), Fabio Kon (USP), Walid Maalej (Universität Hamburg)

  • On the Identification of Self-Admitted Technical Debt with Large Language Models Artifacts Available Artifacts Functional
    Pedro Lambert (PUC-MG), Lucila Ishitani (PUC-MG), Laerte Xavier (PUC-MG)


SBES 2024 Tools Track


  • Knowledge Islands: Visualizing Developers Knowledge Concentration Artifacts Available
    Otávio Cury da Costa Castro (UFPI), Guilherme Avelino (UFPI)

  • PoP-ARE: A Tool for Extracting Systems-of-Systems Non-Functional Requirements from Processes-of-Business Processes Artifacts Available
    Murilo Gustavo Nabarrete Costa (UFMS), Sidny de Almeida Molina Pereira (UFMS), Debora Maria Barroso Paiva (UFMS), Maria Istela Cagnin (UFMS)

  • SP2Mic: Uma ferramenta para geração de código de microsserviços a partir de stored procedures Artifacts Available Artifacts Functional
    Ingrid Coutinho (UECE), Paulo Maia (UECE)


SBCARS 2024


  • Evaluating the performance of NSGA-II and NSGA-III on Product Line Architecture Design Artifacts Available
    Lucas Wolschick (UEM), Paulo Cesar Gonçalves (UEM), João Choma Neto (UEM), Willian Marques Freire (UEM), Aline Maria M. M. Amaral (UEM), Thelma Elita Colanzi (UEM)


SBLP 2024


  • Memoization of Mutable Objects Artifacts Available Artifacts Functional
    Caio Raposo, Fernando Magno Quintao Pereira (UFMG)

Call for Artifacts

Introduction


OpenScienSE 2024 features an Artifact Evaluation Track and artifact badging in the context of CBSoft! International conferences already run artifact evaluation tracks and award artifact badges, and this movement inspired OpenScienSE to bring the practice to CBSoft.

According to the ESEC/FSE 2023 webpage, badged papers contain reusable products that other researchers can use to bootstrap their own research, and experience shows that such papers earn more citations and greater prestige in the research community.


Artifact Badging

Authors of papers accepted to SBES (any track), SAST, SBCARS, and SBLP 2024 are invited to submit the artifacts associated with those papers to the OpenScienSE 2024 Artifact Evaluation Track. An artifact may be judged Available, Functional, or both, earning the respective artifact badges (see below). The badges are independent, and both can be awarded to the same paper.

If a submitted artifact is accepted in the Artifact Evaluation Track:

  • The paper will be marked with the badge(s) in the list of accepted papers on the webpage of the respective symposium;
  • The artifact will compete for a Best Artifact Award, which will be given by OpenScienSE to reward the effort of authors in creating and sharing outstanding research artifacts.

Artifacts Available: author-created artifacts relevant to the paper have been placed on a publicly accessible archival repository. A DOI or link to this repository, along with a unique identifier for the object, is provided.

Artifacts Functional: the artifacts associated with the research are documented, consistent, complete, exercisable, and include appropriate evidence of verification and validation.

Artifact Kinds

"Artifact" is an umbrella term for several kinds of materials and products. It includes simpler materials such as interview questionnaires applied in surveys and more complex products such as fully automated tools. All artifacts that might be useful for future research projects are welcome!

Artifacts of interest include, but are not limited to:

  • Tools and frameworks, which are implementations of systems and services that can be used and potentially extended.
  • Data repositories, which are data used for or produced by a study.
  • Replication packages, which are a mix of the above, depending on the type of the research paper.


Important Dates

All deadlines are at 23:59:59, Anywhere on Earth (AoE).

  • Artifact Submission Deadline: August 19, 2024
  • Author Notification: September 16, 2024


Submission Instructions

Only authors of papers accepted to SBES (any track), SAST, SBCARS, and SBLP 2024 can submit candidate Available and Functional artifacts.

The research artifacts themselves (i.e., the content of repositories) should be self-contained. This means that all instructions about the artifacts (how they are organized, how they can be used, etc.) should be in the repositories. This is because people other than the OpenScienSE committee must be able to access and use the artifacts. That is the whole reason for sharing artifacts, after all.

Authors must perform the following steps to prepare an artifact to be submitted.

Preparing and Documenting the Artifact

A submitted artifact’s repository must contain both the artifact files themselves and documentation about them. The expected repository structure is:

                                ├── <artifact folder and files>
                                ├── LICENSE
                                └── README
                            

The LICENSE file describes the distribution rights. It must be some form of open-source license.

The README file is the document that anyone accessing the artifact’s repository will want to read first. It should describe what the artifact is, cite the paper associated with the artifact, and explain how the repository is organized. This is the bare minimum for other people to become interested in using the artifact.

  • For artifacts that focus on data, the README file must cover aspects relevant to understanding the context, data provenance, ethical and legal statements (where relevant), and storage requirements.
  • For artifacts that focus on code, the README file must explain how to install and execute the artifact. It should contain two special sections: one for requirements and one for installation. The requirements section must describe what is needed to execute the software system; basic requirements, e.g., the Java version, should be stated, and, if relevant, a requirements.txt with explicit versioning information (e.g., for Python-only environments) should be provided. For complete and fully functional software systems, the requirements section should also cover hardware requirements (e.g., performance, storage, or non-commodity peripherals) and software environments (e.g., Docker, VM, and operating system). The installation section must include a very basic usage example or a method to test the installation, for instance a note on what output confirms that the code is installed and working.
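As an illustration, a minimal README skeleton meeting these expectations might look like the following. All names, paths, and commands here are hypothetical placeholders for the authors' own artifact, not requirements of this call:

```markdown
# Replication package for "Paper Title" (SBES 2024)

This repository contains the scripts and data used in the paper
"Paper Title" (link to the accepted paper or its preprint).

## Repository structure
- `data/`            — input datasets used in the study
- `scripts/`         — analysis scripts
- `requirements.txt` — pinned Python dependencies
- `LICENSE`          — open-source license (e.g., MIT)
- `README.md`        — this file

## Requirements
- Python 3.10 or later
- Dependencies listed in `requirements.txt`
  (install with `pip install -r requirements.txt`)
- About 2 GB of free disk space for the dataset

## Installation
After installing the dependencies, run the smoke test:

    python scripts/run_analysis.py --smoke-test

If the installation succeeded, the script prints a short summary
and exits with status 0.
```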

As an overall rule, authors should provide enough instructions, code, and data that a computer science researcher with reasonable knowledge of scripting, build tools, and similar could install, build, and run the code.

The README file should contain a link to the accepted paper. The paper PDF itself may live in the artifact’s repository or in an external service (e.g., arXiv).

Making the Artifact Available

The authors must make the artifact publicly available so that the Evaluation Committee (and potential users of the artifact) can access it. We suggest a link to a public repository (e.g., GitHub) or to a single archive file in a widely available archive format. We encourage authors to use permanent repositories dedicated to data sharing where no registration is necessary for those accessing the artifacts (e.g., please avoid using services such as Google Drive).

Submitting the Artifact

Artifacts must be submitted electronically through the JEMS system.

No file upload is required, because all required information should be provided in the Abstract field of the submission system. The Abstract entered into JEMS should be structured as follows:

  • Description: A simple description of the artifact to be evaluated (1 paragraph);
  • Repository: The link to the artifact to be evaluated;
  • Badges: A brief explanation of why the artifact is eligible for the claimed badges;
  • Required skills: Skills and knowledge required by a reviewer to properly review and execute the artifacts (e.g., programming languages, pieces of technology, etc.);
  • Required resources: Requirements to run the artifact (RAM, disk, packages, specific devices, operating system, etc.). As explained above, such requirements should be “reasonable” for software engineering researchers.
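A hypothetical example of a structured Abstract following this template (all project details below are invented for illustration only):

```text
Description: This artifact is the replication package for our SBES 2024
paper, containing the mining scripts and the resulting commit dataset.
Repository: https://example.org/our-artifact (placeholder link)
Badges: We claim Available (the artifact is archived with a DOI) and
Functional (the scripts are documented and include a smoke test).
Required skills: Python, basic Git usage.
Required resources: Linux or macOS, Python 3.10, 8 GB RAM, ~2 GB disk.
```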


Review Process

The OpenScienSE Evaluation Committee may contact the authors within the reviewing period to request clarifications on the basic installation and start-up procedures or to resolve simple installation problems. Further instructions will be sent to the authors (and reviewers) during the reviewing process.

Given the short review time available, authors are expected to respond within 48 hours. Authors may update their research artifacts during the reviewing period only to address changes requested by the reviewers.

In case of questions, please do not hesitate to contact the chairs.

Full disclosure: this call for artifacts was inspired by the ESEC/FSE 2023's call for artifacts and the ICSME 2024's call for artifacts.
