MEDFORD: A human and machine readable metadata markup language

04/20/2022
by   Polina Shpilker, et al.

Reproducibility of research is essential for science. However, in modern computational biology research, it is easy to lose track of small but critical details. Key information, such as the specific version of a software package or the iteration of a genome, can easily be lost in the shuffle, or never noted at all. Much work has been done on the database and storage side, ensuring that a space exists in which to store experiment-specific details, but current mechanisms for recording those details are cumbersome for scientists to use. We propose a new metadata description language, named MEDFORD, in which scientists can record all details relevant to their research. Human-readable, easily editable, and templatable, MEDFORD serves as a collection point for all notes a researcher could find relevant to their work, whether for internal use or for future replication. MEDFORD has been applied to coral research, documenting everything from RNA-seq analyses to photo collections.


