Code Ocean's verification process for computational reproducibility & quality

Things to attend to before publishing or submitting for peer review.

Written by Shahar Zaks

All capsules published on Code Ocean are verified for computational reproducibility and quality.

In practice, this means that Code Ocean staff check everything that is submitted for publication or peer review on the platform. Here is what we look for.

Code Ocean's computational reproducibility & quality checklist:

1. Save all concrete results to the /results folder.

Because Code Ocean runs headlessly by default (that is, without the option for user input during runtime), files must be saved explicitly. If your result is a computation printed to the console, it will be saved in a file called 'output'.
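For example, here is a minimal Python sketch of saving a result explicitly (the filename and values are hypothetical):

    # Write the result to /results explicitly, rather than only printing it.
    # "summary.csv" and the rows below are hypothetical examples.
    import csv

    rows = [("condition", "mean"), ("control", 0.41), ("treatment", 0.58)]
    with open("/results/summary.csv", "w", newline="") as f:
        csv.writer(f).writerows(rows)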

2. Ensure the capsule’s title is meaningful, to give readers a better sense of what the capsule contains:

You can use the manuscript title or see examples on the Explore page.

3. Write a main script to reproduce your analysis as completely as possible.

If you have five analysis scripts, your main script should run all five in sequence (see the sketch below). If you have multiple datasets, your code should analyze them all by default whenever possible. Because all published results are verified to be reproducible, there is no need for others to run your published code unless they are modifying or extending it, so long run times are not an issue.

  • Note: you can manually modify the run script to run multiple scripts, and we advise removing any leftover comments to the effect of '# The previous version of this file was commented-out and follows below'.
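As a minimal sketch, a Python main script might drive each analysis in sequence (the script names are hypothetical):

    # Run each analysis script in order; stop immediately if any step fails.
    # The script names are hypothetical examples.
    import subprocess

    for script in ["01_preprocess.py", "02_fit_model.py", "03_make_figures.py"]:
        subprocess.run(["python", script], check=True)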

4. Install all libraries and dependencies via the environment screen, and not during runtime.

This guarantees long-term reproducibility. If packages are installed every time the code runs, we cannot guarantee that those commands will continue to execute successfully in the future. By contrast, packages installed via the built-in package managers, and commands run via the postInstall script, are executed just once, when the environment is built, and their results are cached into the environment. For published capsules, this means the installed libraries are also part of the Docker image available for download via export capsule. At runtime, your code should only import its dependencies (see the sketch below).

  • Note: dependencies should generally be downloaded into the environment rather than uploaded to the /code folder; if a dependency is no longer available online, however, uploading it is at your discretion. If necessary, please clarify for readers in your documentation which files are dependencies and which code is uniquely responsible for generating your results.
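As a sketch, a fail-fast check like the following (with hypothetical package names) can confirm that the environment was built with everything your code needs:

    # Import, never install, at runtime. The package names are hypothetical;
    # each is assumed to have been added via the environment screen.
    import importlib

    for package in ("numpy", "pandas"):
        importlib.import_module(package)  # fails fast if the environment lacks it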

5. Upload all necessary data files to the /data folder.

To ensure clarity for readers, and to simplify Git tracking in Code Ocean, data should not go in the /code folder. And because URLs and download syntax tend to change over time, data should not be downloaded during runtime either (see the sketch below).

  • Note: if you feel strongly about keeping large files somewhere other than /data, please add them to your .gitignore.
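Here is a minimal sketch of reading an input from /data rather than downloading it ("survey.csv" is a hypothetical filename):

    # Read data from /data; nothing is fetched over the network at runtime.
    # "survey.csv" is a hypothetical example.
    import csv

    with open("/data/survey.csv", newline="") as f:
        rows = list(csv.reader(f))
    print(f"Loaded {len(rows)} rows.")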

6. Upload source code rather than compiled binaries, and then compile binaries during runtime.

Reproducibility implies inspectability, and whenever possible, readers should be able to verify the inner workings of your algorithm or analysis.
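As a sketch, a capsule might compile an uploaded C source file at the start of its run and then execute the resulting binary (the file names are hypothetical, and gcc is assumed to be installed in the environment):

    # Compile the uploaded source at runtime, then run the resulting binary.
    # "solver.c" is a hypothetical name; gcc is assumed to be available.
    import subprocess

    subprocess.run(["gcc", "/code/solver.c", "-O2", "-o", "/tmp/solver"], check=True)
    subprocess.run(["/tmp/solver"], check=True)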

7. Provide sufficient metadata for widespread intelligibility, including:

  • Any appropriate tags;

  • A few lines from your abstract, or the entire thing, in the description pane;

  • Any information about an associated publication;

  • All affiliations for authors (use 'N/A' if none is available);

  • New licenses, if you wish to change the defaults (MIT for code, CC0 for data);

  • A representative image (hover over the language symbol in metadata -> 'Upload Image').

8. Delete unnecessary files.

e.g., logs, __pycache__ directories, temporary files, and empty files.
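A minimal Python sketch of this cleanup (it assumes your code lives in /code, per the capsule layout above):

    # Remove common clutter: __pycache__ directories and empty files.
    import pathlib
    import shutil

    for cache_dir in list(pathlib.Path("/code").rglob("__pycache__")):
        shutil.rmtree(cache_dir)
    for path in list(pathlib.Path("/code").rglob("*")):
        if path.is_file() and path.stat().st_size == 0:
            path.unlink()  # delete empty files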

Does Code Ocean verify that results in a capsule match those in a published article?

We do not; we are not performing peer review. Our verification process ensures that results are reproducible in the mechanical sense: a user presses the 'Run' button and gets results, and unless the user has made changes to the code, those results should be identical, or extremely close to identical, on each run.

Questions or comments?

We'd be happy to hear from you at support@codeocean.com.
