Wednesday, 4 February 2015

New year's digest on collaborative & reproducible research

This list is aggregated from public and private messages and from web browsing. Don't hesitate to send me links (with acknowledgment) via our public mailing list or LinkedIn group: !forum/collective-mind

=== Misc articles ===

* "Research Wranglers: Initiatives to Improve Reproducibility of Study Findings"

* Dennis McCafferty, "Should Code be Released?",
  Communications of the ACM, October 2010, Vol. 53, No. 10, DOI: 10.1145/1831407.1831415

* Chris Drummond, "Replicability is not Reproducibility: Nor is it Good Science"
  Proc. of the Evaluation Methods for Machine Learning Workshop
  at the 26th ICML, Montreal, Canada, 2009.
  Copyright: National Research Council of Canada

* Science is in a reproducibility crisis - how do we resolve it?

* My blog article on "Automatic performance tuning and reproducibility as a side effect"
  for the Software Sustainability Institute:

* Puzzling Measurement of "Big G" Gravitational Constant Ignites Debate

* White House takes notice of reproducibility in science, and wants your opinion

* Problems during performance benchmarking:

We also experienced many similar issues during our work on auto-tuning and machine learning:
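One recurring benchmarking issue of this kind is that a single timing measurement is unreliable: repeated runs of the same code fluctuate because of OS noise, frequency scaling, caches, and so on. A minimal sketch (a hypothetical illustration, not code from any of the articles above) of why benchmarks should report a distribution rather than one number:

```python
import time
import statistics

def workload():
    # A small, deterministic computation to time (placeholder workload).
    return sum(i * i for i in range(100_000))

# Collect many repetitions of the same measurement.
samples = []
for _ in range(30):
    start = time.perf_counter()
    workload()
    samples.append(time.perf_counter() - start)

# Report the distribution, not a single run: the spread between min and
# max is typically non-trivial, which is exactly what makes naive
# one-shot performance comparisons irreproducible.
print(f"min    = {min(samples):.6f} s")
print(f"median = {statistics.median(samples):.6f} s")
print(f"max    = {max(samples):.6f} s")
```

Reporting at least the minimum and median over many repetitions is a common mitigation; so is pinning CPU frequency and process affinity before measuring.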

* ACM SIGOPS Operating Systems Review - Special Issue on Repeatability
  and Sharing of Experimental Artifacts:

* Vinton G. Cerf. "Bit Rot: Long-Term Preservation of Digital Information" [Point of View]

* Less related though interesting (about citations):

=== Future events ===

* February 9, 2015, CGO/PPoPP joint session on artifact evaluation experience
  San Francisco, 17:15 - 17:35
  Grigori Fursin and Bruce Childers

* November 1-4, 2015, Dagstuhl Perspective Workshop
  "Artifact Evaluation for Publications"
  Bruce Childers, Shriram Krishnamurthi, Grigori Fursin, Andreas Zeller

* March 13-18, 2016, Dagstuhl Seminar 16111
  "Rethinking Experimental Methods in Computing"

=== Past events ===

* Oct 27-30, 2014, Washington DC, US
  "1st International Workshop on Collaborative Methodologies to Accelerate
  Scientific Knowledge Discovery in Big Data (CASK 2014)"

  In conjunction with 2014 IEEE International Conference on Big Data
  (IEEE BigData 2014)

* September 1, 2014: special journal issue on reproducible research methodologies
  in IEEE Transactions on Emerging Topics in Computing (TETC).

* January 2015:

  ACM SIGOPS Operating Systems Review
  Special Issue on Repeatability and Sharing
  of Experimental Artifacts

=== Journals/Conferences with reproducible articles ===
* IPOL Journal: Image Processing On Line

=== Tools ===
* NGS pipelines - integrates pipelines and user interfaces
  to help biologists analyse data output by biological
  applications such as RNAseq, sRNAseq, ChipSeq, BS-seq:

* Skoll: A Process and Infrastructure for Distributed, Continuous Quality Assurance

* NEPI: Simplifying network experimentation:

* rr (Mozilla project): records nondeterministic executions and replays them deterministically for debugging

* Burrito: Rethinking the Electronic Lab Notebook

* Collective Knowledge (cTuning v4): our tool and repository to simplify code and data sharing as reusable components (for collaborative and reproducible R&D):

=== Online workflows ===

* RunMyCode:

* AptLab:

=== Projects ===
* OpenLab:

* EU Recode project

* CERN: opendata

* Research Data Alliance:

* Open Data Institute:

=== Online archives/repos ===

* Olive Archive (preserving executable content):


* OpenAire (CERN)

* Zenodo:

* ResearchCompendia:

* Internet Archive:

* The National Archives:

* WikiData:

* The Digital Preservation Network:

* Open datasets:

* DataHub:

* DataCite: assigns DOIs for citing datasets (based in Germany, with connections to CERN)

* CrossRef:

* International DOI Foundation
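The DOI services listed above (DataCite, CrossRef, the International DOI Foundation) all rely on the same resolution mechanism: a bare DOI becomes a resolvable link through the standard doi.org proxy. A small illustrative sketch (the helper function is hypothetical; the DOI is the one from the CACM article cited earlier in this digest):

```python
def doi_to_url(doi: str) -> str:
    """Turn a bare DOI into a resolvable URL via the standard doi.org proxy."""
    return f"https://doi.org/{doi}"

# DOI of the McCafferty CACM article cited in the "Misc articles" section.
print(doi_to_url("10.1145/1831407.1831415"))
```

The same scheme works for dataset DOIs minted by DataCite, which is what makes data citable in the same way as articles.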

* Our new pilot Collective Knowledge repository:
