Monday, 10 February 2014
As a continuing effort to validate experimental results with the community and to test the Collective Mind repository, we validated two papers from the ADAPT'14 workshop with the great help of volunteers (Alberto Magni from the University of Edinburgh, UK, and Sascha Hunold from Vienna University of Technology, Austria) and shared the results here:
Finally, I found some time to add all slides from the ADAPT'14 panel on reproducible research methodologies and new publication models here: http://adapt-workshop.org/program.htm
I was very glad to see many participants and lively discussions on how to:
- improve the publication reviewing process by involving more reviewers, sharing all related material for reproducibility, developing standards for fair and statistical evaluation, etc.
- involve the ACM and encourage conferences/journals to promote sharing of related research material and validation of experimental results
- validate and rank shared research artifacts
- encourage validation and public implementation of already published techniques
- develop a common repository for shared material including benchmarks, data sets, tools, models, etc.
- encourage companies to share tools and data
Though there is not yet a unified approach to the above problems, I was very glad to see such active participation. To continue and systematize these discussions, I strongly encourage you to submit (short position) papers on these topics to our 1st ACM SIGPLAN TRUST'14 workshop, which will be co-located with PLDI'14 (June 12, Edinburgh, UK).