WSSSPE working groups: Reproducibility, Reuse, and Sharing (Neil Chue Hong)
Our group focused on journal policies regarding software papers. Our objectives were:
Survey the journals that publish software papers. (The Software Sustainability Institute already maintains a list.)
Summarize the policies each of these journals has in place regarding software papers (e.g. licensing requirements, repository requirements, required manuscript sections on installation or tests).
Develop a five-star rating system for ranking these policies (see the sketch after this list).
Apply the rating system to each journal.
Solicit feedback and iterate.
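Purely as an illustration of the rubric idea, here is a minimal Python sketch of how a five-star scoring could work. The criteria names and the one-star-per-criterion rule are hypothetical stand-ins, not the rubric our group actually drafted:

```python
# Hypothetical sketch of a five-star policy rubric. The criteria below are
# illustrative placeholders, not the working group's actual rubric.

CRITERIA = [
    "requires an open source license",
    "requires deposit in a public repository",
    "requires installation instructions in the manuscript",
    "requires automated tests or a test description",
    "requires a persistent identifier (e.g. DOI) for the software",
]

def star_rating(policy: dict) -> int:
    """Count how many of the five criteria a journal's policy satisfies.

    `policy` maps each criterion string to True/False.
    """
    return sum(1 for criterion in CRITERIA if policy.get(criterion, False))

# Example: a journal meeting three of the five criteria earns three stars.
example_policy = {
    "requires an open source license": True,
    "requires deposit in a public repository": True,
    "requires installation instructions in the manuscript": False,
    "requires automated tests or a test description": False,
    "requires a persistent identifier (e.g. DOI) for the software": True,
}
print(star_rating(example_policy))  # -> 3
```

A flat one-point-per-criterion scheme like this is the simplest possible design; a real rubric might weight criteria differently or require minimum thresholds.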
We got about halfway through this for some of the most recognized journals on the list; see the Google Doc notes.
Feedback for WSSSPE:
WSSSPE’s conference-proceedings model of submitting short papers that get five very thorough expert reviews ahead of time is excellent. This is not common practice in my field, so this was my first time participating in such a model. Not only did I benefit from the chance to write up our piece ahead of time and get expert feedback from people with a broader range of backgrounds than I usually interact with, but being able to read the full papers of other attendees before the workshop, not just their abstracts, was an invaluable way to learn more, make the most of the time we had, and keep a record.
A full-day workshop is a big travel commitment (travel costs, two nights’ lodging, and most of the preceding and following days) yet offers little time to meet people, share ideas, and start working toward any actual products.
The format proposed at the end of the session, which seemed most popular in the show of hands for future WSSSPEs, would address most of my criticisms: a two- to three-day event uncoupled from Supercomputing, based in a US city that is easy to fly into, with more time to move ideas forward into products using a small-group/hackathon model.
Misc notes/discussions
Interesting discussion and ideas for tracking software usage based on updating patterns, from James Horowitz and company: Heartbeat (pdf).
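As I understood the idea, an ordinary update check can double as an anonymous usage ping, so aggregate update requests approximate the number of active installations. Here is a minimal client-side sketch with a hypothetical endpoint and response format; the actual Heartbeat proposal may differ in its details:

```python
# Minimal sketch of a "heartbeat" update check that doubles as a usage ping.
# The URL, version string, and JSON field are hypothetical examples.

import json
import urllib.request

UPDATE_URL = "https://example.org/myproject/latest-version"  # hypothetical
CURRENT_VERSION = "1.4.2"

def check_for_update() -> bool:
    """Ask the project server for the latest release version.

    Each request is also an anonymous signal that one installation is
    still alive, so aggregate request counts approximate active usage.
    """
    request = urllib.request.Request(
        UPDATE_URL,
        headers={"User-Agent": f"myproject/{CURRENT_VERSION}"},
    )
    with urllib.request.urlopen(request, timeout=5) as response:
        latest = json.load(response).get("version", CURRENT_VERSION)
    return latest != CURRENT_VERSION
```

The appeal is that this piggybacks on infrastructure most projects already run, rather than requiring users to register or report usage separately.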
Neil mentioned a similar workshop he had recently taken part in that created a reviewer’s oath, recently submitted as an opinion piece to F1000. It certainly offers more guidance than most journals do, if it is a bit pedantic at times. (For instance, as much as I believe in signing my own reviews, I would not recommend it to someone else as a blanket policy in the same vein as basic ethics like acknowledging what I don’t know; I think the ‘Oath’ needs to treat this with greater nuance.) Anyway, food for thought.
(I didn’t manage to catch much on Twitter this time; I guess too much was happening in the in-person discussions.)