systemml-dev mailing list archives

From Deron Eriksson <>
Subject Formalize a release candidate review process?
Date Sat, 21 May 2016 16:49:21 GMT

It might be nice to formalize what needs to be done when reviewing a
release candidate. I don't mean this as something that would add
bureaucracy and slow us down. Rather, it would be nice to have
something as simple as a basic checklist of items that we could volunteer
to check. That way we could avoid duplicating effort, which would speed
us up, and avoid missing critical checks, which would help validate the
integrity of our releases.

Some potential items to check:
1) The entire test suite passes on OS X, Windows, and Linux.
2) All artifacts and accompanying checksums are present.
3) All artifacts containing SystemML classes can execute a 'hello world' program.
4) LICENSE and NOTICE files for all the artifacts have been checked.
5) SystemML runs algorithms locally in standalone single-node mode.
6) SystemML runs algorithms on local Hadoop (hadoop jar ...).
7) SystemML runs algorithms on local Spark (spark-submit ...).
8) SystemML runs algorithms on a Hadoop cluster.
9) SystemML runs algorithms on a Spark cluster.
10) The SystemML performance suite has been run on a Hadoop cluster.
11) The SystemML performance suite has been run on a Spark cluster.
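To make a couple of these concrete, here is a rough sketch of what the
checksum check and the 'hello world' runs might look like from a shell.
The artifact name (systemml-x.y.z.zip) is a placeholder, not an actual
release file name:

```shell
# Verify a downloaded artifact against its published SHA-512 checksum.
# (systemml-x.y.z.zip stands in for the real artifact name.)
sha512sum -c systemml-x.y.z.zip.sha512

# Create a minimal 'hello world' DML script.
echo 'print("hello world");' > hello.dml

# Standalone single-node mode (script shipped in the standalone distribution).
./runStandaloneSystemML.sh hello.dml

# Local Hadoop.
hadoop jar SystemML.jar -f hello.dml

# Local Spark.
spark-submit SystemML.jar -f hello.dml
```

A checklist entry could then just record which artifact, platform, and
mode each volunteer exercised.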

Would this be too many things to check or too few? Are there any critical
items missing?
