Hi, Trevor and I were able to get the latest version of Mahout to compile
with Spark 2.2.  The main tweaks were that Spark 2.2 requires Java 8 and
Hadoop 2.6 or greater.  The issue is that we have a hadoop2 profile that
sets hadoop.version=2.4.1 and loads dependencies that are only compatible
with Spark 2.1 and below.  I would like to propose removing the hadoop2
profile and instead baking the Hadoop version and its dependencies into
each Spark profile.  I wanted to run that by the community before I went
too far with it, and to get feedback on whether there is a better
alternative.  Trevor, can you weigh in if I missed something?
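For concreteness, here is a rough sketch of what the proposal might look like in the pom; the profile ids, version numbers, and artifact names below are illustrative assumptions, not copied from Mahout's actual pom.xml:

```xml
<!-- Hypothetical sketch: each Spark profile pins its own Hadoop version
     and dependencies, replacing the shared hadoop2 profile. Ids and
     versions are illustrative only. -->
<profiles>
  <profile>
    <id>spark-2.2</id>
    <properties>
      <spark.version>2.2.0</spark.version>
      <!-- Spark 2.2 requires Hadoop 2.6 or greater -->
      <hadoop.version>2.6.5</hadoop.version>
    </properties>
    <dependencies>
      <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-client</artifactId>
        <version>${hadoop.version}</version>
      </dependency>
    </dependencies>
  </profile>
  <profile>
    <id>spark-2.1</id>
    <properties>
      <spark.version>2.1.1</spark.version>
      <!-- Spark 2.1 and below can stay on Hadoop 2.4.x -->
      <hadoop.version>2.4.1</hadoop.version>
    </properties>
  </profile>
</profiles>
```

With something like this, selecting a Spark profile (e.g. -Pspark-2.2) would pull in a compatible Hadoop automatically, so there is no separate hadoop2 profile to keep in sync.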