spark-user mailing list archives

From freedafeng <>
Subject correct/best way to install custom spark1.2 on cdh5.3.0?
Date Thu, 08 Jan 2015 23:01:03 GMT
Could anyone share their experience with how to do this?

I have created a cluster and installed CDH 5.3.0 on it with basically Core +
HBase, but Cloudera installed and configured Spark through its parcels anyway.
I'd like to install our custom Spark on this cluster so it can use the Hadoop
and HBase services already running there. There could be conflicts if this is
not done correctly; library conflicts are what I worry about most.

I understand this is a special case, but if you know how to do it, please
let me know. Thanks.
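For what it's worth, one common approach is to build the custom Spark against the CDH Hadoop version and run it side by side with the parcel, pointing it at the cluster's existing configuration. The sketch below assumes CDH 5.3.0's Hadoop version string (2.5.0-cdh5.3.0), the default CDH config path /etc/hadoop/conf, and an install location of /opt/custom-spark, all of which may differ on your cluster:

```shell
# Sketch only: build custom Spark 1.2 against the CDH Hadoop client libraries
# so the HDFS/YARN jars match the cluster (version string is an assumption).
./make-distribution.sh --tgz -Phadoop-2.4 -Dhadoop.version=2.5.0-cdh5.3.0

# Unpack somewhere outside the parcel-managed directories to avoid
# clobbering the Cloudera-installed Spark:
tar -xzf spark-1.2.0-bin-2.5.0-cdh5.3.0.tgz -C /opt/custom-spark

# Point the custom Spark at the cluster's existing Hadoop config rather
# than shipping its own, so it reuses the CDH HDFS/YARN services:
export HADOOP_CONF_DIR=/etc/hadoop/conf
export SPARK_HOME=/opt/custom-spark/spark-1.2.0-bin-2.5.0-cdh5.3.0

# Submit through YARN; HBase jars would still need to go on the
# application classpath (e.g. via --jars):
$SPARK_HOME/bin/spark-submit --master yarn-client --class MyApp my-app.jar
```

Keeping the custom build in its own directory and driving everything through SPARK_HOME is what avoids most of the library-conflict risk, since the parcel Spark and the custom Spark never share a classpath.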
