I have been wanting to try out the Hive, Hadoop, and Cassandra combination for a long time, and nothing comes closer to it than Brisk. So, for the record, I have started a demo cluster on AWS using the Brisk AMIs and have followed the tutorial (http://www.datastax.com/docs/0.8/brisk/install_brisk_ami) very closely, including launching 6 instances. However, things haven't gone to plan.
My nodetool ring output:
Address Status State Load Owns Token
10.170.175.47 Up Normal 11.13 KB 16.67% 0
10.171.53.68 Up Normal 8.92 KB 16.67% 28356863910078205288614550619314017621
10.170.227.11 Up Normal 8.92 KB 16.67% 56713727820156410577229101238628035242
10.171.51.195 Up Normal 8.92 KB 16.67% 85070591730234615865843651857942052863
10.171.38.197 Up Normal 8.92 KB 16.67% 113427455640312821154458202477256070484
10.170.123.20 Up Normal 8.92 KB 16.67% 141784319550391026443072753096570088105
- seeds: "10.170.175.47,10.171.51.195"
# note it only lists two IPs, whereas six nodes show up in the nodetool output
# note there is no fixed IP in here; I have tried replacing this with the line below, but with no result
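For context, in Cassandra 0.8 the seed list sits under seed_provider in cassandra.yaml rather than on its own; a minimal sketch of how I believe that block is meant to look with this cluster's two seed IPs (the surrounding keys come from the stock 0.8 config, not from my file, so treat this as an assumption):

```yaml
# cassandra.yaml (Cassandra 0.8): seeds are parameters of the SimpleSeedProvider
seed_provider:
    - class_name: org.apache.cassandra.locator.SimpleSeedProvider
      parameters:
          - seeds: "10.170.175.47,10.171.51.195"
```

Not every node has to be a seed; the non-seed nodes are expected to discover the rest of the ring through gossip once they contact a seed.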
Finally, running $ brisk/bin/brisktool jobtracker gives:
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/ubuntu/brisk/resources/cassandra/lib/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/ubuntu/brisk/resources/hadoop/lib/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
No jobtracker found
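If I understand the Brisk docs correctly, "No jobtracker found" usually means no node in the ring was started in analytics (Hadoop) mode; the job/task trackers are only enabled when Brisk is launched with the -t flag. A sketch of the commands I believe are relevant (environment-specific, shown for illustration only):

```shell
# Start Brisk with the Hadoop JobTracker/TaskTracker enabled (analytics mode).
# Without -t the node runs as a plain Cassandra node and registers no jobtracker.
brisk/bin/brisk cassandra -t

# Once an analytics node is up, this should print the elected jobtracker's
# address instead of "No jobtracker found".
brisk/bin/brisktool jobtracker
```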
I get the same error when I start Hive; nothing seems to work!