Hadoop Jobs

11 jobs were found based on your criteria

  • Hourly – Less than 1 week – Less than 10 hrs/week – Posted
    We need a Big Data consultant to run a hands-on training workshop on 25th April in Santa Clara, CA. Topics covered would range from Big Data basics to Hadoop, Spark, Shark, etc. Workshop material would be provided. Only applicants who can be present in Santa Clara on 25th April should apply.
  • Fixed-Price – Est. Budget: $3,000.00 Posted
    I am looking for strong candidates in Java and web technologies who have excellent communication and interviewing skills. This person won't do any programming or design work; instead, they will prepare the best interviewing framework. They will monitor job candidates in the indeed.com database, select some of them, and interview them. They need to return detailed interview results: text and audio/video (if the candidate approves recording of their voice or video). You should try to find the best candidates ...
  • Hourly – Less than 1 month – Less than 10 hrs/week – Posted
    Hi, I have Hadoop experience. I am looking for someone to teach me Oozie. This includes writing Oozie workflows that use Sqoop, Hive, Pig, and HDFS; there is no need to teach me HDFS, Hive, Pig, or Sqoop themselves. I am looking for just the Oozie part: forks, joins, and the various control actions in Oozie.
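The fork/join pattern this poster asks about might look like the following minimal `workflow.xml` sketch. The action names, Hive script, Sqoop command, and `${jobTracker}`/`${nameNode}` properties are illustrative placeholders, not taken from the job post:

```xml
<!-- Hypothetical workflow.xml: a minimal Oozie fork/join sketch. -->
<workflow-app name="fork-join-demo" xmlns="uri:oozie:workflow:0.4">
    <start to="split"/>

    <!-- A fork launches its paths in parallel. -->
    <fork name="split">
        <path start="import-data"/>
        <path start="hive-report"/>
    </fork>

    <!-- Sqoop import on one branch (placeholder command). -->
    <action name="import-data">
        <sqoop xmlns="uri:oozie:sqoop-action:0.2">
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
            <command>import --connect ${dbUrl} --table events --target-dir /data/events</command>
        </sqoop>
        <ok to="merge"/>
        <error to="fail"/>
    </action>

    <!-- Hive query on the other branch (placeholder script). -->
    <action name="hive-report">
        <hive xmlns="uri:oozie:hive-action:0.2">
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
            <script>report.q</script>
        </hive>
        <ok to="merge"/>
        <error to="fail"/>
    </action>

    <!-- The join waits for all forked branches before continuing. -->
    <join name="merge" to="end"/>

    <kill name="fail">
        <message>Workflow failed: [${wf:errorMessage(wf:lastErrorNode())}]</message>
    </kill>
    <end name="end"/>
</workflow-app>
```

Every fork must be balanced by a join: all `<ok>` transitions of the forked actions point at the same join node, which only proceeds once every branch has succeeded.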
  • Hourly – More than 6 months – 10-30 hrs/week – Posted
    Quantum Strides LLC is looking for Big Data trainers to teach online classes virtually, on an ongoing basis. Hours are flexible, ranging from 2 to 6 per day. So even if you are employed and have a few hours to spare in the evening, this opportunity will let you earn additional income from the comfort of your home. Suitable for individuals with a passion for mentoring and teaching. To qualify you must have the following experience: - At least ...
  • Hourly – 1 to 3 months – 10-30 hrs/week – Posted
    We are building a next-generation telecom product. For this product we are looking for experts in the big data area, specifically Cassandra, to explore the possibilities of data management as well as storage. If you are an expert in this area, please contact us. This could be your opportunity to associate yourself with a next-generation telecom product built on big data. This is a really hot field these days. We are a United States-based company with a solid team on board to ...
  • Hourly – More than 6 months – 30+ hrs/week – Posted
    I'm looking for big data engineers to build out our data infrastructure, already composed of Kafka, Camus (LinkedIn's tool for streaming data from Kafka into Hadoop), and Hadoop. The infrastructure has only just started: we managed to set up this entire environment in a matter of two weeks, but we understand that we need more manpower to support all the services we are planning to build. Some details about the company: it has traffic of 30k rps at peak and ...
  • Hourly – 3 to 6 months – 10-30 hrs/week – Posted
    Looking for engineers with the following skill set. The position requires building automation code and scripts to run a set of Hive queries on an EMR cluster for processing data. It also involves running and monitoring queries on the cluster. - Good understanding of Hadoop - Good understanding of Hive - Java and scripting languages such as Python/Ruby - Knowledge of EMR is a plus
  • Hourly – 3 to 6 months – 10-30 hrs/week – Posted
    I am looking for Hadoop and NoSQL experts with strong knowledge of Hadoop and of NoSQL databases such as Cassandra, MongoDB, HBase, etc. You should have sound knowledge of Hive, Pig, and Jaql, plus technical, hands-on experience designing, building, installing, configuring, and supporting Hadoop in a Linux environment. Hands-on Hadoop database tuning and troubleshooting experience is required.
  • Hourly – 3 to 6 months – 30+ hrs/week – Posted
    Using Cascading and Oozie, assist us in processing data in our AWS environment, which uses Accumulo, Hadoop, and MapReduce.
  • Hourly – More than 6 months – 30+ hrs/week – Posted
    We have a new project in the emerging field of Deep Learning. This is a long-term project. You should have strong skills in one or more of the following areas: - Big Data (Hadoop) - Graph Databases (Neo4j) - Deep Neural Networks (Matlab) - Amazon Web Services (AWS). Ideally you should also have knowledge of Intelligent Transport Systems, or an ability to learn about them. You will have good skills in Matlab, and understand the principles of using data to train a neural ...