Hadoop Jobs

20 jobs were found based on your criteria.

  • Hourly – More than 6 months – 10-30 hrs/week – Posted
    Should have good working experience with the Hadoop ecosystem. Should be willing to work on a remote machine. Should be familiar with Java. Interested candidates, please reply with your Skype ID.
  • Hourly – Less than 1 month – 10-30 hrs/week – Posted
    We are looking for a resource to help with capacity and layout planning of a Hadoop cluster. This will include deciding between a physical or VPS setup and understanding the amount of network bandwidth required. We will also require help with configuration and maintenance of the system. This will include hooking up Postgres via Sqoop and using tools like ZooKeeper and HBase (see the HBase client sketch after the listings). Finally, we will eventually be implementing functionality using tools like Cascading or Hive.
  • Hourly – Less than 1 month – Less than 10 hrs/week – Posted
    Hi, I have Hadoop experience. I am looking for someone to teach me Oozie. This includes writing Oozie workflows that involve Sqoop, Hive, Pig, and HDFS; there is no need to teach me HDFS, Hive, Pig, or Sqoop themselves. I am looking for just the Oozie part: forks, joins, and the various control actions in Oozie (see the Oozie submission sketch after the listings).
  • Hourly – Less than 1 week – 10-30 hrs/week – Posted
    I am looking to crawl about 15-20k sites, all investment funds and investment banks. I would like to gather all their historical transaction data from the news/press release sections of their sites. With this data we would want to create visuals to understand trends and patterns in a variety of ways. Then I would like to scrape these sites daily to start creating a feed of transactions as they are announced.
  • Hourly – More than 6 months – 30+ hrs/week – Posted
    A new team is being built in Dublin, Ireland, to create a new distribution platform for the travel industry. It is a full-time, permanent role at an already very successful business that wants to stay ahead of the curve and build new platforms for its clients, based on the most modern technologies out there.
  • Hourly – More than 6 months – 10-30 hrs/week – Posted
    Quantum Strides LLC is looking for Big Data Trainers to teach online classes on an ongoing basis. Hours are flexible, ranging from 2 to 6 per day. So even if you are employed and have a few hours to spare in the evening, this opportunity will allow you to earn additional income from the comfort of your home. Suitable for individuals with a passion for mentoring and teaching. To qualify, you must have the following experience: - At least ...
  • Hourly – 1 to 3 months – 10-30 hrs/week – Posted
    We are building a next-generation telecom product. For this product, we are looking for experts in the big data area, specifically Cassandra, to explore possibilities for data management as well as storage. If you are an expert in this area, please contact us. This could be your opportunity to associate yourself with a next-generation telecom product built on big data. This is a really hot field these days. We are a United States-based company with a solid team on board to ...
  • Hourly – 1 to 3 months – Less than 10 hrs/week – Posted
    This project has several components: #1) create an integrated feed consisting of IMAP/POP email (Gmail, Yahoo, iCloud, Outlook), social feeds (Facebook, Twitter, LinkedIn, Instagram, Pinterest), app notifications, texts, IMs, and phone records/voicemails; #2) provide a method for the user to link incoming items across different channels to the same communicator; #3) provide a default mechanism to respond to any incoming communication based on the form of the message, or alternatively via any other form available through #2; #4) provide a method for calculating users ...
  • Hourly – More than 6 months – 30+ hrs/week – Posted
    Looking for a Java + Hadoop developer for a long-term, full-time (40 hours a week) project. Remote development. Requirements: 4+ years of JEE development; 2+ years of experience with Hadoop. Please apply with a CV that describes your Java and Hadoop projects. Thank you!
  • Hourly – More than 6 months – 30+ hrs/week – Posted
    I'm looking for big data engineers to build out our data infrastructure, already composed of Kafka, Camus (LinkedIn's tool for loading data from Kafka into Hadoop; see the Kafka consumer sketch after the listings), and Hadoop. The infrastructure effort has just started and we managed to set up this entire environment in a matter of two weeks, but we understand that we need more manpower to support all the services we are planning to build. Some details about the company: it has peak traffic of 30k rps and ...
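
For the cluster-planning listing above, a minimal sketch of what reading and writing HBase looks like from Java with the standard HBase client API; the ZooKeeper quorum host, table name, and column family are hypothetical placeholders, not details from the posting.

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.TableName;
    import org.apache.hadoop.hbase.client.*;
    import org.apache.hadoop.hbase.util.Bytes;

    public class HBaseSketch {
        public static void main(String[] args) throws Exception {
            // HBase locates the cluster via ZooKeeper; the quorum host is a placeholder.
            Configuration conf = HBaseConfiguration.create();
            conf.set("hbase.zookeeper.quorum", "zk-host.example.com");

            try (Connection conn = ConnectionFactory.createConnection(conf);
                 Table table = conn.getTable(TableName.valueOf("transactions"))) {
                // Write one cell: row key "row-001", column family "d", qualifier "amount".
                Put put = new Put(Bytes.toBytes("row-001"));
                put.addColumn(Bytes.toBytes("d"), Bytes.toBytes("amount"), Bytes.toBytes("42.50"));
                table.put(put);

                // Read the same cell back and print it.
                Result result = table.get(new Get(Bytes.toBytes("row-001")));
                System.out.println(Bytes.toString(
                        result.getValue(Bytes.toBytes("d"), Bytes.toBytes("amount"))));
            }
        }
    }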
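
For the Oozie listing above, the workflow definition itself is an XML document whose fork, join, and decision control nodes split and rejoin execution paths. As a companion, a minimal sketch of submitting such a workflow with the standard Oozie Java client; the Oozie URL, HDFS application path, and user name are hypothetical placeholders.

    import java.util.Properties;
    import org.apache.oozie.client.OozieClient;
    import org.apache.oozie.client.WorkflowJob;

    public class OozieSubmitSketch {
        public static void main(String[] args) throws Exception {
            // Point the client at the Oozie server (placeholder URL).
            OozieClient oozie = new OozieClient("http://oozie-host.example.com:11000/oozie");

            // Job properties; APP_PATH points at the HDFS directory holding workflow.xml,
            // which is where the fork/join/decision control nodes are declared.
            Properties props = oozie.createConfiguration();
            props.setProperty(OozieClient.APP_PATH, "hdfs://namenode/user/demo/apps/etl-workflow");
            props.setProperty(OozieClient.USER_NAME, "demo");

            // Submit and start the workflow, then check its status once.
            String jobId = oozie.run(props);
            WorkflowJob job = oozie.getJobInfo(jobId);
            System.out.println(jobId + " -> " + job.getStatus());
        }
    }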
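
For the Kafka/Camus listing above, Camus itself is a MapReduce job that batch-loads Kafka topics into HDFS. As a rough illustration of the consumption side only, a minimal sketch using the standard Kafka Java consumer; the broker address, group id, and topic name are hypothetical placeholders.

    import java.time.Duration;
    import java.util.Collections;
    import java.util.Properties;
    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.clients.consumer.ConsumerRecords;
    import org.apache.kafka.clients.consumer.KafkaConsumer;

    public class KafkaReadSketch {
        public static void main(String[] args) {
            // Consumer configuration; all values here are placeholders.
            Properties props = new Properties();
            props.put("bootstrap.servers", "kafka-host.example.com:9092");
            props.put("group.id", "hadoop-ingest");
            props.put("key.deserializer",
                      "org.apache.kafka.common.serialization.StringDeserializer");
            props.put("value.deserializer",
                      "org.apache.kafka.common.serialization.StringDeserializer");

            try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
                consumer.subscribe(Collections.singletonList("events"));
                // Poll once and print what came back; a real ingest pipeline would
                // batch these records and write them to HDFS instead.
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
                for (ConsumerRecord<String, String> record : records) {
                    System.out.println(record.offset() + ": " + record.value());
                }
            }
        }
    }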