Hadoop Jobs

46 jobs were found based on your criteria

  • Hourly – Less than 1 month – Less than 10 hrs/week – Posted
    Hello, we need to properly set up the server side for Snowplow Analytics so it is ready to be used to track website events. The analytics tool requires multiple Amazon services. Snowplow is the open-source, web-scale analytics platform powered by Hadoop, Hive and Redshift. ---------------------------------------------------------- Technical specification ---------------------------------------------------------- Find helpful information about the architecture and necessary setup here: https://github.com/snowplow/snowplow http://snowplowanalytics.com/technology/index.html http://www.slideshare.net/alexanderdean/introduction-to-snowplow-big-data-data-science-israel?next_slideshow=1 ---------------------------------------------------------- What I need from you if we hire ...
  • Fixed-Price – Est. Budget: $10.00 Posted
    Develop a script, deploy.sh, which downloads hadoop-2.6.0.tar.gz and deploys Hadoop on Ubuntu 14.04: 1) HDFS files should be in a separate directory, /app/hadoop 2) Hadoop and HDFS should run from the dedicated account hduser:hadoop 3) YARN is needed 4*) Hadoop should use 64-bit native binaries
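For context, a minimal sketch of what such a deploy.sh could look like is below. The /app/hadoop layout, the hduser:hadoop account and the hadoop-2.6.0.tar.gz download come from the posting; the mirror URL, the /usr/local/hadoop prefix and the single-node XML configuration are assumptions, not requirements stated in it.

```bash
#!/usr/bin/env bash
# deploy.sh -- sketch of the deployment described above for Hadoop 2.6.0 on Ubuntu 14.04.
# Requirement 4 (64-bit native libraries) would need a rebuilt libhadoop and is not
# covered; JAVA_HOME and passwordless SSH for hduser are assumed to be set up already.
set -euo pipefail

HADOOP_VERSION=2.6.0
TARBALL="hadoop-${HADOOP_VERSION}.tar.gz"
MIRROR="https://archive.apache.org/dist/hadoop/common/hadoop-${HADOOP_VERSION}"
INSTALL_DIR=/usr/local/hadoop     # assumed install prefix
HDFS_DIR=/app/hadoop              # 1) HDFS files kept in a separate directory

# 2) dedicated account hduser in group hadoop
addgroup hadoop || true
adduser --ingroup hadoop --disabled-password --gecos "" hduser || true

# download and unpack the release
wget -q "${MIRROR}/${TARBALL}"
tar -xzf "${TARBALL}"
mv "hadoop-${HADOOP_VERSION}" "${INSTALL_DIR}"

# HDFS name/data directories owned by hduser
mkdir -p "${HDFS_DIR}/namenode" "${HDFS_DIR}/datanode"
chown -R hduser:hadoop "${HDFS_DIR}" "${INSTALL_DIR}"

# single-node configuration pointing HDFS at /app/hadoop
cat > "${INSTALL_DIR}/etc/hadoop/core-site.xml" <<'EOF'
<configuration>
  <property><name>fs.defaultFS</name><value>hdfs://localhost:9000</value></property>
</configuration>
EOF
cat > "${INSTALL_DIR}/etc/hadoop/hdfs-site.xml" <<'EOF'
<configuration>
  <property><name>dfs.namenode.name.dir</name><value>file:///app/hadoop/namenode</value></property>
  <property><name>dfs.datanode.data.dir</name><value>file:///app/hadoop/datanode</value></property>
</configuration>
EOF

# format HDFS once, then 3) start HDFS and YARN as hduser
su - hduser -c "${INSTALL_DIR}/bin/hdfs namenode -format -nonInteractive"
su - hduser -c "${INSTALL_DIR}/sbin/start-dfs.sh && ${INSTALL_DIR}/sbin/start-yarn.sh"
```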
  • Hourly – 1 to 3 months – 30+ hrs/week – Posted
    We are seeking to hire a server/API developer to (1) complete the build of a long-running m-commerce API project and (2) begin the development of a new Big Data project. These two projects include the development of: Big Data relevance algorithms based on fashion insider insight; lightning-fast prediction algorithms requiring 50 million in-memory pre-calculations per country (consumer launch planned for 29 countries); SQL and NoSQL data manipulation and migration; a world first for the fashion market; a public ...
  • Hourly – Less than 1 month – Less than 10 hrs/week – Posted
    Hi, I am looking for a Hadoop MapReduce expert to help write interview questions. Questions are multiple choice (each must have 4 possible options).
  • Hourly – Less than 1 month – Less than 10 hrs/week – Posted
    I need a Hadoop professional to help me configure and set up a testing environment on Google's App Engine. I need someone who knows how to configure Ubuntu / Hadoop. I am looking for a Hadoop multi-node cluster setup. If you have not configured this before, please do not apply.
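For reference, the master-side steps of a basic multi-node HDFS/YARN setup usually look roughly like the sketch below. It assumes Hadoop 2.x is already unpacked under /usr/local/hadoop on every node, the master's core-site.xml and yarn-site.xml already point at the master host, and the hostnames worker1/worker2 resolve everywhere; all names are illustrative, and the Google Cloud provisioning side of the posting is not covered.

```bash
HADOOP_HOME=/usr/local/hadoop

# passwordless SSH from the master to each worker (required by the start scripts)
ssh-keygen -t rsa -N "" -f ~/.ssh/id_rsa
for host in worker1 worker2; do ssh-copy-id "$host"; done

# tell HDFS/YARN which hosts are workers (the file is named "slaves" in Hadoop 2.x)
printf "worker1\nworker2\n" > "${HADOOP_HOME}/etc/hadoop/slaves"

# distribute the master's configuration to every worker so all nodes agree
for host in worker1 worker2; do
  scp -r "${HADOOP_HOME}/etc/hadoop" "$host:${HADOOP_HOME}/etc/"
done

# format the NameNode once, bring the daemons up, and confirm the DataNodes registered
"${HADOOP_HOME}/bin/hdfs" namenode -format -nonInteractive
"${HADOOP_HOME}/sbin/start-dfs.sh"
"${HADOOP_HOME}/sbin/start-yarn.sh"
"${HADOOP_HOME}/bin/hdfs" dfsadmin -report
```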
  • Hourly – 1 to 3 months – 30+ hrs/week – Posted
    We need a chat bot that does three things: 1) Recognizes topics in a user's chat window 2) Links those topics to knowledge base articles 3) Responds to users intelligently (when there is a match). This needs to be built in Java and should preferably leverage BigML or Weka for machine learning from historical chats.
  • Fixed-Price – Est. Budget: $5.00 Posted
    I have the following HDFS directory structure: +user ++df +++crawldirectory ++++segments. Inside the segments directory are multiple directories named like 20141102230503. Inside each of these directories, if there is a file named _SUCCESS, I need to delete that directory (e.g. 20141102230503). I will be checking this in an hourly crontab. I need a script which uses HDFS commands to do the deletion. The directory structure is depicted again below to explain: +user ++df +++crawldirectory ++++segments +++++20141102230503 ++++++_SUCCESS (file) +++++20141103032250 ++++++_SUCCESS (file) +++++20141103073030 ...
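A minimal sketch of the requested cleanup is below. The /user/df/crawldirectory/segments path and the _SUCCESS marker come from the posting; using `hdfs dfs` (rather than the older `hadoop fs`) and the -skipTrash flag are assumptions.

```bash
#!/usr/bin/env bash
# Hourly cleanup sketch: remove every segment directory under
# /user/df/crawldirectory/segments that contains a _SUCCESS marker.
set -eu

SEGMENTS=/user/df/crawldirectory/segments

# list only the directory entries under segments/ (their permission string starts with 'd')
hdfs dfs -ls "${SEGMENTS}" | grep '^d' | awk '{print $NF}' | while read -r dir; do
  # -test -e exits 0 only if the _SUCCESS marker exists inside this segment
  if hdfs dfs -test -e "${dir}/_SUCCESS"; then
    hdfs dfs -rm -r -skipTrash "${dir}"
  fi
done
```

An hourly crontab entry along the lines of `0 * * * * /home/df/clean_segments.sh` (path hypothetical) would then drive it.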
  • Hourly – Less than 1 week – 30+ hrs/week – Posted
    1. Export Image from Rackspace Cloud Server according to: https://community.rackspace.com/products/f/25/t/3583 2. Set up Hadoop Clusters on Google Compute Engine according to: https://cloud.google.com/developers/articles/managing-hadoop-clusters-on-google-compute-engine/ and https://cloud.google.com/compute/docs/ 3. Transfer Image 4. Delete Rackspace Servers
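In shell terms, steps 2 and 3 would look roughly like the sketch below, assuming the gcloud and gsutil CLIs are configured and the Rackspace image has already been exported to image.tar.gz per the linked Rackspace article; bucket, image and cluster names are placeholders.

```bash
# 3) upload the exported image to Cloud Storage and register it as a GCE image
gsutil mb gs://migration-bucket
gsutil cp image.tar.gz gs://migration-bucket/
gcloud compute images create migrated-image \
    --source-uri gs://migration-bucket/image.tar.gz

# 2) the linked Google article provisions the Hadoop cluster with the bdutil
#    helper; exact flags depend on the bdutil version (these are illustrative)
./bdutil -b migration-bucket -n 4 deploy

# 1) and 4) -- exporting the image and deleting the Rackspace servers -- happen
#    on the Rackspace side (control panel or API) and are not shown here
```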
  • Fixed-Price – Est. Budget: $350.00 Posted
    We are looking for a developer to work on the following. Please provide your resume and Skype ID. To do: 1. Install Hortonworks 2.1 with HDFS, YARN, Spark, HBase 2. 20 TB of data with replication 3. YARN cluster for HDP 2.1 4. Ubuntu 12.04 is the preferred OS. 5. Each instance should have full root access with installation permission. 6. Each instance should have access to the Internet. 7. Each instance should have its intra-communication ports open. 8. Deploy ...
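Item 1 is normally driven through Apache Ambari on a designated head node; a rough sketch follows. The repository URL, version and hostnames are illustrative only, and items 2-7 are sizing and provisioning requirements rather than commands.

```bash
# on the Ambari host (Ubuntu 12.04): add the Hortonworks Ambari repo and install the server
wget -O /etc/apt/sources.list.d/ambari.list \
    http://public-repo-1.hortonworks.com/ambari/ubuntu12/1.x/updates/1.6.1/ambari.list
apt-get update
apt-get install -y ambari-server

# interactive setup (JDK, backing database), then start the web UI on port 8080
ambari-server setup
ambari-server start

# HDFS, YARN, HBase (and Spark, where available for the release) plus all worker
# nodes are then added through the Ambari cluster-install wizard at http://<ambari-host>:8080
```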