Data scraping Jobs

55 jobs were found based on your criteria

  • Hourly – Less than 1 month – 10-30 hrs/week – Posted
    Given a set of base URLs, mimic the behavior of the GitHub Jobs API (https://jobs.github.com/api) by scraping relevant content from the career pages of various companies. We expect the crawler to be written in Python or Perl, but if you propose a different language, please make the case. In addition to providing results in JSON, please write to a Postgres database. Accurate, complete results and time to delivery are the key success criteria. Details attached. (A minimal Python sketch of this kind of crawler appears after the listings.)
  • Hourly – Less than 1 month – 10-30 hrs/week – Posted
    I need someone who can collect contact data on hunting guides, outfitters, and other hunting opportunities. This will be unstructured data, so it will likely be a very manual process. I will give you the websites; you will collect the data in a Google spreadsheet. I am looking for the following fields: Email, Name, Website, Phone.
  • Fixed-Price – Est. Budget: $700.00 Posted
    **About us** The SaaS Co. is a young startup company located in Berlin, Germany. We are a sales accelerator specialized in SaaS products. One of our key services is lead generation for our clients. **About the Job** For this purpose we are looking for someone who can provide us with a data feed for xing.com, a social network platform comparable to LinkedIn (see: http://en.wikipedia.org/wiki/XING). The data fields we are especially interested in are ...
  • Fixed-Price – Est. Budget: $150.00 Posted
    Hi, I would like to collect a list of veterinary practices in Australia based on the AVA website http://www.ava.com.au/findavet The AVA has a map with a number of locations. I would like the details of each location captured. The information is: Clinic Name, Clinic Address, Phone, AVA members at this practice, Website.
  • Hourly – 1 to 3 months – Less than 10 hrs/week – Posted
    I am looking for someone to transform the data that our vendors provide into our Excel spreadsheet format so that we can use it for our multichannel ecommerce business. We will provide the vendor data in the form of a spreadsheet, Excel catalog, vendor website, etc.; you will then organize that data so that it matches the columns in our spreadsheet. May also require image resizing and renaming so that the images match up ...
  • Fixed-Price – Est. Budget: $250.00 Posted
    Please see the video link below. All the information about this project is explained in the video. Short description: 1) Get data from http://graphicriver.net/category/print-templates/flyers 2) Collect Name, Links, Thumbnails, Picture Link, Description 3) Let ImportIO collect and organize the information. https://www.youtube.com/watch?v=GYVZ7JPw_y0 When you're done with the project, as proof please use either the oDesk app or "Screencast-O-Matic" to upload a video showing it works.
  • Hourly – 3 to 6 months – 10-30 hrs/week – Posted
    Data Scraper – Websites, Feeds, APIs. We are looking for someone who has experience scraping structured data from the web. The data will have multiple sources and, depending on the source, could be accessible through an API, an XML feed, JSON, or only raw HTML from the frontend. We will likely require multiple scrapers over time. Each scraper will need to be integrated into our web-based platform, with any options/settings controllable from our operator dashboard. The data gathered in each ... (See the multi-source sketch after the listings.)
  • Hourly – Less than 1 week – Less than 10 hrs/week – Posted
    I'm interested in creating a scraping script for a website to help launch a directory site. I'd prefer to use Ruby and Nokogiri, or Python, but am open to other options. Ideally, you would log in to my Mac to help set up the script. I'm looking for someone with good English skills and the patience to help me get more familiar with scraping. (A Python version of such a script is sketched after the listings.)
  • Hourly – Less than 1 week – Less than 10 hrs/week – Posted
    I would like you to develop a WordPress plugin that can scrape a webpage for URLs and display them in a results box on a page. The tool needs to be simple and easy to use, with a box for a URL, a button to start the scrape, and a results section where the URLs are displayed. The links it needs to scrape will be listed as such and not hidden behind anchor text, for example: https://www.odesk.com ...
  • Hourly – Less than 1 month – Less than 10 hrs/week – Posted
    • We are looking for a freelancer to go on a data-mining mission and capture the data of individuals at businesses with specific job titles. We need strong, accurate data on these people in order to reach out to them with our products and services. • The freelancer will identify and contact individuals holding the titles of UK-based Finance Director, Director of Finance, and Chief Financial Officer, concentrating especially on the Financial and Insurance, Construction ...
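
The first posting (scrape career pages, emit GitHub-Jobs-style JSON, and write to Postgres) is concrete enough to sketch. The following is a minimal Python outline under stated assumptions: the career pages, CSS selector, Postgres DSN, and table schema are all placeholders, not details from the posting.

```python
# Minimal sketch of the career-page crawler, assuming job links carry a
# hypothetical "job-listing" CSS class and a table created roughly as:
#   CREATE TABLE jobs (id serial PRIMARY KEY, company text, title text, url text);
import json

import psycopg2
import requests
from bs4 import BeautifulSoup

BASE_URLS = {"ExampleCo": "https://example.com/careers"}  # placeholder input set


def scrape(company, url):
    """Yield GitHub-Jobs-style dicts for one career page (selectors are guesses)."""
    soup = BeautifulSoup(requests.get(url, timeout=30).text, "html.parser")
    for link in soup.select("a.job-listing"):  # assumed selector
        yield {
            "company": company,
            "title": link.get_text(strip=True),
            "url": requests.compat.urljoin(url, link["href"]),
        }


def main():
    jobs = [job for company, url in BASE_URLS.items() for job in scrape(company, url)]
    conn = psycopg2.connect("dbname=jobs")  # assumed DSN
    with conn, conn.cursor() as cur:
        cur.executemany(
            "INSERT INTO jobs (company, title, url) "
            "VALUES (%(company)s, %(title)s, %(url)s)",
            jobs,
        )
    print(json.dumps(jobs, indent=2))  # JSON output alongside the Postgres load


if __name__ == "__main__":
    main()
```

Accuracy and completeness are the stated success criteria, so a real bid would add per-site selectors, retries, and deduplication on top of this skeleton.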
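The "Data Scraper – Websites, Feeds, APIs" posting asks for scrapers over mixed source types (API, XML feed, JSON, raw HTML). One plausible shape for that, shown here as a hedged sketch rather than the client's design, is a small dispatcher keyed by source kind; every URL, tag name, and selector below is illustrative only.

```python
# Sketch: normalize JSON APIs, XML feeds, and raw HTML pages to lists of dicts.
# The "settings" for each source (kind + url) are the sort of options the
# posting wants controllable from an operator dashboard.
import xml.etree.ElementTree as ET

import requests
from bs4 import BeautifulSoup


def from_json_api(url):
    return requests.get(url, timeout=30).json()


def from_xml_feed(url):
    root = ET.fromstring(requests.get(url, timeout=30).content)
    return [{child.tag: child.text for child in item} for item in root.iter("item")]


def from_raw_html(url):
    soup = BeautifulSoup(requests.get(url, timeout=30).text, "html.parser")
    return [{"text": row.get_text(strip=True)} for row in soup.select("tr")]  # assumed markup


HANDLERS = {"json": from_json_api, "xml": from_xml_feed, "html": from_raw_html}


def scrape(source):
    """source is e.g. {"kind": "xml", "url": "https://example.com/feed.xml"}."""
    return HANDLERS[source["kind"]](source["url"])
```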
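The directory-site posting accepts Ruby/Nokogiri or Python; this sketch takes the Python route, with BeautifulSoup playing Nokogiri's role. The target URL, the ".listing" selector, and the output columns are assumptions standing in for whatever the actual site uses.

```python
# Sketch: pull name/link pairs from a directory page and save them to CSV.
import csv

import requests
from bs4 import BeautifulSoup


def scrape_directory(url, out_path="listings.csv"):
    soup = BeautifulSoup(requests.get(url, timeout=30).text, "html.parser")
    rows = [
        {
            "name": item.select_one("h2").get_text(strip=True),  # assumed structure
            "link": item.select_one("a")["href"],
        }
        for item in soup.select(".listing")  # assumed selector
    ]
    with open(out_path, "w", newline="") as fh:
        writer = csv.DictWriter(fh, fieldnames=["name", "link"])
        writer.writeheader()
        writer.writerows(rows)
    return rows
```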