Mozenda Scraper Jobs

14 jobs were found based on your criteria

  • Hourly – More than 6 months – 10-30 hrs/week – Posted
    Hello. We have a list of 1,000 websites per month from which we require certain content, and we will provide the URLs for each site. When applying for this job, please write the words "fatalanplasticbag" as your subject line; if you don't do this, you will not be interviewed. Please ask questions or provide a price. We will pay $1 per site. 1. Scrape the content we require from each site to our exact spec 2. Save ...
  • Hourly – More than 6 months – 10-30 hrs/week – Posted
    We are looking to scrape a variety of different sites. We need someone who has had a lot of experience using Mozenda or an equivalent scraping software to create a variety of different types of agents.
  • Fixed-Price – Est. Budget: $14.00 Posted
    We would like to normalize the following phone numbers to a unified format. - Country code + area code + number - No leading zeros before the country code - Numeric characters only Ex. 16505551234 (US number) 16505551235 491726929293 (Germany number) ~~~~~~~~~~~~ Column C is the country of the client. Column M is the phone number which should be normalized. Please find the country code in the number and format it according to the above-mentioned requests. ~~~~~~~~~~~~ Most of the lines are country code Germany, 49. In ...
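The normalization rules in the posting above (country code + area code + number, digits only, no leading zeros before the country code) can be sketched in a few lines. This is a hedged sketch, not the poster's actual solution: the `COUNTRY_CODES` mapping here is a tiny hypothetical subset, and a real job would need a full country-to-dialing-code table for every country in column C.

```python
import re

# Hypothetical subset of a country -> dialing code table; the real
# spreadsheet (column C) would need every country covered.
COUNTRY_CODES = {"US": "1", "Germany": "49"}

def normalize_phone(raw, country):
    """Return the number as <country code><area code><number>,
    numeric characters only, no leading zeros."""
    digits = re.sub(r"\D", "", raw)   # keep numeric characters only
    digits = digits.lstrip("0")       # drop leading zeros (e.g. German trunk 0)
    code = COUNTRY_CODES[country]
    if not digits.startswith(code):   # prefix the dialing code if missing
        digits = code + digits
    return digits

print(normalize_phone("+1 (650) 555-1234", "US"))   # 16505551234
print(normalize_phone("0172 6929293", "Germany"))   # 491726929293
```

The second call reproduces the posting's German example, 491726929293: the trunk zero is stripped and the 49 dialing code is prepended.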
  • Fixed-Price – Est. Budget: $30.00 Posted
    Hello there. We have a system that needs to integrate with WordPress. We coded the theme but could not transform the site contents into blog posts. We are looking for people who can write a small crawler to scrape all the sites and export the data to WordPress's XML format. You also need to scrape image files. Please apply for the job if you are able to do everything I listed, and please don't forget to start with -pineapple-; it proves that you ...
  • Fixed-Price – Est. Budget: $29.00 Posted
    We would like to create a database for calling clients. We are creating the database based on the erento.de website and existing Excel files which were scraped from erento.de. We would need the following data manipulations made on this file: A. Pull the city name from column E; it is located between the postal code and the word Deutschland (see http://screencast.com/t/DW6OnjTGm) and place the city name in column F. B. Insert the state of each ...
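The extraction in step A above (city name sitting between the postal code and the word "Deutschland") is a natural fit for a regular expression. A minimal sketch, assuming column E holds address strings like the sample below (the address itself is illustrative, and German postal codes are five digits):

```python
import re

def extract_city(address):
    """Return the text between a five-digit postal code and the word
    'Deutschland', or None if the pattern is not found."""
    m = re.search(r"\b\d{5}\s+(.+?)\s+Deutschland\b", address)
    return m.group(1) if m else None

print(extract_city("Musterstr. 1, 10115 Berlin Deutschland"))  # Berlin
```

In practice this would run once per spreadsheet row, writing the result into column F; multi-word city names (e.g. "Frankfurt am Main") are handled because the capture group is non-greedy up to "Deutschland".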
  • Fixed-Price – Est. Budget: $20.00 Posted
    Need a quote on scraping all entries from the following site: top pinterest users dot com. Need the following info in a CSV: 1. User Name 2. User Follower Count 3. Entire User Description 4. Link to User Profile. For #4, the only way to do this might be to scrape the first link to the linked-to board and then, in bulk in Excel, strip the trailing /boardname, which will leave us with the direct link to ...
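The workaround described in item #4 above (trim the trailing /boardname off a scraped board link to recover the profile URL) is a one-line string operation rather than an Excel pass. A hedged sketch; the URL shown is purely illustrative:

```python
def board_to_profile(board_url):
    """Drop the final path segment of a board URL, leaving the
    direct link to the user's profile."""
    return board_url.rstrip("/").rsplit("/", 1)[0]

print(board_to_profile("https://www.pinterest.com/someuser/some-board/"))
# https://www.pinterest.com/someuser
```

Doing this at scrape time avoids the manual mass-edit in Excel that the posting anticipates.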
  • Fixed-Price – Est. Budget: $100.00 Posted
    There are 11,461 Meetup.com groups near New York, NY. See: http://www.meetup.com/cities/us/ny/new_york/ I would like to have this data scraped. This project is needed ASAP and can start right away. I need the following fields: Name of Meetup, Meetup URL, Number of Members, Miles from New York, Contact URL, Website URL, Facebook URL, Twitter URL, Founded, Group Reviews, Upcoming Meetups, Past Meetups, Main Organizer Name, Main Organizer Contact URL, Secondary Organizers (Name ...
  • Hourly – Less than 1 week – Less than 10 hrs/week – Posted
    I have a database of 530 LinkedIn contacts where I have the Title, Company Name, and Location but no contact name. The contact name is simply listed as "LinkedIn Member". I want someone to find the names wherever possible by pasting the title and company into a Google search. Very often, the LinkedIn public profile will bring up the contact name. I then want the name copied and pasted into the database field for all 530 records. For example, out of ...
  • Hourly – Less than 1 week – Less than 10 hrs/week – Posted
    I'm looking for an Outwit tutorial written especially to fit some specific needs. I need to scrape a particular part of complex content on a site. A hands-on, step-by-step tutorial with screenshots would be nice. Other techniques are welcome, but it needs to be less than 10 minutes of work. It must work on a Mac (Apple computer); no Windows programs, please. ScraperWiki is an option as well, if possible. I need to scrape every column of this URL + the ...