Mozenda Scraper Jobs

90 jobs were found based on your criteria

  • Hourly – Less than 1 week – Less than 10 hrs/week – Posted
    I need a freelancer who can design a Mozenda agent to scrape information from the following site: eweb login: eweb password: marketing I require the following information: Company Name, Contact Name, Contact Email, Company Website, Contact Phone Number, Company/Contact Address. Instructions: * I am only interested in data from the U.S.
  • Fixed-Price – Est. Budget: $30.00 Posted
    Hello there, we have a system that needs to be integrated with WordPress. We coded a theme but could not transform the site contents into blog posts. We are looking for someone who can write a small crawler to scrape all the sites and export the data to WordPress's XML format. You also need to scrape image files. Please apply to the job if you are able to do everything listed, and please don't forget to start your application with -pineapple- to prove that you ...
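For context, a WordPress import of scraped content typically targets the WXR (WordPress eXtended RSS) XML format. A minimal sketch in Python, assuming each scraped page is a dict with a title and an HTML body; the template covers only a few of the fields a full WordPress export contains:

```python
from xml.sax.saxutils import escape

# Skeleton of a WXR document: an RSS feed with WordPress's wp namespace.
WXR_TEMPLATE = """<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0"
     xmlns:content="http://purl.org/rss/1.0/modules/content/"
     xmlns:wp="http://wordpress.org/export/1.2/">
  <channel>
    <wp:wxr_version>1.2</wp:wxr_version>
{items}
  </channel>
</rss>
"""

# One <item> per scraped page; body goes into content:encoded as CDATA
# (assumes the body does not itself contain the sequence "]]>").
ITEM_TEMPLATE = """    <item>
      <title>{title}</title>
      <content:encoded><![CDATA[{body}]]></content:encoded>
      <wp:post_type>post</wp:post_type>
      <wp:status>publish</wp:status>
    </item>"""


def to_wxr(pages):
    """Render a list of {'title': ..., 'body': ...} dicts as a WXR string."""
    items = "\n".join(
        ITEM_TEMPLATE.format(title=escape(p["title"]), body=p["body"])
        for p in pages
    )
    return WXR_TEMPLATE.format(items=items)
```

The resulting file can be fed to WordPress's standard importer; image files would still need to be downloaded separately and either uploaded to the media library or left as remote URLs in the post bodies.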
  • Hourly – Less than 1 week – Less than 10 hrs/week – Posted
    I will provide you with an XLS list of blogs/publications. I need you to provide the following: Blogger/Editor name, valid email address, Twitter profile, Facebook profile, LinkedIn profile. I will provide manual instructions on how to find this information, but I prefer someone who can take that knowledge and use a scraper to acquire the information. This job is a test: if you successfully provide valid information, it will turn into ongoing work.
  • Fixed-Price – Est. Budget: $25.00 Posted
    We would like to create a sitemap CSV file for The file should contain the following columns for each product: ProductID - to be taken from here - Title - to be taken from here - URL - to be taken from here - ImageUrl - to be taken from here - Price - to be taken from ...
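The column layout above maps directly onto a CSV header row. A minimal sketch using Python's csv module; the field names come from the brief, while the sample product row is a placeholder since the source links are not included in the posting:

```python
import csv
from io import StringIO

# Columns requested in the brief, in order.
FIELDS = ["ProductID", "Title", "URL", "ImageUrl", "Price"]


def write_sitemap(products) -> str:
    """Serialize a list of product dicts into the requested CSV layout."""
    buf = StringIO()
    writer = csv.DictWriter(buf, fieldnames=FIELDS)
    writer.writeheader()
    for product in products:
        writer.writerow(product)
    return buf.getvalue()
```

In practice the `products` list would be filled by whatever scraper pulls the ProductID, title, URL, image URL, and price from the target site.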
  • Fixed-Price – Est. Budget: $20.00 Posted
    For each of the following links: Frankfurt - 50 km,,,,,umkreis-50000 Hanau - 50 km,,,,,umkreis-50000 Wiesbaden - 50 km,,,,,umkreis-50000 Giesen - 50 km,,,,,umkreis-50000 we would like to scrape the following information: company name, email address (if it exists), website, address, state (always put "hessen"), and category (always put "baumaschinen_baustelle"), and put it in an Excel spreadsheet.
  • Fixed-Price – Est. Budget: $500.00 Posted
    Looking for a seasoned scraper who knows how to scrape names, addresses, emails, and telephone numbers from several directories that cater to the cosmetic/anti-aging/wellness industry. Examples of sites of interest from which we need to pull doctors/medspas: . There are several others that we can give you. We are looking to pay .05/name, basically $500/100k. We want as many as you can find, worldwide.
  • Fixed-Price – Est. Budget: $29.00 Posted
    We would like to create a database for calling clients. We are creating the database based on a website and existing Excel files which were scraped from We would need the following data manipulations made on this file: A. Pull the city name from column E; it is located between the postal code and the word Deutschland. Place the city name in column F. B. Insert the state of each ...
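Step A above, pulling the city that sits between the postal code and the word Deutschland, can be sketched with a regular expression, assuming German five-digit postal codes; the sample address is made up:

```python
import re


def extract_city(address: str):
    """Return the city found between a 5-digit postal code and the
    word 'Deutschland' in an address string, or None if absent."""
    match = re.search(r"\b\d{5}\s+(.+?)\s*,?\s*Deutschland\b", address)
    return match.group(1).strip() if match else None


# Example: extract_city("Musterstr. 1, 60311 Frankfurt am Main, Deutschland")
# yields "Frankfurt am Main".
```

Applied per row, this would populate column F from the raw address text in column E.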
  • Hourly – Less than 1 week – 30+ hrs/week – Posted
    Using Mozenda (or a similar product to be agreed), scrape data from 2 to 3 websites. I want you to set up the agents to work on the selected sites and download the required content. The websites don't change format much, so once the agents are set up and working I need you to spend some time handing the project over to my in-house people to manage. (However, we may decide to arrange an ongoing fee for you to maintain the ...
  • Hourly – 1 to 3 months – 10-30 hrs/week – Posted
    I have a lot of data to upload to my CMS, and if I upload it one by one it will take me many months, so I need custom scripts that will auto-upload the data to my CMS from an Excel sheet. There are 2 fields in the CMS, organized by category: 1. Radio Name 2. Streaming URL. All the data is managed in the Excel sheet and has to be auto-uploaded to the CMS from the collected data. If you are ready to start now then ...
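The upload described here splits naturally into two steps: parse the sheet, then push each row to the CMS. A minimal sketch of the parsing half, assuming the Excel sheet has been exported to CSV with "Radio Name" and "Streaming URL" column headers (both names are assumptions); the upload call itself would depend on the CMS's API, which the posting does not specify:

```python
import csv
from io import StringIO


def load_stations(csv_text: str):
    """Parse (Radio Name, Streaming URL) rows from a CSV export of
    the Excel sheet, skipping rows with either field missing."""
    reader = csv.DictReader(StringIO(csv_text))
    return [
        {"name": row["Radio Name"].strip(), "url": row["Streaming URL"].strip()}
        for row in reader
        if row.get("Radio Name") and row.get("Streaming URL")
    ]


# Each returned dict would then be sent to the CMS, e.g. one HTTP POST
# per station against whatever endpoint the CMS exposes (not shown).
```

Batch-uploading this way replaces the manual one-by-one entry the poster wants to avoid.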