Web scraping Jobs

1,054 jobs were found based on your criteria

  • Hourly – Less than 1 week – 10-30 hrs/week – Posted
    1. Write code in any language to extract the information from the txt files and fill in the new prospects sheet, which contains: first name, last name, title, company, city, state, and email. 2. Compare the names in the new prospects sheet with the all-leads and all-contacts spreadsheets and remove any matches from the new prospects sheet. 3. Due to the volume of the list (the text files total up to 1 MB), manual labor would take ...
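Step 2 of this posting (removing prospects that already appear in the leads/contacts lists) could be sketched in Python with pandas. The column names and sample rows below are illustrative assumptions, not the client's actual data:

```python
import pandas as pd

# Hypothetical data; in practice these would be read from the client's
# sheets, e.g. pd.read_excel("new_prospects.xlsx").
prospects = pd.DataFrame([
    {"first_name": "Ada", "last_name": "Lovelace", "email": "ada@example.com"},
    {"first_name": "Alan", "last_name": "Turing", "email": "alan@example.com"},
])
existing = pd.DataFrame([
    {"first_name": "Alan", "last_name": "Turing"},
])

# Left-join on name with an indicator column, then keep only rows that
# did NOT match anything in the existing leads/contacts lists.
merged = prospects.merge(
    existing[["first_name", "last_name"]],
    on=["first_name", "last_name"],
    how="left",
    indicator=True,
)
new_prospects = merged[merged["_merge"] == "left_only"].drop(columns="_merge")
print(new_prospects["first_name"].tolist())
```

The `indicator=True` merge is a common pandas idiom for an "anti-join"; matching on first/last name only is an assumption, and a real job might also compare emails to avoid false positives.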
  • Fixed-Price – Est. Budget: $200.00 Posted
    We need all the available emails from the member directory of the journalism site, Online News Association. We need this delivered to us in the form of a CSV file. We are fine with it being done programmatically or by hand. A username and password will be required to access the list. We will provide this to the candidate.
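Done programmatically, the extract-and-export half of a job like this might look as follows. The HTML snippet and the email regex are illustrative assumptions; the real directory page would be fetched behind the login (e.g. with a `requests.Session` using the provided credentials) before parsing:

```python
import csv
import io
import re

# Stand-in for the directory HTML that would be fetched after logging in.
html = """
<div class="member">Jane Doe &lt;jane@example.org&gt;</div>
<div class="member">John Roe &lt;john@example.org&gt;</div>
"""

# A simple email pattern; adequate for a one-off extraction job.
emails = re.findall(r"[\w.+-]+@[\w-]+\.[\w.]+", html)

# Write the results as a one-column CSV (here to an in-memory buffer;
# a real run would write to a file for delivery).
buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(["email"])
for addr in emails:
    writer.writerow([addr])
print(buf.getvalue())
```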
  • Fixed-Price – Est. Budget: $5.00 Posted
    I need help taking the data on this website and placing it into a spreadsheet. https://www.finandfield.com/ To start, I will have you do one section only. You will type 'New Jersey, United States' in the 'where do you want to go?' box and click 'all fishing charters' in the 'what do you want to hunt/catch?' tab. From there you will see a list of results. At the top of the page, select the 'charter boat' filter ...
  • Hourly – Less than 1 week – Less than 10 hrs/week – Posted
    What I Have: I have a list of eviction records for my county. The records show the address of the subject property as well as either the owner's name (landlord) or the owner's attorney's name, but it does not specify which. What I Need: I need the mailing address of the landlord as well as the landlord's name so I can mail to them about their rental property. I am not sure how this would be done. Possibly ...
  • Hourly – Less than 1 month – Less than 10 hrs/week – Posted
    The job task is to go to indeed.com and find 1,000 companies that are looking for outside sales reps, sales reps, etc., and enter them into an Excel spreadsheet that we will provide you. Then, for each entry, go to LinkedIn and find out who the hiring manager is for that company.
  • Fixed-Price – Est. Budget: $200.00 Posted
    I need to scrape all the information from this website; it shows 500,000-something records, and I need them all. I need one Excel file for each category on the side. It is a little complicated because the site has IP protections, and it also limits your search in every category when you try to get past page 14. This is what I need: http://awesomescreenshot.com/0154li9319
  • Fixed-Price – Est. Budget: $25.00 Posted
    I need someone to extract and return on a spreadsheet the following from the link provided below: name, phone number, address, Facebook/LinkedIn (if they have it listed), their email (this is not listed, but I will give you $0.25 for every valid email you can get me for the agent), and website. I will pay $25 for the list plus the extra $0.25 per valid email you can retrieve. The site to extract the data from is Zillow: http ...
  • Fixed-Price – Est. Budget: $200.00 Posted
    We need help finding the contact information of all the contributors to the citizen journalism website www.allvoices.com. We would like their email/Facebook/Twitter/LinkedIn/personal website if possible. We are fine with this being done programmatically or "by hand". There are only between 100 and 150 individuals. We would like this organized and put into a CSV file.
  • Fixed-Price – Est. Budget: $85.00 Posted
    I need a Scrapy crawler that crawls all domains contained in a CSV file. The CSV file is attached to this job. The crawler should crawl every URL on these domains and try to find a specific div class. The div class is: abstractcomponent openinghours. Here are some example URLs with this div class: http://www.mercedes-benz-berlin.de/content/deutschland/retail-plz1/berlin/de/desktop/passenger-cars/about-us/locations/location.6150.html http://www.europa.mercedes-benz.be/content/belgium/retail-m-r/mercedes-europa/fr/desktop ...