Web Crawling Job Cost Overview
Typical total cost of oDesk Web Crawling projects based on completed and fixed-price jobs.
oDesk Web Crawling Jobs Completed Quarterly
On average, 19 Web Crawling projects are completed every quarter on oDesk.
Time to Complete oDesk Web Crawling Jobs
Time needed to complete a Web Crawling project on oDesk.
Average Web Crawling Freelancer Feedback Score
Web Crawling oDesk freelancers typically receive a client rating of 4.78.
I have experience in web programming, data mining and web crawling using PHP, Perl and JavaScript. My background includes: - 7+ years of experience with PHP/MySQL - 3+ years of experience with Perl - Data mining algorithm design - Database management - Software quality assurance At Oxygen IT I was responsible for managing and designing large data mining projects.
Over the past 15 years I've worked on projects ranging from startups to Fortune 100s. From C++ to Scala, past assignments have involved: - Protocol design in C++/Delphi - Java API publishing - Maven plugin development - Distributed systems using RESTful backends - Scala/Akka design - Jenkins plugin writing On a daily basis, I split my time between systems architecture and build planning for Hadoop and RESTful backends, especially under Amazon Web Services. Oh, I'm also a Cloudera Certified Administrator and Developer. Favorite platforms include Scala, Python and general Java. See my full profile for more details, and feel free to browse my GitHub and Bitbucket profiles at the addresses below: http://github.com/ingenieux http://bitbucket.org/aldrinleal My favourite customers are pragmatic teams with a constant online presence, direct involvement with the end customer, and a problem involving both legacy systems and the next generation. Feel free to discuss your next project with me - I'm always looking for the next challenge!
I am a freelance software developer providing web scraping and data crawling services using Python and Perl. Over the last 5 years, I have developed a wide range of software solutions to crawl and extract different types of data from a myriad of websites, e.g. real-time data, business listings, sports results and other related data sources. My experience lies in automated web scraping with well-defined specifications, including website and output format. Website data is typically delivered in standardised CSV or SQL formats, often with further post-processing of the crawled data according to your requirements. Code is written in Python and Perl following agile and TDD software design techniques to ensure code longevity, reliability and robustness. I also have experience in the mathematical languages Matlab and R, as well as SQL and MongoDB databases, Linux administration, and many web APIs including Dropbox, Twitter, Twilio and Sendgrid. As well as web scraping, I am capable of completing similar web-related projects using Python and Perl, for example filtering log files, web monitoring, notification systems and backend web apps. Please contact me if you have any available projects!
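The scrape-to-CSV pipeline described above can be sketched in a few lines of Python. This is a minimal illustration using only the standard library; the HTML structure, field names, and sample data are hypothetical, not taken from any actual project:

```python
import csv
import io
from html.parser import HTMLParser

# Hypothetical example: collect the text of <td> cells, row by row,
# from a listings table embedded in a page.
class ListingParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.in_cell = False
        self.rows = []
        self.current = []

    def handle_starttag(self, tag, attrs):
        if tag == "tr":
            self.current = []       # start a fresh row
        elif tag == "td":
            self.in_cell = True

    def handle_endtag(self, tag):
        if tag == "td":
            self.in_cell = False
        elif tag == "tr" and self.current:
            self.rows.append(self.current)

    def handle_data(self, data):
        if self.in_cell:
            self.current.append(data.strip())

# Placeholder page content; a real crawler would download this over HTTP
html = """<table>
<tr><td>Acme Corp</td><td>New York</td></tr>
<tr><td>Globex</td><td>London</td></tr>
</table>"""

parser = ListingParser()
parser.feed(html)

# Emit the standardised CSV output mentioned in the profile
buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(["name", "city"])
writer.writerows(parser.rows)
print(buf.getvalue())
```

In practice a library such as lxml or BeautifulSoup would replace the hand-rolled parser, but the shape of the pipeline (fetch, parse, normalise, write CSV) stays the same.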
Hi, I'm a real expert in web crawling: Google, Facebook, Craigslist and many other crawlers; email and phone number searchers; and website-by-keyword searchers. My crawlers are scalable - I run them on Amazon AWS, with proxies if needed. Recognizing captchas is no trouble for me. I have an online email scraping service: http://mailtarget.org/ You can give me a test task to try me out for free. Regards, Alex
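The proxy rotation this kind of crawler relies on can be as simple as cycling through a pool so that no single IP absorbs all the traffic. A minimal sketch, assuming a hypothetical pool of placeholder proxy addresses:

```python
import itertools

# Placeholder proxy endpoints; real addresses would come from a proxy provider
PROXIES = [
    "http://10.0.0.1:8080",
    "http://10.0.0.2:8080",
    "http://10.0.0.3:8080",
]
_pool = itertools.cycle(PROXIES)

def next_proxy():
    """Return the next proxy in round-robin order, spreading requests across IPs."""
    return next(_pool)

# Each request picks a fresh proxy so no single IP gets rate-limited or banned
picks = [next_proxy() for _ in range(5)]
print(picks)
```

A production crawler would layer retry logic and dead-proxy eviction on top of this, but round-robin selection is the core idea.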
Rupom Razzaque Agency Contractor
[Only Available for Long Term Jobs] With more than 12 years of engagement in various programming projects, I've been providing advanced web scraping (or data mining / extraction / crawling / screen scraping, whatever you call it), parsing, searching & reporting AS WELL AS various backend web development services since 2006. So far I have contributed to 100+ scraping / parsing, aggregation & backend-driven projects, which include data extraction from email boxes (Yahoo, Gmail, MSN, etc.), content scraping from property sites, product scraping (Google Products, Amazon, Buy.com, Dell, NewEgg, Walmart, Overstock, Boutiques.com, etc.), deal & coupon scraping, job site scraping and many more -- plus their aggregation & backend data management. I can handle both the frontend & backend of any aggregation assignment, whether it is on a known CMS like Drupal or WP, or a custom one. I can also build a custom CMS of your own. My goal is to develop & deliver quality applications for my clients and make sure they serve their intended purposes. I'm proficient in object-oriented programming, MVC-based frameworks, design patterns, regular expressions, database ORMs, custom web service / API development, algorithm design, and server administration. I am capable of developing and managing high-level web applications in a scalable manner. As a professional developer, I like to work with professional and serious employers only. My primary field of interest is advanced web data scraping / mining & parsing and aggregating the results as needed, but I'm also fluent in the other parts of a project. I'm only looking for long-term opportunities that require strong technical knowledge to develop quality projects. Please don't invite me if you're looking for a short-term developer.
Tigran Tokmajyan Agency Contractor
Over the last 8 years I have been working on software development using Java. I have developed a wide range of applications, especially in these areas: -- Web 2.0 applications. -- High-load applications: experienced with clustered applications, search engines, crawlers, scrapers, and Amazon Web Services.
I create both desktop and server based applications/scripts, mainly in the area of web scraping and data gathering. From gathering emails and search engine results to pulling hundreds of thousands of records from secure sites and blogs, nothing has yet escaped my skills. I have also designed scripts that use proxy servers to scrape sites that block crawlers. Good knowledge of TCP/IP and the HTTP protocol; CCNA certified; worked as a quality analyst for Cisco Service Engines. Applications are often multi-threaded and work from remote back-end databases. I will be available through WebEx.
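The multi-threaded pattern mentioned in these profiles is straightforward in Python for I/O-bound downloads: a pool of worker threads overlaps the network waits. A minimal sketch, with a stand-in `fetch` function and placeholder URLs (a real crawler would issue actual HTTP requests here):

```python
from concurrent.futures import ThreadPoolExecutor

# Stand-in fetch function; a real scraper would download the page over HTTP
def fetch(url):
    return f"<html>content of {url}</html>"

# Hypothetical list of pages to crawl
urls = [f"http://example.com/page/{i}" for i in range(8)]

# Multi-threaded fetching: up to 4 downloads run concurrently,
# and map() preserves the input order of the results.
with ThreadPoolExecutor(max_workers=4) as pool:
    pages = list(pool.map(fetch, urls))

print(len(pages))
```

Threads suit scraping because the work is dominated by waiting on the network rather than CPU, so even Python's GIL is not a bottleneck here.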