I am passionate about big data technologies. I am a Software Engineer at Deerwalk Inc. and have been working with US healthcare data for about two years using big data technologies such as Cascading, Hadoop, and Elasticsearch. Projects: • HEDIS 2014/15 Development and Certification • Quality Reporter: provides full clinical data analysis on the HEDIS-certified measures • Plan Analytics: an innovative software solution designed to work in the real world of healthcare management • Distributed Approach to Analysing and Visualizing Census Data: the main theme of this project was the implementation of distributed computing for large data sets. View my LinkedIn profile: http://www.linkedin.com/pub/swoyambhu-shrestha/67/487/121
Hadoop Job Cost Overview
Typical total cost of oDesk Hadoop projects based on completed and fixed-price jobs.
oDesk Hadoop Jobs Completed Quarterly
On average, 20 Hadoop projects are completed every quarter on oDesk.
Time to Complete oDesk Hadoop Jobs
Time needed to complete a Hadoop project on oDesk.
Average Hadoop Freelancer Feedback Score
Hadoop oDesk freelancers typically receive a client rating of 4.73.
Installation of various versions of Hadoop MR, YARN, and HA clusters. Installation and configuration of various versions of Cloudera’s Distribution Including Apache Hadoop (CDH) and vanilla Apache Hadoop. Installation of various Hadoop ecosystem components (Hive, HBase, Pig, Hama, Oozie, Whirr, Mahout, ZooKeeper) and Hadoop daemons. Hadoop cluster administration, including adding and removing DataNodes in an existing cluster. Cluster capacity planning, performance tuning, cluster monitoring, and troubleshooting. Configuration of fully-distributed Hadoop clusters. Use of server-monitoring tools such as Ganglia and service-monitoring tools such as Nagios. Commissioning and decommissioning DataNodes on a running cluster. Recovering from a NameNode failure.
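The DataNode decommissioning task mentioned above can be sketched as follows. This is a minimal outline, not a production runbook: it assumes a vanilla Apache Hadoop cluster whose hdfs-site.xml points dfs.hosts.exclude at an exclude file, and the hostname and file path used here are illustrative.

```shell
# Assumes hdfs-site.xml already contains (path is illustrative):
#   <property>
#     <name>dfs.hosts.exclude</name>
#     <value>/etc/hadoop/conf/dfs.exclude</value>
#   </property>

# 1. Add the DataNode's hostname (hypothetical here) to the exclude file.
echo "datanode-07.example.com" >> /etc/hadoop/conf/dfs.exclude

# 2. Tell the NameNode to re-read the include/exclude lists; the node
#    enters "Decommission In Progress" while HDFS re-replicates its
#    blocks onto other DataNodes.
hdfs dfsadmin -refreshNodes

# 3. Watch the report until the node shows as "Decommissioned".
hdfs dfsadmin -report

# 4. Only then stop the DataNode daemon on that host.
```

Commissioning a node is the reverse: remove it from the exclude file (or add it to the dfs.hosts include file, if one is configured) and run -refreshNodes again.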
I am an expert Software Engineer with over 9 years of professional development experience on both Windows and Linux. I have designed, implemented, and delivered numerous large-scale projects at various companies and during my time at Microsoft on the Windows and Bing teams. Through that work I have become proficient in creating back-end services in C/C++, C#, Java, Python, and Ruby. I am experienced with multiple aspects of Hadoop, including setting up and maintaining a cluster, configuring Flume, writing Hive/Pig/MapReduce jobs, creating Oozie workflows, etc.
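The Hive/Pig/MapReduce work mentioned above follows the classic map-shuffle-reduce pattern. As a rough sketch (not tied to any specific project in the profile), a word count can be prototyped locally with standard Unix tools before being ported to Hadoop Streaming: `tr` plays the mapper, `sort` the shuffle, and `uniq -c` the reducer.

```shell
# Mapper: emit one word per line.  Shuffle: sort brings equal keys
# together.  Reducer: uniq -c counts each run of equal keys; the final
# sort -rn orders results by descending count.
printf 'hello hadoop\nhello world\n' \
  | tr -s ' ' '\n' \
  | sort \
  | uniq -c \
  | sort -rn
```

The same mapper/reducer pair, written as standalone scripts, could then be handed to the Hadoop Streaming jar to run over HDFS data instead of a local pipe.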
Hi, I currently work and reside in NYC, USA. I have over 14 years of experience developing financial applications. I hold BS and MS degrees in Computer Science from Columbia University. I'm a Sun Certified Java Developer and J2EE architect. I'm also a CFA charter holder and have mostly worked in the financial industry on front-office trading and risk management in NYC, Tokyo, and Shanghai. I have worked with both equities and fixed-income asset classes and am very knowledgeable about derivatives. My technical skills include Java/C++, Excel/VBA, ASP.NET, and SQL. I've worked on both web and desktop applications. I'm available for part-time projects initially, but I hope to gradually transition to working full-time on oDesk. Thank you.
I gained a passion for computers during my 5 years in the military and went to the University of Illinois at Urbana-Champaign to develop the necessary skills. I majored in Accounting and Finance with minors in Computer Science, Informatics, and Technology and Management. I have been on several database design projects and on 2 database development projects. I interned at Deloitte Consulting in their Technology practice, where I worked on an ERP implementation project for a major food manufacturer. I currently work for FTI Consulting in their Data Analytics practice, where I do data management and visualization, mostly for litigation purposes. My work usually involves SQL and Excel, but has included a wide variety of other programming, data management, and visualization software, some of which you can see listed in my skills.
Summary
========
Cloud, Virtualization, BigData infrastructures, and Parallel Programming.
• Administration and management of hybrid cloud infrastructure (OpenStack)
• BigData infrastructure deployment (Hadoop, YARN, Spark) in the cloud (AWS, OpenStack)
• Amazon Web Services architect
• Platform management using Docker
• Providing support for OpenMP and MPI programming/applications on HPC clusters
Hadoop development
• Analyzed specific categorical data across geography and time series to derive meaningful insights.
• Designed social-media listening frameworks for social media websites to gauge customer behavior.
• Implemented the data-integration capabilities of the Hadoop ecosystem.
• Worked hands-on in other big data areas such as predictive analysis/sentiment analysis, taming text, R statistical computing, and machine learning (Mahout, OpenNLP).
• Administration of Hadoop clusters.