1. Worked with a multinational company for 8+ years and have experience working with different clients.
2. Worked in the pharmaceutical domain, with experience in both development and production-support projects.
3. Technical expertise in Informatica and PL/SQL; have created various mappings, workflows, and sessions to meet data requirements. Dealt with various transformation logic such as Lookup, Aggregator, Joiner, Source Qualifier, Expression, and Update Strategy. Good understanding of performance tuning.
4. I also have some experience in website design using HTML, PHP, and CSS, and am also working on mobile app design using Android.
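The transformation types listed above can be illustrated outside Informatica; this is a minimal Python sketch of what a Lookup followed by an Aggregator does (the product table and fields are hypothetical, not from any real project).

```python
from collections import defaultdict

# Hypothetical reference data for a Lookup transformation:
# product_id -> product_name, like a lookup on a small dimension table.
PRODUCT_LOOKUP = {101: "Aspirin", 102: "Ibuprofen"}

def transform(rows):
    """Apply a Lookup, then an Aggregator: sum qty per product name."""
    totals = defaultdict(int)
    for row in rows:
        # Lookup: enrich the row; the default mirrors a lookup returning NULL.
        name = PRODUCT_LOOKUP.get(row["product_id"], "UNKNOWN")
        # Aggregator: group by the looked-up name and sum quantities.
        totals[name] += row["qty"]
    return dict(totals)

source_rows = [
    {"product_id": 101, "qty": 5},
    {"product_id": 102, "qty": 3},
    {"product_id": 101, "qty": 2},
]
print(transform(source_rows))  # {'Aspirin': 7, 'Ibuprofen': 3}
```

In a real mapping these would be separate Lookup and Aggregator transformations wired together in the Designer; the sketch only shows the data flow.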
Informatica Job Cost Overview
Typical total cost of oDesk Informatica projects based on completed and fixed-price jobs.
oDesk Informatica Jobs Completed Quarterly
On average, 0 Informatica projects are completed every quarter on oDesk.
Time to Complete oDesk Informatica Jobs
Time needed to complete an Informatica project on oDesk.
Average Informatica Freelancer Feedback Score
Informatica oDesk freelancers typically receive a client rating of 4.12.
7 years of IT experience in developing BI applications using Informatica, Oracle, Teradata, SQL Server, Unix, shell scripts, PL/SQL, and Business Objects
• Diverse experience in the Healthcare, Sales, and E-Commerce domains
• Secured CTS Wah & Scope awards for my contribution to multiple projects
• Secured HP E-Awards, Star Awards, and multiple SPOT Awards for my contribution and good deliverables across multiple projects
• Extensive hands-on experience with Informatica PowerCenter transformations such as Joiner, Router, Update Strategy, connected and unconnected Lookups, Expression, Sorter, Filter, Union, and Normalizer. Also worked with varied sources and targets such as relational tables, flat files, XML files, VSAM files, XLS, and SAP. Aware of the basic administrative tasks with the Integration Service and Repository Manager
• Excellent skills in working with UNIX shell scripts. Developed shell scripts for feed file integrity checks and FTP activities, using commands such as ftp, scp, find, grep, pipes, cut, and redirection, and utilities such as awk and sed
• 1+ years of experience in dimensional modeling techniques such as Star and Snowflake schemas using Erwin
• Extensive experience with the CA7, Tidal, and Autosys job schedulers
• Designed and developed EDW, DM, and Data Quality systems for prestigious clients: Hewlett-Packard, ACS, and BCBS
• Strong in dimensional modeling, including star and snowflake schemas, logical and physical data modeling, and 3NF data modeling
• Strong in performance tuning of complex SQL queries
• Involved in detailed analysis, design, development, maintenance, and upgrading of systems to support business operations
• Created and used workflow variables, session parameters, and mapping parameters
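The feed-file integrity checks mentioned above can be sketched briefly; this Python version assumes a common (but here hypothetical) convention where a trailer line carries the expected record count.

```python
def check_feed(lines):
    """Verify a feed file whose first line is a 'HDR|...' header and whose
    last line is a 'TRL|<count>' trailer covering the data records between."""
    if not lines or not lines[0].startswith("HDR|"):
        return False, "missing header"
    if not lines[-1].startswith("TRL|"):
        return False, "missing trailer"
    expected = int(lines[-1].split("|")[1])
    actual = len(lines) - 2  # exclude header and trailer lines
    if actual != expected:
        return False, f"count mismatch: trailer says {expected}, found {actual}"
    return True, "ok"

feed = ["HDR|20240101", "A|1", "B|2", "TRL|2"]
print(check_feed(feed))  # (True, 'ok')
```

In practice such a check would run as a shell script right after the FTP pull, before the Informatica workflow is allowed to start.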
Peejay is an experienced IT professional with about 9 years of experience in Information Technology, all of it using Informatica PowerCenter on various data warehouse / ETL projects covering development, maintenance, and production support. He has been involved in end-to-end implementations of Data Warehouse and Business Intelligence solutions in a Scrum environment, as well as in support work handling ad-hoc requests from business stakeholders.
Over 12 years of work experience in Information Technology with expertise in data warehouse applications focusing on Business Intelligence, Analytics, ETL, and OBIEE.
• Expertise in dimensional data modeling/administration, OLAP cube design, and data warehouse implementations.
• Proficient in performance and scalability best practices, data transformation, and data integration strategies.
• Extensive knowledge of ETL (Informatica), the Data Warehouse Administration Console (DAC), Oracle BI Enterprise Edition (OBIEE), and Siebel CRM.
• Very strong analytical, technical, and business skills in architecting and designing ETL applications.
• Informatica administration with data warehouse experience.
• Expertise in setting up Informatica environments, including hardware, software, and security.
• Provide Informatica ETL production support, including troubleshooting, performance tuning, best practices, and standards.
• Informatica PowerCenter 8.x/9.x system administration, including but not limited to proactive resource monitoring (CPU, memory, workflow run times, etc.), monitoring physical and logical Informatica services, upgrading Informatica repositories from v8.x/v9.x to later versions, and sizing the Development, Test, and Production environments.
Tools:
• Informatica 7.x/8.x/9.x
• DAC 10/11
• OBIEE 10g/11g
• Oracle BI Applications 7.x
• Siebel CRM 7.x/8.x
• MS SQL Server 2005/2008
• Oracle
• Unix shell scripting
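The proactive monitoring of workflow run times mentioned above usually reduces to a threshold check against historical baselines; the workflow names, run-time figures, and 50% tolerance below are illustrative assumptions, not real monitoring data.

```python
def flag_slow_workflows(runs, baselines, tolerance=1.5):
    """Flag workflows whose latest run time exceeds the historical
    baseline by more than `tolerance` (1.5 = 50% over baseline)."""
    flagged = []
    for name, minutes in runs.items():
        baseline = baselines.get(name)
        if baseline is not None and minutes > baseline * tolerance:
            flagged.append(name)
    return sorted(flagged)

baselines = {"wf_load_sales": 30, "wf_load_dim": 10}  # historical averages (minutes)
latest = {"wf_load_sales": 50, "wf_load_dim": 11}     # today's run times
print(flag_slow_workflows(latest, baselines))  # ['wf_load_sales']
```

A production version would pull the run times from the repository metadata views and raise an alert instead of printing.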
• 6+ years of IT experience in the analysis, design, development, implementation, testing, and support of Data Warehousing and Data Integration solutions using Informatica PowerCenter
• Strong Data Warehousing ETL experience using the Informatica 9.1/8.6.1/8.1 PowerCenter client tools
• Expertise in working with relational databases such as Oracle 10g/9i, SQL Server 2008, and Teradata
• Ability to meet deadlines and handle multiple tasks; decisive, with strong leadership qualities; flexible in work schedules; good communication skills
• Team player: motivated, able to grasp things quickly, with analytical and problem-solving skills
1. Over 9 years of software experience in client-server application design, development, implementation, upgrade, and maintenance.
2. Over 5 years of experience in Netezza.
3. Worked on the Oracle 11g/10g/9i/8i, SQL Server 2000/2005/2008, and DB2 databases.
4. Netezza Database Administrator responsibilities:
• Creating new objects from data models, and altering/dropping physical database, table, external table, view, and materialized view objects in development and production environments.
• Executing scripts to create objects and populate them with data.
• Creating new users/groups and granting/revoking the necessary privileges on nz_user or nz_group IDs based on application requirements.
• Selecting distribution keys based on table relations as well as on the queries.
• Backup and restore of database objects; also monitoring the jobs and checking for issues.
• Removing database fragmentation using GROOM and analyzing the benefits from its output.
• Keeping statistics up to date to improve query performance, using Genstats.
• Monitoring the health of the Netezza hardware and filing a case with IBM in case of any issues.
• Monitoring workload management and tuning it.
• Monitoring catalog size and performing manual vacuums.
• Monitoring file system usage.
• Changing the priority of jobs/transactions as needed.
• Stopping and starting the Netezza appliances in case of issues.
• Loading/unloading tables using external tables as well as files.
• Checking the pg logs, dbos logs, etc., with respect to the errors the application faced.
• Creating and monitoring nzevents.
• Migrating databases from one host to another.
• Managing storage and capacity planning.
• Writing shell scripts and automating tasks.
• Creating Netezza utility scripts.
• Tuning queries to improve SQL performance using EXPLAIN; also analyzing the cost of the SQL.
• Using the NZ_Admin GUI tool to check for hardware issues as well as data skew, stats, backups, database size, etc.
• Resolving issues using Service Now Incident Management Records (IMR) and Problem Management Records (PMR) per ITSM standards.
5. Netezza health-check responsibilities:
• System health check: checking reported component states, checking cluster and DRBD state, checking basic counts, and verifying the presence of SPUs and CPUs.
• Monitoring distribution, organization, Generate Statistics, and Groom.
• Specifying advice and problem descriptions.
• Performance-tuned mappings by tuning source qualifier overrides, reusing lookups, and using temporary tables.
• Monitoring workload management and tuning it.
• Monitoring catalog size and performing manual vacuums.
• Monitoring file system usage.
• Creating and monitoring nzevents.
• Managing storage and capacity planning.
• Writing shell scripts and automating tasks.
• Creating Netezza utility scripts.
• Tuning queries to improve SQL performance using EXPLAIN; also analyzing the cost of the SQL.
6. ETL Informatica Developer responsibilities:
• Understanding the functional business processes and requirements given by the Business Analyst.
• Extensive design and development of transformations: Normalizer, Lookup, Aggregator, Expression, Sequence Generator, Router, Filter, Joiner, Update Strategy, and Union.
• Developing mappings/mapplets and utilizing dynamic parameter files.
• Designing and developing mapping specifications, physical flow diagrams, and build documents.
• Creating workflows in Workflow Manager for tasks such as sending email notifications, timers that trigger when an event occurs, and sessions that run a mapping.
• Involved in enhancement and maintenance activities on the data warehouse, including rewriting stored procedures for code enhancements and converting stored procedures into ETL for better performance.
• Experience with high-volume datasets from various sources such as Oracle, DB2, SQL Server, flat files, XML files, and Netezza tables.
• Creating Informatica mappings that implement complex business rules using mapplets, reusable transformations, and mapping parameters.
• Interacting with the offshore team every day via instant messenger and conference calls to assign work and follow up on its progress.
• Wrote Unix shell scripts for batch processing.
• Debugged mappings whenever sessions failed.
• Performed unit testing and integration testing.
• Designed the business process, grain, dimensions, and measured facts using Erwin 3.5.
• Wrote ETL standards, followed naming conventions, and prepared documentation for the ETL flow in Netezza.
• Upgraded the contents of the repository whenever required.
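The GROOM and statistics maintenance described in this profile is typically scripted; the sketch below only builds the Netezza SQL statements (actual execution would go through nzsql or an ODBC/JDBC connection, and the table names are hypothetical).

```python
def build_maintenance_sql(tables):
    """Build Netezza maintenance statements for each table: reclaim
    logically deleted rows with GROOM, then refresh optimizer statistics."""
    stmts = []
    for t in tables:
        stmts.append(f"GROOM TABLE {t} RECORDS ALL;")
        stmts.append(f"GENERATE STATISTICS ON {t};")
    return stmts

for stmt in build_maintenance_sql(["SALES_FACT", "CUST_DIM"]):
    print(stmt)
```

A cron-driven shell wrapper would feed these statements to nzsql during a maintenance window, then check the exit status and log the output.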
Very proficient in Informatica and other ETL tools. Worked extensively on Oracle DB, MySQL, and DB2. Involved in the analysis of existing reports across different locations to come up with a common data-loading strategy. Requirement analysis and approach documentation for ETL jobs and scheduling. Creation of the warehouse design approach and ETL scheduling strategy. Participation in activities including code and design documentation reviews. Developing mappings and workflows in Informatica PowerCenter v8 to handle Type 1, Type 2, error-handling, and other complex scenarios. Performance tuning of existing ETL jobs: SQL tuning, Informatica performance tweaks, and job redesign. Developing tools/reusable components to enhance and automate development activities and reduce turnaround time. Unit testing and planning of developed code, including preparation of unit test case documents.
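Scheduled PowerCenter workflows like those described above are commonly driven by parameter files; this sketch generates one in the section-based format PowerCenter reads (the folder, workflow, and parameter names are made up for illustration).

```python
def write_param_file(folder, workflow, params):
    """Render a PowerCenter parameter file: a [Folder.WF:Workflow]
    section header followed by $$name=value lines."""
    lines = [f"[{folder}.WF:{workflow}]"]
    lines += [f"$${name}={value}" for name, value in params.items()]
    return "\n".join(lines) + "\n"

content = write_param_file(
    "FIN_DW", "wf_daily_load",
    {"LoadDate": "2024-01-01", "SourceSystem": "SAP"},
)
print(content)
```

A scheduler job would regenerate this file before each run (hence "dynamic" parameter files) so the same workflow can process a different load date or source each day.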
4 years of IT experience in the analysis, design, development, implementation, testing, and support of Data Warehousing and Data Integration solutions using Informatica PowerCenter and Ab Initio. Proficiency in developing SQL against various relational databases such as Oracle, SQL Server, and Netezza. Knowledge of full-life-cycle data warehouse development. Experience with dimensional modeling using star and snowflake schemas. Complete understanding of business rules from high-level document specifications, and implementation of the data transformation methodologies. Hands-on experience in resolving critical and time-bound production issues. Well acquainted with the Informatica Designer components: Source Analyzer, Warehouse Designer, Transformation Developer, and the Mapplet and Mapping Designers. Experience in performance tuning and optimization of Informatica and SQL statements. Creation of UNIX shell scripts to run the Informatica workflows and control the ETL flow. Extensive work on Informatica performance tuning involving source-level, target-level, and mapping-level bottlenecks. Independent performance of complex troubleshooting, root-cause analysis, and solution development. Experienced in ensuring quality and compliance with coding standards. Motivated team player with advanced analytical and problem-solving skills.
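Shell scripts that run Informatica workflows, as mentioned above, typically wrap the pmcmd command-line client; the sketch below builds such a command rather than invoking it (the service, domain, user, and workflow names are placeholders, and the exact options should be checked against your PowerCenter version's pmcmd reference).

```python
def pmcmd_start(service, domain, user, folder, workflow):
    """Build a pmcmd startworkflow command line. The password is read
    from an environment variable (-pv) rather than embedded in the command."""
    return [
        "pmcmd", "startworkflow",
        "-sv", service, "-d", domain,
        "-u", user, "-pv", "PM_PASSWORD",  # assumed env var holding the password
        "-f", folder, "-wait", workflow,
    ]

cmd = pmcmd_start("IS_DEV", "Domain_Dev", "etl_user", "FIN_DW", "wf_daily_load")
print(" ".join(cmd))
# A real wrapper would then execute it, e.g. subprocess.run(cmd, check=True),
# and use the exit code to decide whether downstream jobs may proceed.
```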
• Project to process transactional data: we designed the data warehouse and are moving the business's transactional data into it.
• Extensively involved in data extraction, transformation, and loading (the ETL process) from source to target systems using Informatica.
• Performed data cleansing and transformation using Informatica.
• Worked extensively with the Designer tools: Source Analyzer, Warehouse Designer, Transformation Developer, and the Mapping and Mapplet Designers.
• Created data mappings to extract data from different source files, transform the data using Filter, Update Strategy, Aggregator, Expression, and Joiner transformations, and then load it into the data warehouse.
• Implemented the Slowly Changing Dimension Type 2 methodology to retain the full history of account and transaction information.
• Used the Update Strategy transformation to update the target dimension tables; for Type 2 updates we insert the new record and update the old record in the target so we can track changes in the future.
• Developed various mapplets that were then included in the mappings.
• Used Workflow Manager to read data from sources, write data to target databases, and manage sessions; also worked with the Workflow Monitor and Repository Manager.
• Extensively used transformations such as Aggregator, Lookup, Expression, Router, and Filter.
• Developed various worklets that were then included in the workflows. Tuned and tested the mappings, trying different logic, to achieve maximum efficiency.
• Review of the mappings.
Data modeling duties:
• Create Logical and Conceptual Data Models.
• Review the current system design document, functional requirement document, and existing physical data models.
• Create Logical and Conceptual Data Models for the existing application using a bottom-up approach while following data modeling standards.
• Update the client's Enterprise Logical and Conceptual Data Models.
• Update the models with entities, attributes, and relationships promoted from the application level for use at the Enterprise level.
• Review and develop Information Exchange Package Documentation (IEPD).
• Develop IEPDs using NIEM for information sharing.
• Create and update the client's taxonomy.
• Develop and update the enterprise data harmonization with data definitions.
• Support the development of the Data Management Plan and investigate unstructured data.
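The Type 2 slowly changing dimension logic described in the profile above (expire the current row, insert the new version) can be sketched in a few lines of Python; the column names and dates are illustrative, not from any real schema.

```python
def apply_scd2(dimension, incoming, today):
    """Type 2 update: if the current row for the key has changed attributes,
    expire it (clear the current flag, set end_date) and insert the new version."""
    for row in dimension:
        if row["key"] == incoming["key"] and row["current"]:
            if row["attrs"] == incoming["attrs"]:
                return dimension  # no change: keep the existing version
            row["current"] = False       # expire the old version
            row["end_date"] = today
    dimension.append({
        "key": incoming["key"], "attrs": incoming["attrs"],
        "start_date": today, "end_date": None, "current": True,
    })
    return dimension

dim = [{"key": 1, "attrs": {"city": "Pune"},
        "start_date": "2023-01-01", "end_date": None, "current": True}]
dim = apply_scd2(dim, {"key": 1, "attrs": {"city": "Mumbai"}}, "2024-01-01")
print(len(dim), dim[0]["current"], dim[1]["current"])  # 2 False True
```

In an Informatica mapping the same decision is made by a Lookup (find the current row) feeding an Update Strategy that routes rows to DD_UPDATE (expire) or DD_INSERT (new version).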