Normalize CSV Data to Existing vtiger Database
Closed - This job posting has been filled and work has been completed.
I am looking to have a macro/filter developed to integrate with vtiger that will handle data imported on a regular basis from a separate database into an installation of vtiger. The data being exported from Database "A" is not structured in a manner that allows it to be imported directly into our vtiger CRM. Since there will be weekly/daily updates to this database, I need to ensure that the data being imported is structured so that the import can be mapped efficiently.

Phase One is to create a filter that restructures the data to be imported into vtiger.

Phase Two is to integrate that filter into the GUI of the vtiger import function, and to check for duplicate data or records that have changed, with an update-record function (older vs. newer). Existing records that have had changes made to them will eventually be updated automatically if the changes to those records meet certain filter criteria. The filter criteria will need to be created.

Phase Three is to automate the import, either by monitoring a file folder for new .csv file updates or by setting up a direct data feed from the existing Database "A".

This project will start ASAP. You should have a strong background in MySQL, vtiger, and database administration. I am attaching a sample data set for your review. In the included Excel file, the red-highlighted columns can be omitted from import, while the yellow-highlighted columns need to be normalized. The average data set size is 1,200 to 1,800 records.
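For Phase One, a minimal sketch of the restructuring filter might look like the following. The column names and normalization rules here are purely hypothetical placeholders, since the actual red/yellow columns are defined in the attached sample spreadsheet; the real mapping would be filled in after reviewing that file.

```python
import csv

# Hypothetical column layout -- the real red/yellow columns come from the
# attached sample spreadsheet and would replace these placeholders.
OMIT_COLUMNS = {"internal_id", "legacy_flag"}        # "red" columns: drop
NORMALIZERS = {                                      # "yellow" columns: clean up
    "phone": lambda v: "".join(ch for ch in v if ch.isdigit()),
    "state": lambda v: v.strip().upper(),
    "email": lambda v: v.strip().lower(),
}

def normalize_row(row):
    """Drop omitted columns and apply per-column normalizers."""
    out = {}
    for key, value in row.items():
        if key in OMIT_COLUMNS:
            continue
        fix = NORMALIZERS.get(key)
        out[key] = fix(value) if fix else value.strip()
    return out

def normalize_csv(src_path, dest_path):
    """Read the raw export from Database "A" and write a vtiger-ready CSV."""
    with open(src_path, newline="") as src, open(dest_path, "w", newline="") as dest:
        reader = csv.DictReader(src)
        fields = [f for f in reader.fieldnames if f not in OMIT_COLUMNS]
        writer = csv.DictWriter(dest, fieldnames=fields)
        writer.writeheader()
        for row in reader:
            writer.writerow(normalize_row(row))
```

The output CSV would then map one-to-one onto fields in the vtiger import wizard.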
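For the Phase Two duplicate/update check, one possible shape of the "older vs. newer" logic is sketched below. The key field and timestamp field names are assumptions; the real ones depend on how Database "A" exports its records, and the acceptance criteria would be replaced by the filter criteria to be defined for this project.

```python
from datetime import datetime

# Hypothetical field names -- the actual unique key and modified-date
# columns depend on the export format of Database "A".
KEY_FIELD = "account_no"
MODIFIED_FIELD = "last_modified"   # assumed format: YYYY-MM-DD

def classify(incoming, existing_by_key):
    """Split incoming rows into brand-new inserts and newer-than-existing updates."""
    inserts, updates = [], []
    for row in incoming:
        current = existing_by_key.get(row[KEY_FIELD])
        if current is None:
            inserts.append(row)                      # no duplicate: new record
        else:
            new_ts = datetime.strptime(row[MODIFIED_FIELD], "%Y-%m-%d")
            old_ts = datetime.strptime(current[MODIFIED_FIELD], "%Y-%m-%d")
            if new_ts > old_ts:                      # older vs. newer check
                updates.append(row)                  # incoming row wins
    return inserts, updates
```

In production this comparison would run against the vtiger MySQL tables (or the vtiger web-services API) rather than an in-memory dictionary.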
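For the folder-monitoring option in Phase Three, a simple polling approach (as opposed to a direct data feed) could look like this sketch. The watch directory and polling mechanism are assumptions for illustration only.

```python
from pathlib import Path

def find_new_csvs(watch_dir, seen):
    """Return .csv files in watch_dir not yet processed, updating `seen` in place."""
    new_files = []
    for path in sorted(Path(watch_dir).glob("*.csv")):
        if path.name not in seen:
            seen.add(path.name)
            new_files.append(path)
    return new_files

# A cron job or a simple loop could call find_new_csvs() every few minutes
# and hand each new file to the normalization filter and vtiger import step.
```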