Design and develop ETL jobs based on requirements and data mappings from source systems to the EDW and Data Marts.
Perform data collection, cleansing, transformation, and loading into the EDW and Data Marts.
Perform batch production support, troubleshoot production issues post-deployment, and develop solutions as required.
Understand the goals and risks associated with technical and business requirements and align data solutions accordingly.
Participate in performance tuning of SQL to keep the EDW and Data Marts functioning optimally.
Manage data integration with other applications and source systems.
Work effectively with all technical personnel (development team, business analysts, security, risk and compliance, data center, project managers, data architects, and testers), and clearly translate business priorities and objectives into technical solutions.
Responsible for the development, support, and maintenance of the current Big Data platform, and for the design and implementation of new Extract, Transform, and Load (ETL) processes to load data into the data warehouse from various sources.
Responsible for the ongoing operational stability of ETL processes, ensuring they are properly monitored and audited to provide data integrity, accuracy, and timeliness of delivery.
The Developer is responsible for leading and mentoring developers; the role includes data analysis, source-to-target data mappings, job scheduling, and the development and testing of ETL programs.
Understand Talend ETL architecture and have knowledge of Talend transformations.
Able to create reusable objects.
Able to create source-to-target mappings and low-level designs.
Schedule, compile, and run jobs efficiently.
3+ years of experience with Data Warehouses / Data Marts; solid hands-on experience with Hadoop using Sqoop, Spark, Scala, Python, and Oozie is a must.
3+ years of solid hands-on experience with Talend ETL.
3+ years of experience with UNIX scripting, FTP, and basic network principles.
3+ years of experience with database and data warehouse design principles.
3+ years of SQL coding and querying skills.
Preferred experience with sourcing data from structured, unstructured, and semi-structured data sources.
Preferred experience with loading data into Netezza.
Experience with loading data into Hadoop is a must.
3+ years of experience with scheduling tools such as Autosys or Control-M.
Experience in testing, debugging, and troubleshooting support and development issues.
Proactive and self-managed in achieving deliverables.
Experience working in the Insurance Industry is an advantage.