Expectations & Responsibilities
- Expertise in building data pipelines, data ingestion, data integration, data preparation, NLP pipelines, and traditional data warehouse/BI systems across multiple projects.
- Expertise in designing, validating, and implementing multiple BI and analytics projects across hybrid infrastructure (cloud to on-premise and vice versa).
- Participate in data architecture, data modeling, statistical modeling, and regression analysis.
- Research new methods for acquiring and storing data, and explore various file formats.
- Collaborate with members of multiple teams and mentor junior team members.
- Prepare functional and technical design documentation for application development.
- Participate in architecture reviews and provide guidance on best practices.
- Willingness to work across multiple data engineering platforms and projects.
Desired Skills & Experience
- 4+ years of hands-on experience as a Talend developer, including Talend Real-Time Big Data.
- 2+ years of experience with Kafka and Spark.
- 8+ years of data engineering (BI/DWH/ETL) development experience.
- High proficiency in Python and Java.
- Comfortable working with Talend Data Mapper, CI/CD workflows, and complex transformation components.
- Ability to manage Talend code artifacts using repository tools such as Git and Artifactory/Nexus.
- Well versed in deploying, migrating, and publishing code across Talend environments.
- Experience migrating and integrating on-premise sources with the AWS Cloud platform.
- Experience with AWS services including S3, EMR, Athena, and Lambda.
- Experience with the Parquet file format.
- Solid experience with SQL and traditional databases such as Oracle, SQL Server, and IBM iSeries.
- Hands-on experience with data modeling, design, and development for both OLTP and data warehousing environments.
- Experience with multiple operating systems, especially UNIX, Linux, and Solaris, and with major cloud platforms.