We are seeking a Data Pipeline / ETL Engineer to design, build, and maintain the data ingestion and transformation processes that power IntegriChain’s analytics and SaaS platform. You’ll work closely with Data Engineers, Analysts, and Product teams to ensure high-quality, reliable, and scalable data pipelines. This is a hands-on engineering role with a strong focus on automation, performance optimization, and data quality.

Key Responsibilities

- Design, implement, and maintain scalable ETL/ELT data pipelines for large and complex datasets (structured, semi-structured, and unstructured).
- Build robust data ingestion frameworks to integrate third-party sources, APIs, and client datasets into our Data Cloud.
- Optimize pipeline performance, monitoring, and error handling for reliability and cost-efficiency.
- Collaborate with product and analytics teams to translate business requirements into efficient data transformations.
- Ensure data integrity, consistency, and compliance with industry regulations (HIPAA, SOC 2, etc.).
- Support migration to modern cloud-native architectures (Snowflake, Databricks, AWS/GCP/Azure).
- Document processes, pipelines, and data flows for cross-team visibility.