Team Lead - Data Engineering (Remote)

02 Nov 2022
Raleigh/Durham/Chapel Hill, North Carolina 27601, USA

We are looking for a Team Lead to join our Data Engineering team and help build next-generation data lake and data pipeline solutions. You will lead a team of highly skilled data engineers responsible for designing and implementing scalable data lake and data pipeline solutions using a variety of data extraction and transformation technologies. You will provide technical leadership and prioritize and manage the day-to-day technical work of the team. You will be a core part of a broader Data, Analytics, and ML team, with the chance to make a significant impact on the company's wider data initiative.

Principal Responsibilities:


  • Provide technical leadership to a team of data engineers by managing the technical aspects of day-to-day work (no people-management responsibilities)

  • Contribute, as part of a team, to designing, testing, and implementing sophisticated data pipeline technologies that extract and transform data from source systems

  • Perform data modeling on target systems to store data and support analytic querying

  • Communicate with other development teams to gather and document requirements

  • Work within an Agile methodology, help manage work in JIRA, and lead or participate in design and review discussions

  • Help create and follow best practices for the software development life cycle, including coding standards, code reviews, source management, builds, and testing

  • Collaborate with other engineers on the team to implement best practices for large-scale data processing


Position Requirements:

  • 3+ years of experience as a technical manager or as a team lead

  • 5+ years of experience as a Data Engineer or in a similar role working with large data sets and ETL processes

  • 7+ years of industry experience in software development

  • Knowledge and practical use of a wide variety of RDBMS technologies such as MySQL, Postgres, SQL Server or Oracle

  • Strong SQL experience with an emphasis on analytic queries and performance

  • Experience with cloud-based data warehouse technologies such as Snowflake, AWS Redshift, or Google BigQuery

  • Familiarity with either native database or external change-data-capture technologies

  • Practical use of various data formats such as CSV, XML, JSON, and Parquet

  • Use of data-flow and transformation tools such as PDI (Pentaho Data Integration), Apache NiFi, or Talend

  • Implementation of ETL processes in languages such as Java, Python, or Node.js

  • Use of large shared data stores such as Amazon S3 or the Hadoop Distributed File System (HDFS)

  • Thorough and practical use of various data warehouse schemas (snowflake, star, OLAP cube)

  • Experience with various "NoSQL" technologies such as MongoDB or Elasticsearch

  • Practical experience with Public Cloud Services such as AWS/Azure/Google Cloud Platform a plus
