Lead Data Engineer

16 Jun 2024
Riverwoods, Illinois 60015, USA

Discover. A brighter future.

With us, you'll do meaningful work from Day 1. Our collaborative culture is built on three core behaviors: We Play to Win, We Get Better Every Day & We Succeed Together. And we mean it - we want you to grow and make a difference at one of the world's leading digital banking and payments companies. We value what makes you unique so that you have an opportunity to shine.

Come build your future, while being the reason millions of people find a brighter financial future with Discover.

Job Description

Discover Commerce Exchange (DCX) is the gold standard for enriched merchant data at DFS, providing detailed merchant data to cardholders as well as internal and external clients. As part of the DCX team, you will:

• Develop data-driven solutions with current and next generation technologies to meet evolving business needs.

• Provide technical design and develop Extract/Transform/Load (ETL) applications that interface with all key Discover applications.

• Serve as a technical consultant in understanding business needs, assessing impacts, and providing solution options.

• Act as a leader, guiding development efforts across the team and ensuring best practices are followed.



How You'll Do It

• Design and develop data-ingestion frameworks, real-time processing solutions, and data processing and transformation frameworks.

• Work extensively within the cloud ecosystem and migrate data from Teradata to an AWS-based platform.

• Deploy and provide support for application code and analytical models.

• Provide senior-level technical consulting to peer data engineers during design and development for highly complex and critical data projects.

• Create and enhance data solutions that enable seamless integration and flow of data across the data ecosystem.

• Provide business analysis and develop ETL code and scripting to meet all technical specifications and business requirements according to the established designs.

• Develop real-time data ingestion and stream-analytic solutions leveraging technologies such as Kafka, Apache Spark, Python and AWS-based solutions.

• Utilize multiple development languages and tools, such as Python, Spark, Hive, and Java, to build prototypes and evaluate results for effectiveness and feasibility.

• Develop application systems that comply with the standard system development methodology and concepts for design, programming, backup, and recovery to deliver solutions that have superior performance and integrity.

• Contribute to determining programming approach, tools, and techniques that best meet the business requirements.

• Provide subject matter expertise in the analysis and preparation of specifications and plans for the development of data processes.

• Offer system support as part of a support rotation with other team members.

• Operationalize open-source data analytics tools for enterprise use.

• Ensure data governance policies are followed by implementing or validating data lineage, quality checks, and data classification.

• Understand and follow the company development lifecycle to develop, deploy, and deliver solutions.



Minimum Qualifications

At a minimum, here's what we need from you:

• Bachelor's degree in Information Technology or a related field

• 4+ years of experience in data platform administration/engineering or a related field



Desired Qualifications

If we had our say, we'd also look for:

• 6+ years of experience in data platform administration/engineering

• Knowledge and experience using query languages (SQL, Cypher) for relational and graph databases

• Willingness to continuously learn & share learnings with others

• Capability to collaborate with stakeholders and project leaders to understand requirements and deliverables, and to set expectations for the tasks you will be responsible for

• Ability to work in a fast-paced, rapidly changing environment

• Experience working in an agile and collaborative team environment

• Excellent written and verbal communication, presentation and professional speaking skills

• Passion for learning and interest in pursuing classroom training and self-discovery on a variety of emerging technologies

• Hands-on experience with Amazon Web Services (AWS)-based solutions such as Lambda, DynamoDB, S3, and Snowflake

• Experience in migrating ETL processes (not just data) from relational warehouse databases to AWS-based solutions

• Experience within the financial industry

What are you waiting for? Apply today!

The same way we treat our employees is how we treat all applicants - with respect. Discover Financial Services is an equal opportunity employer (EEO is the law). We thrive on diversity & inclusion. You will be treated fairly throughout our recruiting process and without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or veteran status in consideration for a career at Discover.
