BIG DATA ENGINEER - III

17 May 2024
Grand Prairie, TX 75050, USA

Vacancy expired!

Location: Grand Prairie, TX
Description:

As a Big Data Engineer on the Google Cloud Platform, the candidate will help design, develop, deploy, and maintain code on Google Cloud Platform using services such as BigQuery, Pub/Sub, Dataflow, Data Fusion, Google Cloud Storage, Composer, and Looker. The role also involves enabling data mining and analysis to help AI leaders, researchers, and other business teams make data-driven decisions about data collection, diversity, training, and evaluation. The engineer will spend the majority of the time hands-on, writing and peer-reviewing high-performance, high-quality, well-tested, and well-architected code. In addition to establishing and maintaining good working relationships internally, this position will participate in various cross-functional organizational initiatives.
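As a rough illustration of this kind of pipeline, a minimal Apache Beam sketch, runnable on Dataflow, might read JSON events from a Pub/Sub topic and append them to a BigQuery table; the project, topic, and table names below are hypothetical placeholders.

import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def run():
    # Streaming pipeline: Pub/Sub -> parse JSON -> append rows to BigQuery.
    options = PipelineOptions(streaming=True)
    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
                topic="projects/example-project/topics/events")  # hypothetical topic
            | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "example-project:analytics.events",  # hypothetical table
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER)
        )


if __name__ == "__main__":
    run()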

MUST HAVE SKILLS (Most Important):

• Hands-on experience (3+ years minimum) with the Google Cloud Platform suite, including BigQuery, Pub/Sub, Dataflow, Data Fusion, Google Cloud Storage, Dataproc, Composer, and Looker (a minimal Composer orchestration sketch follows this list).
• Solid understanding of Google Cloud architecture
• Strong working experience with SQL and data warehousing
• 4+ years of working experience with tools such as Spark, HBase, Sqoop, Impala, Kafka, Flume, Oozie, and MapReduce.
• 3+ years of programming experience in Scala, Java, or Python
• 4+ years of working experience with Hadoop/HDFS and Hive
• Experience developing and deploying ETL/ELT processes and documentation, including physical data models, source-to-target mappings, and ETL/ELT packages (Matillion, Fivetran, Spark, etc.)
• Experience with metadata management, data governance, data catalogs, and data discovery
• Experience with data quality monitoring and alerting on dynamic data sources, including anomaly detection
• Knowledge of Git/Jenkins and pipeline automation is a must.
• Demonstrate excellent verbal and written communication skills with technical and non-technical clients
• Demonstrate creative, strategic thinking and problem-solving; able to exercise independent judgment in a highly complex environment using leading-edge technology
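In Composer, orchestration of these pipelines is typically expressed as an Airflow DAG. A minimal sketch, assuming a daily BigQuery aggregation job (the DAG id, dataset, and table names are hypothetical):

from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="daily_event_counts",  # hypothetical DAG name
    schedule_interval="@daily",
    start_date=datetime(2024, 1, 1),
    catchup=False,
) as dag:
    # Refresh a small aggregate table once a day via a standard-SQL query job.
    aggregate_events = BigQueryInsertJobOperator(
        task_id="aggregate_events",
        configuration={
            "query": {
                "query": (
                    "CREATE OR REPLACE TABLE analytics.daily_event_counts AS "
                    "SELECT event_type, COUNT(*) AS events "
                    "FROM analytics.events GROUP BY event_type"
                ),
                "useLegacySql": False,
            }
        },
    )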

DESIRED SKILLS:
• Certification in Google Cloud Platform Data Engineer/Architect
• Prior experience working with container technology such as Docker, version control systems (GitHub), build management and CI/CD tools (Concourse, Jenkins), and monitoring tools (AppDynamics, etc.)
• Working knowledge of migration from Hadoop/HDFS to Google Cloud Platform
• Working experience on large migration projects is a big plus.
• Cost optimization and workload management
• Good knowledge of Teradata and/or the MS SQL Server suite
• Experience creating analytical dashboards using data visualization tools such as Kibana, Tableau, and Qlik, and their associated architectures.
• Experience in an Agile environment utilizing Rally/One Jira
• Excellent verbal and written communication skills with technical and non-technical clients

JOB DUTIES:
• Design and build data engineering solutions using Google Cloud Platform (GCP) services: BigQuery, Dataflow, Pub/Sub, Bigtable, Data Fusion, and Dataproc.
• Monitoring and improving the quality of the components throughout the development life cycle
• Designing and developing data modules that convert underlying raw data into more readily usable formats for reporting
• Performing data deep dives to pinpoint problems
• Identify application bottlenecks and opportunities to optimize performance
• Extract, load, transform, clean, and validate data using cloud ETL/ELT tools (see the ELT sketch after this list)
• Work with the team to conduct training workshops to identify data sources, flows, and requirements
• Preparing required project documentation and regularly tracking and reporting project status to all project stakeholders
• Working with Business Analysts and other business teams to obtain the data required to build semantic views.
• Periodically update senior management on project status, using excellent written and verbal communication skills
• Troubleshoot production issues and coordinate with the support team for code deployment.
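As a minimal sketch of the extract/load/transform and validation duties above, an ELT-style transform can be run directly in the warehouse with the google-cloud-bigquery client; the project, dataset, and table names are hypothetical:

from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # hypothetical project id

# ELT-style transform: cast, clean, and validate raw rows into a curated table.
job = client.query(
    """
    CREATE OR REPLACE TABLE analytics.orders_curated AS
    SELECT
      order_id,
      SAFE_CAST(amount AS NUMERIC) AS amount,
      DATE(created_at) AS order_date
    FROM raw.orders
    WHERE order_id IS NOT NULL
    """
)
job.result()  # block until the transform finishes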

EDUCATION/CERTIFICATIONS:

Bachelor's degree with 4-6+ years of experience in Data Engineering/BI, including 3+ years of data engineering on Google Cloud Platform using BigQuery.

Contact:

This job and many more are available through The Judge Group. Find us on the web at www.judge.com
