Data Warehouse Architect

22 Jun 2024
Scottsdale, AZ 85250, USA

Vacancy expired!

Job Title: Data Warehouse Architect

Job Type: Contractor (remote during COVID)

Duration: 6+ months
Location: Scottsdale, AZ


Job Description:
The ideal candidate will work in an agile environment while maintaining a close partnership with application engineers and reporting analysts. The candidate should collaborate with business users and the BI team to understand and document requirements, research and analyze data, and convert functional requirements into technical solutions. The candidate will participate in the development of Business Intelligence solutions that meet the needs of internal departments (Finance, CRM, Sales, Sales Operations, Marketing). The candidate must demonstrate critical attention to detail and expertise in database development, SQL optimization, and building scalable custom ETL solutions, and should be able to support existing data warehouse and BI reporting tool operations.

Responsibilities include:
  • Work closely with Business stakeholders (Finance, Sales, Marketing and Customer Relations) and reporting Analysts.
  • Create/manage source to target mapping documents and validate business rules with reporting analysts to ensure mapping accuracy.
  • Design, develop, test, deploy, and manage ELT/ETL processes using Pentaho DI (Spoon).
  • Work with business users, subject matter experts, project managers, software engineers, and testers to gather and clarify requirements.
  • Provide database and ETL solutions that are aligned with Nextiva standards and best practices.
  • Perform code reviews, release management, and production deployments.
  • Work with business users and analysts on ad hoc financial audit report requests.
  • Provide operational support for the data warehouse, the reporting tool, and the Callidus commissions tool.
  • Document steps needed to migrate database and ETL objects between various environments including Dev, QA, and Prod.
  • Validate the accuracy of data between multiple systems and document data gaps (a minimal reconciliation sketch follows this list).
  • Communicate issues, risks, and concerns proactively to project management.
  • Document ETL process thoroughly to allow peers to assist with support as needed.
  • Work self-sufficiently and communicate clearly with external groups and departments.
  • Investigate data issues, identify root causes, and fix data in the production environment.
  • Manage and improve the Pentaho Data Integration framework.
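
To illustrate the cross-system validation work described above, here is a minimal reconciliation sketch. It assumes two small hypothetical extracts keyed by an order ID; the keys, columns, and sample values are placeholders, not actual Nextiva or Callidus data.

```python
# Minimal data-reconciliation sketch: compare a hypothetical source extract
# against a hypothetical warehouse extract and report key gaps and value
# mismatches. Keys, columns, and values are illustrative only.

source_rows = {       # e.g. rows pulled from an operational system
    "ORD-1001": {"amount": 250.00, "status": "closed"},
    "ORD-1002": {"amount": 75.50, "status": "open"},
    "ORD-1003": {"amount": 10.00, "status": "open"},
}

warehouse_rows = {    # e.g. rows loaded into a staging/fact table
    "ORD-1001": {"amount": 250.00, "status": "closed"},
    "ORD-1002": {"amount": 80.00, "status": "open"},   # value drift
}                                                      # ORD-1003 missing

missing_in_warehouse = sorted(source_rows.keys() - warehouse_rows.keys())
unexpected_in_warehouse = sorted(warehouse_rows.keys() - source_rows.keys())
value_mismatches = sorted(
    key
    for key in source_rows.keys() & warehouse_rows.keys()
    if source_rows[key] != warehouse_rows[key]
)

print("Missing in warehouse:   ", missing_in_warehouse)
print("Unexpected in warehouse:", unexpected_in_warehouse)
print("Value mismatches:       ", value_mismatches)
```

In practice the same comparison would run over query results from the two systems and feed the documented data-gap report mentioned above.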

Requirements:
  • 10+ years of proven experience in multiple database platforms, including Oracle, SQL Server, Teradata and Postgres.
  • 10+ years of experience in design, development, deployment, and support activities using ETL tools such as Informatica, Pentaho, Talend, or SSIS.
  • 5+ years of experience in DW production support, troubleshooting and resolving data load errors.
  • Minimum 2 years of experience consuming data from REST APIs and Kafka events.
  • Minimum 2 years of experience with JSON, XML, and CSV sources.
  • Strong analytical and dimensional modeling background is required.
  • Demonstrated ability to take initiative, solve complex data problems, and make sound decisions.
  • Advanced experience with data visualization tools (Tableau, Power BI).
  • Experience in designing highly available Data Warehouse architecture, including identifying and deploying appropriate replication and monitoring capabilities for operational effectiveness.
  • Demonstrated experience in all phases of software development, including requirements analysis, design, coding, testing, debugging, implementation, and support.
  • Demonstrated expertise in designing and implementing ETL/ELT processes using established industry best practices.
  • Strong experience in relational data modeling and dimensional data modeling (star schema), including slowly changing dimensions (a minimal Type 2 sketch follows this list).
  • Advanced experience writing complex ad hoc SQL queries (Oracle and Postgres), including query optimization and performance tuning.
  • Minimum 5 years of experience with database procedural languages (PL/SQL, PL/pgSQL).
  • 5+ years of experience with the analysis of business processes, workflows, data mapping, and cross-system data integration in a complex data warehouse environment.
  • In-depth understanding of data cleansing, data auditing, data profiling, data modeling, entity-relationship modeling, and dimensional modeling.
  • Experience in Java and Python programming.
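
To make the dimensional-modeling requirement concrete, here is a minimal sketch of the Type 2 slowly changing dimension pattern referenced above: when a tracked attribute changes, the current row is end-dated and a new versioned row is inserted. The customer dimension, column names, and dates are illustrative assumptions, not any specific warehouse schema.

```python
from dataclasses import dataclass, replace
from datetime import date
from typing import List, Optional

# Minimal Type 2 slowly changing dimension sketch: when a tracked attribute
# changes, the current row is closed out (end-dated) and a new versioned row
# is inserted. All names and values here are illustrative.

@dataclass(frozen=True)
class CustomerDim:
    customer_key: int             # surrogate key
    customer_id: str              # natural (business) key
    segment: str                  # Type 2 tracked attribute
    effective_from: date
    effective_to: Optional[date]  # None means the row is still current
    is_current: bool


def apply_scd2(rows: List[CustomerDim], customer_id: str,
               new_segment: str, load_date: date) -> List[CustomerDim]:
    """Apply one attribute change to the dimension using Type 2 history."""
    next_key = max((r.customer_key for r in rows), default=0) + 1
    updated = []
    for row in rows:
        if row.customer_id == customer_id and row.is_current:
            if row.segment == new_segment:
                return rows                 # nothing changed
            # Close out the current version as of the load date.
            updated.append(replace(row, effective_to=load_date, is_current=False))
        else:
            updated.append(row)
    # Insert the new current version of the dimension member.
    updated.append(CustomerDim(next_key, customer_id, new_segment,
                               load_date, None, True))
    return updated


dim = [CustomerDim(1, "C-100", "SMB", date(2020, 1, 1), None, True)]
dim = apply_scd2(dim, "C-100", "Enterprise", date(2021, 5, 24))
for row in dim:
    print(row)
```

In a production warehouse the same close-out-and-insert logic would typically run as a merge step inside the ETL tool rather than in application code.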

Preferred Qualifications

Must have:
  • Informatica PowerCenter, Informatica IICS, Talend, DataStage, Microsoft SSIS, or a similar ETL tool.
  • Pentaho experience is a big plus
  • Oracle 12c, Postgres, AWS RDS
  • Dimensional Data Modelling
  • Microsoft Power BI
  • Advanced SQL, PL/SQL, Procedures and Functions
  • Oracle BICC, Salesforce (nice to have)
  • Postman, REST APIs, JSON, XML (see the sketch after this list)
  • Java or Python is a plus
  • Confluence, SharePoint, Git, and Jira
  • Excellent written and oral communication
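
As a small illustration of the REST and JSON items above, here is a minimal sketch that pulls a JSON payload with the Python requests library and writes selected fields to a staging CSV; the endpoint URL, token, and field names are placeholders, not a real API.

```python
import csv
from typing import Dict, List

import requests

# Minimal REST-to-staging sketch: pull a JSON payload from a (hypothetical)
# endpoint, keep the fields a load job cares about, and write a staging CSV.
# The URL, token, and field names below are placeholders, not a real API.

API_URL = "https://api.example.com/v1/invoices"        # placeholder endpoint
HEADERS = {"Authorization": "Bearer <token>"}          # placeholder credential
FIELDS = ["id", "customer_id", "amount", "issued_at"]  # assumed field names


def fetch_records(url: str) -> List[Dict]:
    """Fetch one page of records from the REST endpoint as a list of dicts."""
    response = requests.get(url, headers=HEADERS, timeout=30)
    response.raise_for_status()                        # fail fast on HTTP errors
    return response.json().get("data", [])


def write_staging_csv(records: List[Dict], path: str) -> None:
    """Flatten the selected fields into a CSV file for a downstream ETL load."""
    with open(path, "w", newline="") as handle:
        writer = csv.DictWriter(handle, fieldnames=FIELDS, extrasaction="ignore")
        writer.writeheader()
        writer.writerows(records)


if __name__ == "__main__":
    write_staging_csv(fetch_records(API_URL), "invoices_staging.csv")
```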

Education: Bachelor's or Master’s Degree in Statistics, Math, Computer Science, Management Information Systems, Data Analytics or another quantitative discipline, or equivalent work experience

Job Details

  • ID: JC15716691
  • State: Arizona
  • City: Scottsdale
  • Job type: Permanent
  • Salary: Depends on experience
  • Hiring Company: Omega Solutions Inc
  • Date: 2021-05-24
  • Deadline: 2021-07-23
