Data Architect

02 Nov 2024
Charlotte, North Carolina 28255, USA

Vacancy expired!

job summary:

We are looking for a Big Data Platform Architect to join the development of a next-generation enterprise big data platform that supports every line of business. The candidate must be well versed in building Big Data analytical platforms using containerization, cloud technologies, and Big Data parallel processing technologies, and must understand the advantages, disadvantages, and trade-offs of various architecture approaches. The candidate will be responsible for developing architecture and deployment plans for Big Data implementations, and should have sound knowledge of and architecture experience with Big Data tools, Data Science tools and libraries, Machine Learning, Streaming Data, and Enterprise Data Warehousing. The candidate should be able to help accelerate our customer's journey to Private Cloud by moving and improving existing Hadoop installations and modernizing their data lakes with emerging and proven industry trends. Familiarity with agile methods and related SDLC tools is required.


Responsibilities

- Design and develop Big Data architecture patterns on on-premises and Private Cloud platforms

- Work on new product evaluation and certification, defining standards for how tools fit the platform

- Develop technical architecture for enabling Big Data services using industry best practices for large-scale processing

- Design, build, and automate Big Data solutions centered around the Kubernetes container orchestration platform (see the sketch after this list)

- Stand up architecture review, operating model, routines, and evaluation criteria for Big Data container platform adoption by applications

- Maintain in-depth knowledge of the organization's technologies and architectures

- Ensure the reference architecture is optimized for larger workloads and recommend tuning techniques

- Develop standards and methodologies for benchmarking, performance evaluation, testing, data security, and data privacy

- Communicate architectural decisions, plans, goals, and strategies

- Participate in regular scrum calls to track progress, resolve issues, mitigate risks and escalate concerns in a timely manner

- Contribute to the development, review, and maintenance of requirements documents, technical design documents and functional specifications
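To make the Kubernetes automation bullet above concrete, here is a minimal sketch of submitting a containerized batch job through the official Kubernetes Python client. The namespace, image name, and job script are hypothetical placeholders for illustration, not details from this posting.

```python
# Minimal sketch: submit a containerized Spark batch job to Kubernetes
# using the official Python client (pip install kubernetes).
# The namespace, image, and job script below are hypothetical placeholders.
from kubernetes import client, config


def submit_spark_job(name: str, image: str, namespace: str = "bigdata") -> None:
    config.load_kube_config()  # or config.load_incluster_config() inside a pod
    batch = client.BatchV1Api()

    container = client.V1Container(
        name=name,
        image=image,
        args=["/opt/spark/bin/spark-submit",
              "--master", "local[*]",
              "/opt/jobs/etl_job.py"],  # hypothetical job script inside the image
    )
    template = client.V1PodTemplateSpec(
        metadata=client.V1ObjectMeta(labels={"app": name}),
        spec=client.V1PodSpec(restart_policy="Never", containers=[container]),
    )
    job = client.V1Job(
        metadata=client.V1ObjectMeta(name=name),
        spec=client.V1JobSpec(template=template, backoff_limit=2),
    )
    batch.create_namespaced_job(namespace=namespace, body=job)


if __name__ == "__main__":
    submit_spark_job("nightly-etl", "example.registry/spark-etl:latest")
```

In practice this kind of submission would be wrapped in CI/CD tooling or a Spark-on-Kubernetes operator; the sketch only illustrates the programmatic Job API the role would automate against.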


Required Skill Set

- Experience with Big Data/Analytics/Data Science tools and a good understanding of the industry's leading products, along with passion, curiosity, and technical depth

- Thorough understanding of and working experience with the Cloudera/Hortonworks Hadoop distributions

- Solid functional understanding of Big Data technologies, streaming, and NoSQL databases

- Experience working with the Big Data ecosystem, including tools such as YARN, Impala, Hive, Flume, HBase, Sqoop, Apache Spark, Apache Storm, Crunch, Java, Oozie, Pig, Scala, Python, and Kerberos/Active Directory/LDAP

- Experience solving streaming use cases using Spark, Kafka, and NiFi (see the sketch after this list)

- Thorough understanding, technical/architecture insight, and working experience in Docker and Kubernetes

- Containerization experience with the Big Data stack using OpenShift/Azure

- Exposure to Cloud computing and Object Storage services/platforms

- Experience with Big Data deployment architecture, configuration management, monitoring, debugging and security

- Experience performing cluster sizing exercises based on capacity requirements (a worked example follows this list)

- Ability to build partnerships with internal teams and vendors to resolve product gaps/issues, and to escalate to management in a timely manner

- Good exposure to CI/CD tools, application hosting, and containerization concepts

- Excellent verbal and written communication skills, strong team and interpersonal skills, and proficiency with MS Visio

- Must be a self-starter with strong analytical and problem-solving skills
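As a flavor of the streaming work described above, here is a minimal sketch of a Spark Structured Streaming job consuming JSON events from Kafka and landing them as Parquet. The broker address, topic, schema, and paths are assumed placeholders, not values from this posting.

```python
# Minimal sketch: Spark Structured Streaming reading JSON events from Kafka
# and appending them to Parquet. Needs the spark-sql-kafka connector package
# on the Spark classpath. Broker, topic, and paths are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import StringType, StructField, StructType, TimestampType

spark = SparkSession.builder.appName("kafka-stream-sketch").getOrCreate()

event_schema = StructType([
    StructField("event_id", StringType()),
    StructField("event_time", TimestampType()),
    StructField("payload", StringType()),
])

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # hypothetical broker
    .option("subscribe", "events")                     # hypothetical topic
    .load()
    .select(from_json(col("value").cast("string"), event_schema).alias("e"))
    .select("e.*")
)

query = (
    events.writeStream.format("parquet")
    .option("path", "/data/events")              # hypothetical sink path
    .option("checkpointLocation", "/chk/events") # checkpoint for exactly-once sink
    .outputMode("append")
    .start()
)
query.awaitTermination()
```

And for the cluster sizing bullet, a back-of-the-envelope calculation of HDFS data-node count from capacity requirements; every input number here is an assumed example, not a figure from this posting.

```python
# Back-of-the-envelope HDFS cluster sizing. All inputs are assumed example
# values; substitute the actual capacity requirements.
import math

raw_data_tb = 500            # assumed raw data volume
replication_factor = 3       # typical HDFS default
growth_headroom = 1.25       # assumed 25% growth buffer
disk_per_node_tb = 48        # assumed raw disk per data node
hdfs_usable_fraction = 0.70  # assumed share of disk left for HDFS after temp/OS

required_tb = raw_data_tb * replication_factor * growth_headroom
usable_per_node_tb = disk_per_node_tb * hdfs_usable_fraction
nodes = math.ceil(required_tb / usable_per_node_tb)

print(f"Required storage: {required_tb:.0f} TB -> {nodes} data nodes")
# 500 * 3 * 1.25 = 1875 TB; 48 * 0.70 = 33.6 TB/node; ceil(1875 / 33.6) = 56
```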


Years of experience required:

5-7, 7-10, or 10+


Top required IT/Technical skill-sets:

Hands-on architect with coding experience; Kubernetes; SME in one of the following: Kafka/Spark/NiFi/Ranger

location: Charlotte, North Carolina

job type: Contract

salary: $60 - $70 per hour

work hours: 8am to 5pm

education: Bachelors




qualifications:


  • Experience level: Experienced
  • Minimum 3 years of experience
  • Education: Bachelors


skills:
  • Data Architect
  • Hadoop



Equal Opportunity Employer: Race, Color, Religion, Sex, Sexual Orientation, Gender Identity, National Origin, Age, Genetic Information, Disability, Protected Veteran Status, or any other legally protected group status.


Job Details

  • ID
    JC5440546
  • State
    North Carolina
  • City
    Charlotte
  • Job type
    Contract
  • Salary
    $60 - $70 per hour
  • Hiring Company
    Randstad Corporate Services
  • Date
    2020-11-02
  • Deadline
    2021-01-01
