Hadoop Developer

23 Jan 2024
San Ramon, California 94582, USA

Role: Hadoop Developer

Location: San Ramon, CA

Duration: 9 Months with Possible Extension

Summary:
Customer is seeking a Developer (Other Specialty) to design, develop, and implement applications using in-demand languages and technologies (e.g., Java, WebSphere, Informatica) to support business requirements.

Job Responsibilities:
  • Analyze highly complex business requirements; generate technical specifications to design or redesign complex software components and applications.
  • Act as an expert technical resource for modeling, simulation and analysis efforts.
  • Leverage industry best practices to design, test, implement and support a solution.
  • Assure quality, security, and compliance requirements are met for the supported area.
  • Be flexible and thrive in an evolving environment.
  • Adapt to change quickly and adjust work accordingly in a positive manner.

Qualifications:
  • Bachelor's degree in a technical field such as computer science, computer engineering, or a related field required.
  • 8-10 years’ experience required.
  • Development experience in the needed language or technology (e.g., WebSphere, Informatica).
  • Hands-on experience designing, developing, and successfully deploying large-scale projects end to end.
  • Hands-on experience following an iterative, agile SDLC.

Hadoop Admin/Configuration Responsibilities:
  • 6-8 years’ experience in Big Data Hadoop administration and configuration.
  • Responsible for implementation and ongoing administration of Hadoop infrastructure. Hortonworks Data Platform (HDP) stack is a must.
  • Experience with Apache Atlas, Apache Ranger, Apache Kafka, Kerberos, and HA cluster setup is a must.
  • Aligning with the systems engineering team to propose and deploy new hardware and software environments required for Hadoop and to expand existing environments.
  • Working with data delivery teams to set up new Hadoop users. This job includes setting up Linux users, setting up Kerberos principals, and testing HDFS, Hive, Pig, and MapReduce access for the new users (a minimal onboarding sketch follows this list).
  • Cluster maintenance, as well as creation and removal of nodes, using tools like Ganglia, Nagios, Cloudera Manager Enterprise/Hortonworks Ambari, Dell OpenManage, and other tools.
  • Performance tuning of Hadoop clusters and of MapReduce, Apache Spark, HBase, and Druid workloads.
  • Screen Hadoop cluster job performance and perform capacity planning.
  • Monitor Hadoop cluster connectivity and security.
  • Manage and review Hadoop log files.
  • File system management and monitoring.
  • HDFS support and maintenance.
  • Diligently teaming with the infrastructure, network, database, application, and business intelligence teams to guarantee high data quality and availability.
  • Collaborating with application teams to install operating system and Hadoop updates, patches, and version upgrades when required.
  • Enabling production operations (ProdOps) teams to perform data pipeline maintenance.
  • Experience with streaming technologies such as Spark Streaming.
  • Experience with messaging systems such as Kafka.
  • Experience with Apache Airflow.
  • Experience implementing real-time streaming architectures (a minimal Kafka-to-Spark sketch follows this list).
  • Experience with the data lab concept using Zeppelin and JupyterHub.
  • Experience with Docker and Kubernetes is a plus.
  • Experience with Grafana and Prometheus is a plus.
  • Experience with scripting tools such as Python, Ansible, and Java is a plus.
  • CI/CD automation using GitLab.
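
As referenced in the user-onboarding item above, the sketch below shows one way the Linux account, Kerberos principal, and HDFS access checks could be scripted. It is a minimal sketch only, assuming an MIT Kerberos KDC administered locally via kadmin.local and a kerberized HDFS client on the same host; the realm, keytab path, and username are hypothetical placeholders rather than details from this posting.

    # Minimal sketch of onboarding a new Hadoop user. Assumes root
    # privileges, MIT Kerberos (kadmin.local) on this host, and a
    # configured HDFS client. Realm, paths, and username are placeholders.
    import subprocess

    def onboard_user(user: str, realm: str = "EXAMPLE.COM") -> None:
        principal = f"{user}@{realm}"

        # 1. Create the Linux account the user's jobs will run under.
        subprocess.run(["useradd", "-m", user], check=True)

        # 2. Create a Kerberos principal and export a keytab for it.
        subprocess.run(
            ["kadmin.local", "-q", f"addprinc -randkey {principal}"],
            check=True,
        )
        subprocess.run(
            ["kadmin.local", "-q",
             f"ktadd -k /etc/security/keytabs/{user}.keytab {principal}"],
            check=True,
        )

        # 3. Create the user's HDFS home directory, hand it over,
        #    and verify that it is accessible.
        subprocess.run(["hdfs", "dfs", "-mkdir", "-p", f"/user/{user}"], check=True)
        subprocess.run(["hdfs", "dfs", "-chown", user, f"/user/{user}"], check=True)
        subprocess.run(["hdfs", "dfs", "-ls", f"/user/{user}"], check=True)

    if __name__ == "__main__":
        onboard_user("analyst1")  # hypothetical username

Testing Hive, Pig, and MapReduce access would follow the same pattern, e.g., submitting a trivial job as the new user after a kinit with the exported keytab.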
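Similarly, the real-time streaming item above is illustrated by the minimal Kafka-to-Spark Structured Streaming sketch below, written in PySpark. The broker address, topic name, and checkpoint path are hypothetical placeholders, and the console sink is used only for illustration; this is one possible shape of such a pipeline, not a prescribed design.

    # Minimal Kafka -> Spark Structured Streaming sketch (PySpark).
    # Requires the spark-sql-kafka-0-10 package on the Spark classpath.
    # Broker, topic, and checkpoint path below are placeholders.
    from pyspark.sql import SparkSession
    from pyspark.sql.functions import col

    spark = (SparkSession.builder
             .appName("kafka-stream-sketch")
             .getOrCreate())

    # Subscribe to a Kafka topic as an unbounded streaming DataFrame.
    events = (spark.readStream
              .format("kafka")
              .option("kafka.bootstrap.servers", "broker1:9092")  # placeholder
              .option("subscribe", "events")                      # placeholder topic
              .load())

    # Kafka delivers key/value as binary; cast the payload to a string.
    parsed = events.select(col("value").cast("string").alias("raw"))

    # Stream results to the console sink for illustration; a production
    # pipeline would typically write to HDFS, Hive, or another sink.
    query = (parsed.writeStream
             .format("console")
             .option("checkpointLocation", "/tmp/ckpt")  # placeholder path
             .start())

    query.awaitTermination()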

Regards,

Madhu Chinta, Sr. Talent Acquisition

PROCYON TECHNOSTRUCTURE
Direct | Fax: (415) 483-1620
procyonts.com

Job Details

  • ID: JC8484914
  • State: CA
  • City: San Ramon
  • Job type: Contract
  • Salary: $60 - $65
  • Hiring Company: PROCYON Technostructure
  • Date: 2021-01-22
  • Deadline: 2021-03-23