RESPONSIBILITIES:
Kforce has a client that is seeking a Big Data Developer in Charlotte, NC.
Duties Include:
The Big Data Developer will lead complex initiatives in selected domains
Ensure systems are monitored to increase operational efficiency and managed to mitigate risk
Define opportunities to maximize resource utilization and improve processes while reducing cost
Lead, design, develop, test, and implement applications and system components, tools and utilities, models, simulation, and analytics to manage complex business functions using sophisticated technologies
Resolve coding, testing, and escalated platform issues of a technically challenging nature
Lead the team to ensure compliance and risk management requirements for the supported area are met, and work with other stakeholders to implement key risk initiatives
Mentor less experienced software engineers
Collaborate with and influence professionals at all levels, including managers
Lead the team to achieve objectives
Partner with production support and platform engineering teams effectively
This technical role will be responsible for:
Designing high-performing data models on big-data architecture as data services
Designing and building a high-performing, scalable data pipeline platform using Hadoop, Apache Spark, and Amazon S3-based object storage architecture
Working with business analysts, development teams, and project managers on requirements and business rules
Collaborating with source system and approved provisioning point (APP) teams, Architects, Data Analysts, and Modelers to build scalable and performant data solutions
Effectively working in a hybrid environment where legacy ETL and Data Warehouse applications and new big-data applications co-exist
Working with Infrastructure Engineers and System Administrators as appropriate in designing the big-data infrastructure
Supporting ongoing data management efforts for Development, QA, and Production environments
REQUIREMENTS:
10+ years of application development and implementation experience
10+ years of experience delivering complex enterprise-wide information technology solutions
10+ years of ETL (Extract, Transform, Load) programming experience
10+ years of reporting experience, analytics experience, or a combination of both
5+ years of experience delivering ETL, data warehouse, and data analytics capabilities on big-data architecture such as Hadoop
5+ years of Hadoop experience
5+ years of operational risk, conduct risk, or compliance domain experience
5+ years of Java or Python experience
Experience with Hadoop ecosystem tools for real-time and batch data ingestion, processing, and provisioning, such as Apache Flume, Apache Kafka, Apache Sqoop, Apache Flink, Apache Spark, or Apache Storm
Knowledge and understanding of project management methodologies used in waterfall or Agile development projects
Knowledge and understanding of DevOps principles
Excellent verbal, written, and interpersonal communication skills
Ability to work effectively in a virtual environment where key team members and partners are in various time zones and locations
Ability to interact effectively and confidently with senior management
Kforce is an Equal Opportunity/Affirmative Action Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, pregnancy, sexual orientation, gender identity, national origin, age, protected veteran status, or disability status.