Hi,

Role: Data Engineer – PySpark
Location: Carlsbad, CA (remote until COVID restrictions lift)
Duration: Long-term

Project Overview: The resource will be part of the Platform team, helping to develop and support cloud-based applications on AWS and performing system administration work for the platform to enable data science use cases.

Role Scope / Deliverables:
• Design, develop, test, deploy, support, and enhance cloud-based applications using AWS
• Assist with requirement gathering, capturing of technical designs, and creation of test cases
• Assist with troubleshooting and resolution of production application issues
• Research new technologies through hands-on experiments, PoC implementations, and technology evaluation presentations for sharing with stakeholders
• Assist with AWS cloud administration tasks to support and enable the user community
• Assist with developing Wiki content, such as cloud solution architecture / design and usage best practices for AWS components and services, for sharing with the development community
• Assist with developing and enhancing standard operating procedures to improve process efficiency and effectiveness
• Help drive continual improvement in systems operations through tools and automation
• Collaborate with Cloud Engineering, Networking, Storage, and other teams to ensure cloud deployments are performant and reliable
• Work with IT Security to ensure solutions meet data security and compliance requirements
Key Skills:

Minimum Requirements / Qualifications:
• Bachelor's degree required
• 5+ years of IT experience, at least 2 of which in application development using Java or Python
• A minimum of 1–2 years of experience developing cloud-based applications on AWS
• Hands-on experience with AWS IAM, EC2, Lambda, S3, DynamoDB, RDS, CloudFormation, and CloudWatch
• Strong experience building back-end solutions in Java or Python on AWS
• Experience with IDE tools such as Eclipse and source control tools such as GitHub
• Ability to troubleshoot code and debug the root cause of application issues
• Experience with database technologies and SQL
• Experience with Agile / Scrum software development
• Experience with Linux environments
• Understanding of security concepts, with experience implementing security controls and compliance requirements
• Must be a self-starter and a detail-oriented, highly motivated individual
• Ability to take ownership of work assignments and manage them to completion
• Ability to work as part of a large, cross-functional, geographically distributed team as well as function independently
• Strong problem-solving skills
• Ability to convey complex information, in both written and oral form, to technical and non-technical audiences
• PySpark and the following AWS technologies: Lambda, API Gateway, Glue, EMR, ECS, Kinesis, SQS, SNS, RDS, DynamoDB, Cognito, Redshift

Preferred Qualifications:
• BS in Computer Science or Mathematics preferred
• Experience with Databricks or Apache Spark a plus
• AWS Certification such as "AWS Certified
Thanks & Regards,
Peter
Logic Soft Inc
5900 Sawmill Rd., Suite 200, Dublin, OH 43017-2588
WORK: x 211 | FAX: | Email: | Visit Us: www.logicsoftusa.com
Disclaimer: Under Bill S.1618 Title III, passed by the 105th U.S. Congress, this mail cannot be considered spam as long as we include contact information and a method to be removed from our mailing list. To be removed from our mailing lists, please reply with the word "REMOVE" in your subject line. Include the complete address and/or domain/aliases to be removed.