Data Engineer w/ AWS Redshift — the JD is attached below; please go through it and let me know if you are interested.

Position: Data Engineer
Location: 100% Remote
Visa: USC/EAD
This position only accepts W2 candidates who can work directly with us. An outside company is working on the old data warehouse and modernizing it on AWS. This candidate will own the new environment and build upon it. MUST have experience working with Redshift and building the data pipelines that feed it. Communication is critical, as they will work closely with the business SMEs and the contracted company. Need someone with a minimum of 3 years of relevant experience. There is flexibility to learn along the way, if needed.
100% remote.

Soft Skills (very important):
• Willing to color outside the lines
• Comfortable with a start-up pace and ambiguity
• Comfortable leading through abstract requirements and SDLC processes
• Partner with 3rd-party development teams; be a bridge to share knowledge and develop alongside them
• Good communication skills
• Willing to be the "seed" of an internal development team that will evolve
• Very strong communication skills; really confident being the face of IT to various business partners
• Ability and willingness to play a hands-on role as well as a conceptual architect role
Qualifications:
• Bachelor's degree in Computer Engineering, Computer Science, Information Systems, or a related discipline
• 3+ years of relevant experience
• Experience capturing end-user requirements and aligning technical solutions to business objectives
• Understanding of different types of storage (filesystem, relational, MPP, NoSQL) and of working with various kinds of data (structured, unstructured, metrics, logs, etc.)
• Understanding of data architecture concepts such as data modeling, metadata, workflow management, ETL/ELT, real-time streaming, and data quality
• 5+ years of experience working with SQL
• Experience setting up and operating data pipelines using Python or SQL
• 3+ years of experience working on AWS, Google Cloud Platform, or Azure
• Experience working with data warehouses such as Redshift, BigQuery, or Snowflake
• Exposure to open-source and proprietary cloud data pipeline tools such as Airflow, Glue, and Dataflow
• Experience working with relational databases
• Experience with data serialization formats such as JSON, XML, and YAML
• Experience with code management tools (e.g., Git, SVN) and DevOps tools (e.g., Docker, Bamboo, Jenkins)
• Strong analytical problem-solving ability
• Great presentation skills and written and verbal communication skills
• Self-starter with the ability to work independently or as part of a project team
• Capability to conduct performance analysis, troubleshooting, and remediation

Best Regards,
Kavitha R
14742 Newport Ave, Suite 108, Tustin, CA 92780
Phone: Extn: 502
Email: r.kavitha@ravh-it.com | www.ravh-it.com