We are not just offering a job but a meaningful career! Come join our passionate team!
As a Fortune 50 company, we hire the best employees to serve our customers, making us a leader in the insurance and financial services industry. State Farm embraces diversity and inclusion to ensure a workforce that is engaged, builds on the strengths and talents of all associates, and creates a Good Neighbor culture.
We offer competitive benefits and pay with the potential for an annual financial award based on both individual and enterprise performance. Our employees have an opportunity to participate in volunteer events within the community and engage in a learning culture. We offer programs to assist with tuition reimbursement, professional designations, employee development, wellness initiatives, and more! Visit our Careers page for more information on our benefits, locations and the process of joining the State Farm team!
REMOTE: Qualified candidates (outside of hub locations listed below) may be considered for 100% remote work arrangements based on where a candidate currently resides or is currently located.
HYBRID: Qualified candidates (in or near hub locations listed below) should plan to spend time working from home and some time working in the office as part of our hybrid work environment.
HUB LOCATIONS: Dunwoody, GA; Richardson, TX; Tempe, AZ; or Bloomington, IL
Check out our Enterprise Technology department!
Responsibilities
Do you want to work on cutting-edge technology? State Farm is looking for an ambitious Software Engineer with a passion for managing large-scale databases in the AWS Cloud to enable State Farm's data-as-a-service infrastructure and software. As a Software Engineer in this role, you will modernize data systems for analytic consumption across the enterprise. This position functions as a product team member responsible for building, implementing, and maintaining database clusters; overseeing security; conducting performance monitoring; troubleshooting issues; making requested changes and updates; managing database access; and setting and maintaining database standards. Bring your agile skills to a product team doing new development on cloud data technologies! If you are an experienced Software Engineer with Database Administration skills, this position is for you!
Responsibilities:
Applies skills, tools, security processes, applications, environments and programming language(s) to complete complex assignments.
Leads strategic work and utilizes application architecture to increase the efficiency and effectiveness of solutions to complex issues
Applies a wide range of complex principles, theories, and concepts in computer science to software engineering solutions
Applies an advanced understanding of technology trends and changes, best practices, and processes to complete assignments and influence the direction of product solutions
Applies an advanced understanding of product design, data design and movement, and testing to ensure quality outcomes
Identifies, diagnoses, and resolves complex problems and issues
Responsible for the analysis, design, deployment, support, and security of technology to ensure the organization is efficiently managing its technology and data-related assets in accordance with market best-practices and external regulations
Exhibits DevOps and Agile mindset where team is accountable for product from inception to sunset
Possesses an understanding of User Experience practices to improve usability and interaction between the customer and product
Maintains advanced understanding in software engineering topics, including classes, functions, security, containers, version control, CI/CD, and unit tests
Maintains advanced understanding in programming (e.g., Python) and database functionality (e.g., SQL, NoSQL)
Maintains advanced understanding in compute environments, including but not limited to Linux, Hadoop, Mainframe, Public Cloud, and containers
Leverages an advanced understanding of the State Farm organizational structure to navigate the organization
May take on several simultaneous work stories or focus on a single complex story
Understands, supports, and helps define the vision and direction for the product development
Provides mentorship, technical guidance, training, and may delegate work to others
Champions and leads others to design and develop for exceptional user experience
May have membership and engage with technical groups in the organization, like dev guilds
Influences and provides direction on product development practices, coding, data and testing standards, code reviews and software architecture
Conducts research and integrates industry best practices into processes and potential solutions
Mentors, drives, and coordinates with other product team members
Authors and contributes to technical product documentation and support articles
Additional Responsibilities:
Write database code (Redshift) to support data analytics queries.
Write database code for ETL jobs.
Provide consistent database maintenance, optimization and best practices input to the data warehouse.
Experience with Redshift and/or other distributed computing systems.
Strong stored procedure code writing experience in Redshift.
Strong experience in hands-on Redshift troubleshooting.
Strong experience in Redshift performance tuning.
Experience in AWS cloud deployment.
Learns all aspects of AWS databases and storage patterns, including Amazon Aurora, Relational Database Service (RDS), DynamoDB, and Redshift.
You will have ownership of Redshift database clusters and will manage their upgrades and patching, while working on a variety of development, maintenance, and performance-tuning projects in coordination with project and development teams.
Automate day-to-day activities such as cloning, disaster recovery, user management, monitoring enhancements, and capacity management.
Conduct performance monitoring and troubleshooting.
Secure application components and databases at each layer.
Extensive experience with high-volume OLTP and OLAP database systems.
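To illustrate the kind of Redshift work the responsibilities above describe, here is a minimal sketch. The schema, table, and procedure names are invented for illustration, and the connection layer (e.g., the Redshift Data API or a psycopg2 connection) is omitted so the example stays self-contained; it simply generates the maintenance SQL and upsert-procedure DDL that a role like this routinely owns:

```python
# Hypothetical sketch of routine Redshift DBA work: table upkeep statements
# and a delete-then-insert upsert stored procedure (a common Redshift ETL
# pattern). All object names are invented; in practice these statements
# would be executed through a Redshift connection, not printed.

def build_maintenance_sql(schema: str, table: str) -> list:
    """Return routine upkeep statements for one table: reclaim space,
    re-sort rows, and refresh planner statistics."""
    qualified = f"{schema}.{table}"
    return [
        f"VACUUM DELETE ONLY {qualified};",  # reclaim space from deleted rows
        f"VACUUM SORT ONLY {qualified};",    # keep sort keys effective
        f"ANALYZE {qualified};",             # refresh query-planner statistics
    ]


def build_etl_procedure(schema: str) -> str:
    """Return DDL for a simple upsert stored procedure in Redshift's
    PL/pgSQL dialect: delete rows being replaced, then insert fresh ones."""
    return f"""
CREATE OR REPLACE PROCEDURE {schema}.sp_upsert_daily_sales()
AS $$
BEGIN
    -- Remove rows that the staging load will replace
    DELETE FROM {schema}.daily_sales
    USING {schema}.staging_daily_sales s
    WHERE daily_sales.sale_date = s.sale_date;

    -- Insert the fresh rows
    INSERT INTO {schema}.daily_sales
    SELECT * FROM {schema}.staging_daily_sales;

    TRUNCATE {schema}.staging_daily_sales;
END;
$$ LANGUAGE plpgsql;
""".strip()


if __name__ == "__main__":
    for stmt in build_maintenance_sql("analytics", "daily_sales"):
        print(stmt)
```

Generating statements from a function like this is one way day-to-day maintenance gets automated and scheduled rather than run by hand.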
Qualifications
Preferred Technical Skills and Experiences Needed:
Technical background can include, but is not limited to: Amazon Web Services, Java, Python, Hadoop, Spark, Pig, Hive, DB2, RDS, Redshift, SQL, etc.
Technical knowledge of at least one data discipline including data design, data movement, data infrastructure platforms, and data analytics
Understanding of Disaster Recovery, High Availability, Failover, and Redundancy
Champions Information Security best practices and understands cloud networking: VPN, Direct Connect, VPC, etc.
Development and scripting experience (PowerShell, Bash, Python, Azure CLI, etc.)
Understands and develops data access related to storing, retrieving, or acting on housed data
Tests requirements for the movement, replication, synchronization, and validation of data
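Testing the movement, replication, and validation of data, as the qualifications above describe, often reduces to reconciling what landed in the target against what left the source. A minimal, library-free sketch (the record layout and key position are invented for illustration; in practice both row sets would come from queries against the source system and the Redshift target):

```python
# Hypothetical data-movement reconciliation check: compare row counts and
# key sets between a source extract and the loaded target. Plain tuples
# stand in for query results to keep the example self-contained.

def reconcile(source_rows, target_rows, key_index=0):
    """Return a dict describing whether the move preserved the data:
    a count comparison plus keys missing from or unexpected in the target."""
    source_keys = {row[key_index] for row in source_rows}
    target_keys = {row[key_index] for row in target_rows}
    return {
        "count_match": len(source_rows) == len(target_rows),
        "missing_in_target": sorted(source_keys - target_keys),
        "unexpected_in_target": sorted(target_keys - source_keys),
    }


if __name__ == "__main__":
    source = [(1, "a"), (2, "b"), (3, "c")]
    target = [(1, "a"), (2, "b")]  # row 3 failed to land
    print(reconcile(source, target))
```

A check like this is typically wired into the ETL job itself so a failed load is caught before downstream analytics queries see partial data.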
Preferred Education: BA/BS degree and Cloud certifications
Additional Desired Technical Skills and Experiences:
Champion of DevOps practices and experience with DevOps tools (Jenkins, Travis CI, or GitLab CI)
Systems automation knowledge (Terraform, Puppet, Ansible)
Embrace and focus on emerging technology
Understanding of container technologies and orchestration systems (Docker, Docker Swarm, Kubernetes)
SPONSORSHIP: Applicants are required to be eligible to lawfully work in the U.S. immediately; employer will not sponsor applicants for U.S. work authorization (e.g. H-1B visa) for this opportunity.
For Los Angeles candidates: Pursuant to the Los Angeles Fair Chance Initiative for Hiring, we will consider for employment qualified applicants with criminal histories. For San Francisco candidates: Pursuant to the San Francisco Fair Chance Ordinance, we will consider for employment qualified applicants with arrest and conviction records.