This position is a key role in the development, testing, and deployment of complex solutions.
Essential Functions
Build data strategy for broad or complex requirements, applying insightful, forward-looking approaches that extend beyond the direct team and solve large open-ended problems.
Participate in the strategic development of methods, techniques, and evaluation criteria for projects and programs.
Drive all aspects of technical and data architecture, design, prototyping, and implementation in support of both product needs and the overall technology data strategy.
Provide leadership and technical expertise in support of building a technical plan and backlog of stories, and then follow through on execution of design and build process through to production delivery.
Guide a broad functional area and lead efforts through the functional team members along with the team's overall planning.
Represent engineering in cross-functional team sessions and present sound, thoughtful arguments to persuade others. Adapt to the situation and draw from a range of strategies to influence people in a way that results in agreement or behavior change.
Collaborate and partner with product managers, designers, and other engineering groups to conceptualize and build new features and create product descriptions.
Actively own features or systems and define their long-term health, while also improving the health of surrounding systems.
Assist Support and Operations teams in identifying and quickly resolving production issues.
Develop and implement tests for ensuring the quality, performance, and scalability of our application.
Actively seek out ways to improve engineering and data standards, tooling, and processes.
Support the company's commitment to risk management and to protecting the integrity and confidentiality of systems and data.
Minimum Qualifications
Education and/or experience typically obtained through a Bachelor's degree in Computer Science or a related technical field.
Ten or more years of relevant experience.
Seven or more years of experience in the development of complex data platforms, distributed systems, SaaS, cloud solutions, and microservices.
Six or more years of experience in the development of data warehouses and big data platforms (structured and unstructured), real-time and batch processing, and data standards.
Four or more years of experience in the development of Business Intelligence solutions.
Two or more years of experience in the development and operationalization of Artificial Intelligence / Machine Learning models and model development lifecycle activities (feature engineering, data pipelines, model operationalization, model monitoring).
Demonstrated experience in delivering business-critical systems to the market.
Ability to influence and work in a collaborative team environment.
Experience designing/developing scalable systems.
Extensive experience implementing data warehouses (star/snowflake schemas) using SQL Server or equivalent; big data (HDFS, Elasticsearch); ETL process development using IBM InfoSphere or equivalent; and reusable frameworks.
Experience implementing data science solutions using Python, Spark, PySpark, R, and DataRobot.
Experience with event-driven architecture and messaging frameworks (Pub/Sub, Kafka, RabbitMQ, etc.).
Working experience with cloud infrastructure (Google Cloud Platform, AWS, Azure, etc.).
Knowledge of mature engineering practices (CI/CD, testing, secure coding, etc.).
Knowledge of Software Development Lifecycle (SDLC) best practices, software development methodologies (Agile, Scrum, Lean, etc.), and DevOps practices.
Preferred Qualifications
MS or PhD
Experience using AI/ML model frameworks such as TensorFlow, SageMaker, scikit-learn, PyCharm
Big Data Platforms (Cloudera, S3)
Database platforms (Oracle, SQL Server)
Programming language experience (Python, PySpark, and R)
Knowledge of Aerospike, Scality S3, Elasticsearch
Monitoring and Alerting systems experience (AppDynamics)
Knowledge of ACH/EFT
Knowledge of real time payment networks (RTP, FedNow)
FinTech experience
Kubernetes experience
Recent big data skills in Hadoop, Kafka, Spark, and/or Scala