Company description:
Capitec Bank is a leading South African retail bank that focuses on essential banking services and provides innovative transacting, savings, insurance and unsecured lending products to individuals. Capitec’s mission is to make banking simple and transparent to help clients – regardless of their level of income or assets – improve their financial lives through a single solution, called Global One.
Job description: Purpose Statement

* To contribute to the design and development of new cloud workloads for platform and product teams, to empower data consumers using Capitec data platforms to deliver client value.
* To maintain and manage the existing cloud data environments and enable data producers to easily contribute to these environments.
* To contribute to evolving the Capitec data platforms through knowledge sharing, contributing new data features, and enhancing/streamlining existing processes, e.g. improved re-use of code.

Education (Minimum)

* Bachelor’s Degree in Information Technology or Information Technology – Programming

Education (Ideal or Preferred)

* Honours Degree in Information Technology – Computer Science or Information Technology – Systems Engineering

Knowledge and Experience

Knowledge:
Minimum:
Must have detailed knowledge of:
* Application development with scripting languages (Python)
* Relational database management systems
* Provisioning cloud resources using Infrastructure as Code (Terraform)
* Core AWS services (S3, EC2, VPC, IAM)
* Cloud data lake and warehouse concepts
* Software testing practices
* Basic terminal/Bash usage
* Software Version Control systems (git) and deployment tools (CI/CD)
* Structured vs Unstructured data

Ideal:
Knowledge of:
* AWS serverless services (Step Functions, Lambda, EventBridge, API Gateway)
* AWS data lake and warehousing services (Glue, LakeFormation, EMR)
* Data lake and warehouse architecture
* AWS Well-Architected Framework
* Collaboration tools (JIRA, Confluence, [URL Removed])
* Trusted insights into Data Governance, Data Management, Data Quality, Data Security and Master Data Management
* Solid understanding of:
* Banking systems environment
* Banking business model

Experience:
Minimum:
* At least 3 years’ proven experience in computer programming and data engineering, together with a relevant 3-year tertiary qualification

OR
* At least 4-5 years’ proven experience in computer programming and data engineering
* Proven experience in:
* AWS data stack (AWS Glue, AWS Redshift, AWS S3, AWS LakeFormation)
* Operationalizing batch and/or real-time data pipelines
* Python, PySpark, or Scala
* Version control in git, and CI/CD deployment
* Any infrastructure as code tool

Ideal:
* At least 3 years’ proven experience in cloud data engineering, particularly in AWS, together with a relevant 3-year tertiary qualification

OR
* At least 4-5 years’ proven experience in cloud data engineering, particularly in AWS
* Proven experience in:
* Apache Spark, Hudi, Presto
* Distributed systems (Apache Hadoop, Amazon EMR)
* Advanced shell scripting
* Infrastructure as Code (Terraform)

Skills

* Analytical Skills
* Communications Skills
* Computer Literacy (MS Word, MS Excel, MS Outlook)
* Interpersonal & Relationship Management Skills
* Problem-solving Skills

Additional Information

* Clear criminal and credit record

Desired Skills:

* Cloud
* Power BI
* SQL
