• Matric / Grade 12 essential
  • 3-year degree or diploma in IT, IS or a related field is essential
  • Relevant cloud certification at professional level or above essential
  • 5+ years' BI or related software development experience essential
  • Agile exposure (Kanban or Scrum)
  • Experience managing the development life-cycle for agile software development projects
  • Expert-level experience in designing, building and managing data pipelines for batch and streaming applications
  • Experience with performance tuning for batch-based applications like Hadoop, including working knowledge of NiFi, YARN, Hive, Airflow and Spark
  • Experience with performance tuning streaming-based applications for real-time data processing using Kafka, Confluent Kafka, AWS Kinesis, GCP Pub/Sub or similar
  • Experience working with serverless services such as OpenShift, GCP or AWS
  • Working experience with AWS or GCP would be a prerequisite
  • Working experience with other distributed technologies such as Cassandra, DynamoDB, MongoDB, Elasticsearch and Flink would be desirable
  • Java and Python programming ability would be an advantage
  • Experience in metadata management, data modelling and schema management would be an added benefit

Desired Skills:

  • Kafka
  • AWS
  • MongoDB
