The role involves the design, development, and maintenance of the architecture that enables the efficient collection, storage, and processing of large datasets. The incumbent collaborates closely with data scientists, analysts, and software engineers to ensure that data is accessible, reliable, and well-structured for analysis and business insights.
JOB DESCRIPTION

  • Execute the BCT Divisional plans and monitor performance against SLAs.
  • Identify strategic technology opportunities to meet business needs.
  • Data Architecture & Infrastructure Development
  • Design, develop, and maintain data pipelines for processing large-scale datasets.
  • Build and optimize data warehouses, data lakes, and ETL (Extract, Transform, Load) processes.
  • Implement data models and database schemas to improve data efficiency.
  • Ensure data systems are scalable, secure, and cost-effective.
  • Develop ETL/ELT workflows to integrate data from a variety of sources, including databases, APIs, and streaming services.
  • Automate the processes of data ingestion, transformation, and storage.
  • Monitor pipeline performance and troubleshoot data quality issues.
  • Optimise and manage SQL and NoSQL databases such as MySQL.
  • Manage cloud-based storage solutions, including MS Azure and AWS.
  • Guarantee the integrity, consistency, and accessibility of data across various platforms.
  • Ensure privacy and security by implementing data stewardship policies in line with POPIA.
  • Manage and supervise encryption, authentication, and access controls.
  • Formulate strategies for data retention, recovery, and backup.
  • Collaborate closely with data scientists to guarantee that the data is appropriately organized for machine learning models.
  • Provide data analysts with well-organised and clean datasets for the purpose of reporting and visualization.
  • Collaborate with software engineers to incorporate data solutions into applications.
  • Enhance data processing speed and efficiency by utilising big data technologies (e.g., Hadoop, Spark).
  • Optimise storage costs and query performance in cloud environments.
  • Employ relevant scripting, SQL, or Python to automate repetitive tasks.
  • Contribute to budget planning where required.
  • Respond to all data-related loss notifications impacting the business.

JOB REQUIREMENTS

Qualifications

  • Bachelor’s degree in Computer Science, Information Technology, Data Engineering, or a related field.
  • Honours degree advantageous.
  • Certified in Data Analytics.
  • Cloud Professional Data Engineer or Microsoft Azure Data Engineer certification advantageous.

Experience

  • 5+ years of experience in data engineering, database management, or software development.

Desired Skills:

  • Data Analytics
  • Database Management
  • Software Development
