To design, build, and maintain robust, scalable data infrastructure that ensures secure, real-time access to operational and supply chain data.
JOB OUTPUTS AND KEY PERFORMANCE INDICATORS

  • Design, build, and maintain ELT pipelines
  • Pipeline success rate (% of jobs completed without failure)
  • Data latency (extraction to availability)
  • Error resolution turnaround time
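The pipeline KPIs above can be computed directly from job-run logs. A minimal sketch in Python, assuming a hypothetical list of run records (the field names `status`, `extracted_at` and `available_at` are illustrative, not from any specific scheduler):

```python
from datetime import datetime, timedelta

# Hypothetical job-run records; field names are illustrative only.
runs = [
    {"job": "load_wms", "status": "success",
     "extracted_at": datetime(2024, 5, 1, 2, 0), "available_at": datetime(2024, 5, 1, 2, 40)},
    {"job": "load_tms", "status": "success",
     "extracted_at": datetime(2024, 5, 1, 2, 0), "available_at": datetime(2024, 5, 1, 3, 10)},
    {"job": "load_erp", "status": "failed",
     "extracted_at": datetime(2024, 5, 1, 2, 0), "available_at": None},
]

# Pipeline success rate: % of jobs completed without failure.
success_rate = 100 * sum(r["status"] == "success" for r in runs) / len(runs)

# Data latency: average time from extraction to availability, successful runs only.
latencies = [r["available_at"] - r["extracted_at"] for r in runs if r["status"] == "success"]
avg_latency = sum(latencies, timedelta()) / len(latencies)

print(f"success rate: {success_rate:.1f}%")  # 66.7%
print(f"avg latency: {avg_latency}")         # 0:55:00
```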

  • Integrate data across systems (WMS, TMS, ERP, IoT)
  • Number of successful system integrations
  • % data completeness and consistency across sources
  • Average time to onboard new data source

  • Data Lake / Warehouse Management
  • System uptime and availability
  • Query performance (execution speed)
  • Storage usage vs capacity

  • Structured and clean data sets
  • Data quality score (accuracy, completeness, validity)
  • Number of reported data issues
  • Resolution turnaround time

  • Collaboration with BI and Data Science teams
  • Time to deliver datasets for reports and models
  • Internal stakeholder satisfaction rating
  • Support requests resolved within SLA

  • Documentation and data cataloguing
  • % of pipelines with up-to-date documentation
  • Metadata completeness
  • User ease of data discovery

  • Security and compliance support
  • % compliance with access control policies
  • Number of unauthorised access incidents
  • Audit readiness and completion rate

  • Process automation
  • Manual hours reduced via automation
  • Number of recurring tasks automated
  • Stability of automated workflows

  • Issue analysis and root cause investigations
  • Time to issue resolution (from investigation to recommendation)
  • Number of root causes correctly identified
  • Reduction in repeated issues

  • Advanced SQL development, optimisation and data engineering
  • Query performance improvements (execution speed and efficiency)
  • SQL based data validation, transformation and cleansing coverage
  • Reusable SQL pipelines for recurring logistics workflows (inventory, shipments, routing)
  • SQL based reconciliation across multiple systems
  • Version controlled, documented SQL scripts aligned with governance standards
  • Reduction in errors or rework due to SQL inefficiencies
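A "SQL based reconciliation across multiple systems" check of the kind listed above can be sketched with SQLite via Python's standard library; the table and column names here (`wms_shipments`, `tms_shipments`, `qty`) are hypothetical:

```python
import sqlite3

# Hypothetical WMS vs TMS shipment tables; names and columns are illustrative.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE wms_shipments (shipment_id TEXT PRIMARY KEY, qty INTEGER);
CREATE TABLE tms_shipments (shipment_id TEXT PRIMARY KEY, qty INTEGER);
INSERT INTO wms_shipments VALUES ('S1', 10), ('S2', 5), ('S3', 7);
INSERT INTO tms_shipments VALUES ('S1', 10), ('S2', 4);
""")

# Reconciliation: shipments missing from the TMS or with mismatched quantities.
discrepancies = con.execute("""
    SELECT w.shipment_id,
           w.qty AS wms_qty,
           t.qty AS tms_qty
    FROM wms_shipments w
    LEFT JOIN tms_shipments t USING (shipment_id)
    WHERE t.shipment_id IS NULL OR w.qty <> t.qty
    ORDER BY w.shipment_id
""").fetchall()

print(discrepancies)  # [('S2', 5, 4), ('S3', 7, None)]
```

The same join-based pattern extends to ERP or IoT sources by adding further `LEFT JOIN` legs, one per system being reconciled.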

JOB SPECIFIC REQUIREMENTS

  • Minimum Requirements (Experience and Qualifications)
  • Bachelor’s or Master’s degree in Computer Science or a related field
  • Microsoft Certified Azure Data Engineer Associate
  • Google Professional Data Engineer
  • AWS Certified Data Analytics
  • Certifications in BI or analytics tools such as Power BI, Tableau or SQL
  • 3 to 5 years’ experience in data engineering, preferably in the logistics or supply chain sector
  • Required Knowledge
  • Development of ELT and ETL pipelines using tools such as Apache Airflow, SSIS or Azure Data Factory
  • Data integration across logistics systems including ERP, WMS, TMS and IoT
  • Data modelling, schema design and SQL optimisation
  • Data warehousing concepts including star and snowflake schemas
  • Version control and CI/CD pipelines for data products
  • Supply chain and logistics data structures and flows
  • Required Skills
  • Advanced proficiency in SQL and Python or another scripting language
  • Strong debugging, problem-solving and performance tuning skills
  • Data validation, cleansing and transformation techniques
  • Building scalable and reusable data pipelines
  • Working knowledge of cloud based data platforms such as Azure, AWS or GCP
  • Communication skills
  • Required Competencies
  • Ability to work under pressure
  • Time management
  • Collaboration
  • Problem solving
  • Attention to detail
  • Analytical thinking
  • Working in an agile environment

ADDITIONAL NOTES OF IMPORTANCE

  • Operate in a safe manner, complying with all Health, Safety, Quality and Environmental requirements to ensure own safety and the safety of others
  • Compliance with Good Distribution and Good Documentation Practice guidelines as per the South African Health Products Regulatory Authority (SAHPRA)
  • All handling of Pharmaceutical Goods and Medical Devices must be in accordance with the operational requirements of the Business and the regulatory requirements of the relevant statutory bodies in South Africa as per the Medicines and Related Substances Act 101 of 1965 as amended

DEFINITION

  • Designing, constructing and maintaining data infrastructure
  • Functional Operations
  • Refers to the day-to-day functional delivery requirements of the role
  • Check overnight ETL jobs and resolve any pipeline errors
  • Review logs from a new IoT tracking feed integration
  • Optimise SQL queries for delivery performance dashboard
  • Deploy updated pipeline to automate inbound shipment data
  • Document new data model changes for inventory tracking
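One of the day-to-day tasks above, optimising SQL queries for a delivery performance dashboard, often comes down to indexing. A minimal sketch with SQLite, using its `EXPLAIN QUERY PLAN` to show the before/after access path (the `deliveries` table and index name are illustrative):

```python
import sqlite3

# Hypothetical deliveries table backing a delivery-performance dashboard.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE deliveries (id INTEGER PRIMARY KEY, route TEXT, delivered_on TEXT)")
con.executemany("INSERT INTO deliveries (route, delivered_on) VALUES (?, ?)",
                [(f"R{i % 10}", f"2024-05-{i % 28 + 1:02d}") for i in range(1000)])

query = "SELECT COUNT(*) FROM deliveries WHERE route = 'R3'"

# Without an index the dashboard query scans the whole table...
before = con.execute("EXPLAIN QUERY PLAN " + query).fetchall()

# ...adding an index on the filtered column lets SQLite seek directly to the route.
con.execute("CREATE INDEX idx_deliveries_route ON deliveries (route)")
after = con.execute("EXPLAIN QUERY PLAN " + query).fetchall()

print(before[0][-1])  # e.g. "SCAN deliveries"
print(after[0][-1])   # e.g. "SEARCH deliveries USING COVERING INDEX idx_deliveries_route (route=?)"
```

The exact plan wording varies between SQLite versions, but the scan-to-index-search shift is the improvement the "query performance" KPIs above would capture.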

Desired Skills:

  • Communications
  • Information Technology
  • SQL
