• Hybrid working model opportunity

We help our clients understand the rich data we collect from the traditional retail market in South Africa and across Africa more broadly. The ideal candidate will have a strong analytical and mathematical background to help drive value from our data. This person will wear many hats in the role, but much of the focus will be on building out our Python ETL processes and writing superb SQL.

In addition, the candidate will be responsible for maintaining and building out our reporting DB. Beyond technical prowess, the data engineer will need soft skills for clearly communicating highly complex data trends to organizational leaders and to clients. We’re looking for someone willing to jump right in and make an impact in the traditional retail market through powerful data pipelines.
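
For illustration only, here is a minimal sketch of the kind of Python ETL step this role involves: extract raw sales records, clean and aggregate them with pandas, and load the result into a PostgreSQL reporting table. The file name, table names, columns, and connection string are all hypothetical placeholders, not a description of our actual pipelines.

  import pandas as pd
  from sqlalchemy import create_engine

  # Hypothetical source file and reporting-DB connection string (placeholders only).
  SOURCE_CSV = "raw_outlet_sales.csv"
  REPORTING_DB = "postgresql+psycopg2://user:password@localhost:5432/reporting"

  def run_etl() -> None:
      # Extract: read raw sales records captured from traditional retail outlets.
      raw = pd.read_csv(SOURCE_CSV, parse_dates=["sale_date"])

      # Transform: drop incomplete rows and aggregate to daily totals per outlet.
      clean = raw.dropna(subset=["outlet_id", "amount"])
      daily = (
          clean.groupby(["outlet_id", clean["sale_date"].dt.date])["amount"]
          .sum()
          .reset_index(name="daily_sales")
      )

      # Load: append the aggregated rows to the reporting table.
      engine = create_engine(REPORTING_DB)
      daily.to_sql("daily_outlet_sales", engine, if_exists="append", index=False)

  if __name__ == "__main__":
      run_etl()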

How to Apply:
For your application to be considered, please email your CV and Academic Transcript to [Email Address Removed] – only candidates with suitable Software Development experience will be contacted.

Requirements:

  • Bachelor’s degree in computer science, information technology, engineering, or related analytical discipline required.
  • Three or more years of experience with Python, SQL, and data visualization/exploration tools (e.g. Power BI, Tableau).
  • Familiarity with common Python-based ETL tools such as PySpark or Apache Airflow (a minimal Airflow sketch appears after this list).
  • Familiarity with Kimball & Inmon data warehousing approaches.
  • Familiarity with the AWS ecosystem, specifically Redshift, RDS & EC2.
  • Familiarity with PostgreSQL preferred.
  • Communication skills, especially for explaining technical concepts to nontechnical business leaders.
  • Ability to work on a dynamic, results-oriented team that has concurrent projects and priorities.
  • Work with data to solve business problems, building and maintaining the infrastructure to answer questions and improve processes.
  • Help streamline our data science and analytics workflows by improving data delivery and quality to internal and external stakeholders.
  • Work closely with the data science and business intelligence teams to develop data models and pipelines for research, reporting, and machine learning.
  • Be an advocate for best practices and continued learning, constantly challenging the status quo and striving for excellence.
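
To illustrate the Python-based ETL tooling mentioned above, here is a minimal Apache Airflow sketch of a daily extract-transform-load pipeline (assuming Airflow 2.x; the DAG ID, task names, and callables are hypothetical).

  from datetime import datetime

  from airflow import DAG
  from airflow.operators.python import PythonOperator

  # Hypothetical task callables; in a real pipeline these would pull from source
  # systems, clean the records, and write them to the reporting DB.
  def extract() -> None:
      print("extracting raw retail data")

  def transform() -> None:
      print("cleaning and aggregating")

  def load() -> None:
      print("loading into the reporting DB")

  # A daily DAG wiring the three steps together in order.
  with DAG(
      dag_id="retail_sales_etl",
      start_date=datetime(2024, 1, 1),
      schedule="@daily",  # use schedule_interval on Airflow versions before 2.4
      catchup=False,
  ) as dag:
      extract_task = PythonOperator(task_id="extract", python_callable=extract)
      transform_task = PythonOperator(task_id="transform", python_callable=transform)
      load_task = PythonOperator(task_id="load", python_callable=load)

      extract_task >> transform_task >> load_task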

Responsibilities:

  • Work closely with our development team, data analysts and BI analysts to help build and maintain data flows that support our reporting requirements.
  • Use agile software development processes to make iterative improvements to our back-end systems, particularly our reporting DB.
  • Model front-end and back-end data sources to help draw a more comprehensive picture of user flows throughout the system and to enable powerful data analysis.
  • Build data pipelines that clean, transform, and aggregate data from disparate sources and deliver quality, usable data to data analysts and BI analysts for reporting (see the example query sketch below).
  • Develop models that can be used to make predictions and answer questions for the overall [URL Removed]

Skills and qualifications:
  • Experience in building or maintaining ETL processes.
  • Professional cloud certifications.
  • Experience as a data analyst / BI analyst.
  • Database administration experience is a bonus.
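
For illustration, a sketch of the kind of reporting SQL the role calls for, run from Python against the PostgreSQL reporting DB. The schema is a hypothetical Kimball-style star (a fact_sales fact table and a dim_outlet dimension) and the connection details are placeholders.

  import psycopg2

  # Placeholder connection details for the PostgreSQL reporting DB.
  REPORTING_DSN = "dbname=reporting user=report_user host=localhost"

  # Hypothetical reporting query: monthly sales per region, joining the fact
  # table to an outlet dimension in a Kimball-style star schema.
  MONTHLY_SALES_SQL = """
      SELECT d.region,
             date_trunc('month', f.sale_date) AS sale_month,
             SUM(f.amount)                    AS total_sales
      FROM   fact_sales f
      JOIN   dim_outlet d ON d.outlet_id = f.outlet_id
      GROUP  BY d.region, date_trunc('month', f.sale_date)
      ORDER  BY sale_month, d.region;
  """

  def fetch_monthly_sales():
      # Open a connection, run the aggregation, and return all rows.
      with psycopg2.connect(REPORTING_DSN) as conn:
          with conn.cursor() as cur:
              cur.execute(MONTHLY_SALES_SQL)
              return cur.fetchall()

  if __name__ == "__main__":
      for region, month, total in fetch_monthly_sales():
          print(region, month, total)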

Desired Skills:

  • Python
  • SQL
  • data visualization
  • data exploration tools
  • ETL Tools
  • AWS ecosystem
  • PostgreSQL
  • Kimball & Inmon data warehousing

Desired Work Experience:

  • 1 to 2 years

Desired Qualification Level:

  • Degree
