Key Duties and Responsibilities:
- Architect robust ETL data pipelines that integrate diverse data sources.
- Design and build data layers to support big data analytics.
- Develop, test, and maintain high-quality data pipelines and data layers.
- Deliver high-quality solutions that add measurable business value.
- Conduct peer reviews and uphold SDLC and documentation standards.
- Maintain comprehensive technical documentation.
- Understand business needs and analyse client requirements.
- Identify and visualise relevant data to support robust solutions.
Minimum Requirements:
- Bachelor’s degree in Computer Science or IT.
- 4+ years of ETL and data warehouse experience.
- Experience with AWS, Azure, or GCP.
- Experience with PostgreSQL, MySQL, and SQL Server.
- Data analysis and data modelling experience (essential).
- Scripting experience in Python, Java, and Korn shell.
- Experience with version control systems.
- Experience working within an Agile methodology.
Desired Skills:
- Data Engineer
- Data Modelling
- Scripting
- Python
- Cloud
- ETL
Desired Qualification Level:
- Degree