Can You Build the Superhighway of Data?
We’re searching for a Data Engineer to design pipelines that transport information with speed and precision. If ETL, data lakes, and SQL are your playground, apply now and let’s drive data excellence together!
Data Analysts:
Manage a team of data analysts, including their performance management and delivery.
Data Maintenance & Quality:
Maintain, improve, clean, and manipulate data in operational and analytics databases.
Data Infrastructure:
Build and manage scalable, optimized, secure, and reliable data infrastructure using technologies such as:
- Infrastructure and Databases: DB2, PostgreSQL, MSSQL, HBase, NoSQL.
- Data Lake Storage: Azure Data Lake Storage Gen2.
- Cloud Solutions: SAS, Azure Databricks, Azure Data Factory, HDInsight.
- Data Platforms: SAS, Ab Initio, Denodo, Netezza, Azure Cloud.
Collaborate with Information Security, the CISO, and Data Governance to ensure data privacy and security.
Data Pipeline Development:
- Build and maintain data pipelines for ingestion, provisioning, streaming, and APIs.
- Integrate data from multiple sources (Golden Sources, Trusted Sources, Writebacks).
- Load data into the Nedbank Data Warehouse (Data Reservoir, Atomic Data Warehouse, Enterprise Data Mart).
- Provision data to Lines of Business Marts, Regulatory Marts, and Compliance Marts through self-service data virtualization.
- Ensure consistency in data transformation for reporting and analysis.
- Utilize big data tools like Hadoop, streaming tools like Kafka, and data replication tools like IBM InfoSphere.
- Utilize data integration tools such as Ab Initio, Azure Data Factory, and Azure Databricks (see the sketch after this list for a flavour of this work).
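For illustration only, here is a minimal Python sketch of the batch side of this work: extract, apply one consistent transformation, and load. Everything named here is hypothetical (transactions.csv, the stg_transactions table, and the account_id/txn_date/amount columns are invented for the example), and sqlite3 stands in for an actual warehouse target.

import csv
import sqlite3

def extract(path):
    # Read raw records from a hypothetical source extract (CSV).
    with open(path, newline="") as f:
        yield from csv.DictReader(f)

def transform(row):
    # One shared transformation so every consumer sees the same
    # cleaned values (trimmed identifiers, typed amounts).
    return (
        row["account_id"].strip(),
        row["txn_date"],
        round(float(row["amount"]), 2),
    )

def load(rows, db_path="warehouse.db"):
    # Load transformed rows into a staging table; sqlite3 is only a
    # stand-in for the real warehouse here.
    con = sqlite3.connect(db_path)
    con.execute(
        "CREATE TABLE IF NOT EXISTS stg_transactions "
        "(account_id TEXT, txn_date TEXT, amount REAL)"
    )
    con.executemany("INSERT INTO stg_transactions VALUES (?, ?, ?)", rows)
    con.commit()
    con.close()

if __name__ == "__main__":
    load(transform(r) for r in extract("transactions.csv"))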
Data Modelling:
Collaborate with Data Modelers to create data models and schemas for the Data Reservoir, Data Lake, Atomic Data Warehouse, and Enterprise Data Marts.
Automation & Monitoring:
Automate data pipelines and monitor their performance for efficiency and effectiveness.
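As a sketch of what "automate and monitor" can look like in practice, the Python wrapper below times each step, logs failures, and retries with backoff so scheduled jobs fail loudly rather than silently. The step names in the commented usage are hypothetical.

import logging
import time

logging.basicConfig(level=logging.INFO,
                    format="%(asctime)s %(levelname)s %(message)s")
log = logging.getLogger("pipeline")

def run_step(name, fn, retries=3, backoff_seconds=5):
    # Run one pipeline step, logging its duration and retrying on
    # failure with a growing backoff between attempts.
    for attempt in range(1, retries + 1):
        start = time.monotonic()
        try:
            result = fn()
            log.info("%s succeeded in %.1fs", name, time.monotonic() - start)
            return result
        except Exception:
            log.exception("%s failed (attempt %d/%d)", name, attempt, retries)
            if attempt == retries:
                raise
            time.sleep(backoff_seconds * attempt)

# Hypothetical usage; each step is a callable defined elsewhere:
# run_step("ingest", ingest_daily_extract)
# run_step("load", load_to_warehouse)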
Collaboration:
Work closely with Data Analysts, Software Engineers, Data Modelers, Data Scientists, Scrum Masters, and Data Warehouse teams to deliver end-to-end data solutions that bring value to the business.
Data Quality & Governance:
Implement data quality checks to maintain data accuracy, consistency, and security throughout the data lifecycle.
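A minimal sketch of such checks, assuming hypothetical field names (account_id, txn_date, amount): flag missing required fields and duplicate business keys, and report the failures in human-readable form.

def check_quality(rows, required=("account_id", "amount")):
    # Return a list of data quality failures: missing required
    # fields and duplicate (account_id, txn_date) business keys.
    failures = []
    seen = set()
    for i, row in enumerate(rows):
        for field in required:
            if not row.get(field):
                failures.append(f"row {i}: missing {field}")
        key = (row.get("account_id"), row.get("txn_date"))
        if key in seen:
            failures.append(f"row {i}: duplicate key {key}")
        seen.add(key)
    return failures

sample = [
    {"account_id": "A1", "txn_date": "2024-01-01", "amount": 10.0},
    {"account_id": "",   "txn_date": "2024-01-01", "amount": 5.0},
    {"account_id": "A1", "txn_date": "2024-01-01", "amount": 10.0},
]
for failure in check_quality(sample):
    print(failure)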
Performance Optimization:
Ensure the optimal performance of data warehouses, data integration patterns, and real-time/batch jobs.
API Development:
Develop APIs for efficient data delivery and ensure seamless integration between data consumers and providers.
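To give a flavour of this responsibility, here is a tiny read-only endpoint using only Python's standard library http.server, which stands in for a production API framework. The route and the in-memory balances are hypothetical; a real endpoint would query the warehouse and give consumers a stable contract instead of direct database access.

import json
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical in-memory data; a real endpoint would query the warehouse.
BALANCES = {"A1": 1042.50, "A2": 87.10}

class DataAPI(BaseHTTPRequestHandler):
    def do_GET(self):
        # Serve GET /balances/<account_id> as JSON.
        parts = self.path.strip("/").split("/")
        if len(parts) == 2 and parts[0] == "balances" and parts[1] in BALANCES:
            body = json.dumps({"account_id": parts[1],
                               "balance": BALANCES[parts[1]]})
            self.send_response(200)
        else:
            body = json.dumps({"error": "not found"})
            self.send_response(404)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body.encode())

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), DataAPI).serve_forever()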
Essential Qualifications – NQF Level
- Matric / Grade 12 / National Senior Certificate
- Advanced Diplomas/National 1st Degrees
Preferred Qualification
- BCom, BSc, or BEng in a related field.
Essential Certifications
- Cloud (Azure/AWS), DevOps, or Data Engineering certifications.
Preferred Certifications
- Azure Data Factory
- Azure Synapse Analytics
- Event Hub
- Microsoft Fabric
- AZ-900 certification (Microsoft Azure Fundamentals)
- Data Science certifications (e.g., Coursera, Udemy, SAS Data Scientist certification, Microsoft Data Scientist).
Minimum Experience Level
- Total Experience Required: 3 – 6 years.
Type of Experience:
- Ability to work independently within a squad.
- Experience designing, building, and maintaining data warehouses and data lakes.
- Hands-on experience with big data technologies (Hadoop, Spark, Hive).
- Programming experience with Python, Java, SQL.
- Knowledge of relational and NoSQL databases.
- Experience with cloud platforms like AWS, Azure, and GCP.
- Familiarity with data visualization tools.
- Analytical and problem-solving skills.
Technical / Professional Knowledge
- Cloud Data Engineering (Azure, AWS, Google Cloud) – Intermediate, Proficiency Level 3
- Data Warehousing – Progressive, Proficiency Level 4
- Databases (PostgreSQL, MS SQL, IBM DB2, HBase, MongoDB) – Progressive, Proficiency Level 4
- Programming (Python, Java, SQL) – Progressive, Proficiency Level 4
- Data Analysis and Data Modelling – Intermediate, Proficiency Level 3
- Data Pipelines and ETL tools (Ab Initio, Azure Databricks, Azure Data Factory, SAS ETL) – Intermediate, Proficiency Level 3
- Agile Delivery – Progressive, Proficiency Level 4
- Problem-Solving Skills – Progressive, Proficiency Level 4
Behavioural Competencies
- Decision Making
- Influencing
- Communication
- Innovation
- Technical/Professional Knowledge and Skills
- Building Partnerships
- Continuous Learning
Hit apply today!
Desired Skills:
- Python
- SQL
- AWS
- Azure
- SAS
- Ab Initio
Desired Work Experience:
- 2 to 5 years