Purpose:

Develop and maintain our data storage platforms and specialised data pipelines to support the company’s Technology Operations.

Operational Delivery:

  • Develop and maintain LakeHouse environments.
  • Develop data engineering and data analysis pipelines and solutions in the cloud (Azure).
  • Ensure DevOps compliance for all data pipelines and workloads.
  • Develop and maintain PowerBI reports.
  • Document pipelines and databases.
  • Provide troubleshooting and support to our Technical Operations team when dealing with production issues.
  • Contribute to application and systems design in collaboration with the Software Architects.
  • Provide mentorship and support to junior data engineers.

Qualifications:

B.Sc. (Electronic and/or Computer Engineering) or a similar qualification.

Experience:

  • Minimum of 5 years' experience in a data engineering environment.

Competencies:

  • Strong SQL skills.
  • Databricks experience (minimum 1 year), preferably with certifications.
  • Python and PySpark experience.
  • DataLake experience, including practical development and maintenance of a LakeHouse architecture.
  • Azure Data Factory (ADF) experience or similar.
  • Experience in writing technical documentation (architectural diagrams, release notes, etc.).
  • Good interpersonal and communication skills.

Additional Nice-to-have Skills:

  • DataWarehouse experience, especially Azure Synapse
  • Data science and AI/ML experience
  • NoSQL database experience (e.g. CosmosDB, Couchbase, MongoDB)

Desired Skills:

  • Systems Analysis
  • Complex Problem Solving
  • Programming/Configuration
  • Critical Thinking
  • Time Management
