We are seeking a highly skilled Senior Data Engineer to join a dynamic team within a leading banking environment. This role is ideal for someone passionate about building scalable, high-performance data platforms and delivering robust data solutions that drive business value.

You will play a key role in designing, developing, and optimising modern data architectures, working with cutting-edge technologies such as Microsoft Fabric, Azure Data Factory, and Databricks.

Key Responsibilities

  • Translate business, architectural, and data requirements into scalable technical solutions
  • Design and build metadata-driven data ingestion pipelines using Azure Data Factory and Databricks
  • Develop and maintain enterprise-grade Data Warehouses (Kimball methodology)
  • Build and model data products using Databricks and Microsoft Fabric
  • Implement end-to-end data engineering solutions using IDX templates into ODP (One Data Platform)
  • Drive DevOps best practices, including CI/CD and automation
  • Perform unit testing, integration testing, and debugging to ensure high-quality deployments
  • Design and manage Azure infrastructure components and templates
  • Develop and maintain documentation, governance standards, and best practices
  • Collaborate with business stakeholders to understand and deliver on data requirements
  • Apply data governance and engineering standards to ensure high-quality, secure data products
  • Participate actively in data engineering and modelling communities of practice
  • Support operational processes and shared team responsibilities

Required Skills & Experience

  • 6+ years’ experience as a Data Engineer / Platform Engineer
  • Strong experience with Microsoft Fabric (Lakehouse, Warehouses, Pipelines, Notebooks, Semantic Models)
  • Hands-on experience with Azure Data Factory, Databricks, and Azure Synapse Analytics
  • Expertise in Apache Spark for large-scale data processing
  • Strong SQL skills (T-SQL) and data analysis capabilities
  • Experience with real-time data streaming (Azure Event Hubs, Stream Analytics)
  • Solid understanding of ETL design and optimisation
  • Experience with data governance tools (Unity Catalog, Microsoft Purview)
  • Knowledge of Data Warehouse methodologies (Kimball, Data Vault 2.0)
  • Proficiency in Python, C#, and SQL
  • Experience with Azure DevOps, CI/CD pipelines, and Infrastructure-as-Code (Bicep, ARM, CLI, PowerShell, Bash)
  • Understanding of data mesh architectures
  • Familiarity with Azure AD security, authentication, and authorization
  • Agile delivery experience

Qualifications

  • Bachelor’s Degree in Computer Science or related field (or equivalent experience)
  • Mandatory: One or more Microsoft Azure certifications (AZ-900, DP-203, DP-600, or DP-700)

Desired Skills

  • Azure Data Factory
  • Databricks
  • Microsoft Fabric
  • Apache Spark

Desired Qualification Level

  • Degree
