- Design, develop and maintain robust data pipelines and ETL/ELT processes using Azure Data Factory, Databricks Pipelines and related services.
- Create and optimise data models and data warehousing solutions to support reporting and analytics needs.
- Build high-quality interactive reports and dashboards; translate business requirements into insightful visualisations.
- Work closely with business stakeholders to gather requirements, define KPIs and deliver actionable analytics.
- Implement and enforce data governance, data quality checks and best practices across datasets and pipelines.
- Develop SQL scripts, stored procedures and Python/PySpark code for data transformation and analysis.
- Collaborate with data engineers, data scientists and platform teams to integrate analytical solutions into the wider data platform.
- Monitor and tune the performance of queries, data loads and data storage to ensure cost-efficient operations.
- Document data models, pipeline designs, data dictionaries and runbooks for handover and operational support.
- Support data ingestion from diverse sources, including APIs, databases and streaming platforms.
- Contribute to automation and CI/CD practices for data solution deployments using Git.
Minimum Requirements:
Education:
- Bachelor’s degree in Computer Science, Data Science, Statistics, Business Analytics or a related field (or equivalent practical experience)
- At least 5 years’ hands-on experience in data analysis, data engineering or analytics roles, with significant Azure exposure
- Demonstrable experience delivering production-ready data solutions, dashboards and documentation in an enterprise environment
Knowledge:
- Strong experience with Azure data services (ADF, Azure SQL, ADLS, Azure Data Explorer/Kusto)
- Advanced SQL skills for data extraction, transformation, and analysis
- Proven experience building ETL/ELT pipelines using Azure Data Factory or Databricks
- Expertise in dimensional modelling and data model design (star/snowflake schemas)
- Experience with data visualisation tools (Power BI, Tableau, Celonis) and interactive dashboards
- Solid understanding of data warehousing best practices, including partitioning and indexing
- Strong analytical skills with the ability to translate business requirements into data solutions
- Knowledge of data governance, data quality, and metadata management
- Proficient in Python and PySpark for data engineering tasks
- Strong communication skills, with the ability to present insights to technical and non-technical stakeholders
Desired Skills:
- Data Scientist
- Python
- SQL
- Azure
- Analytics