- Develops and maintains scalable data pipelines and builds out new integrations to support continuing increases in data volume and complexity.
- Collaborates with analytics and business teams to improve data models that feed business intelligence tools, increasing data accessibility and fostering data-driven decision making across the organization.
- Implements processes and systems to monitor data quality, ensuring production data is always accurate and available for key stakeholders and business processes that depend on it.
- Writes unit/integration tests, contributes to data engineering wiki, and documents work.
- Performs the data analysis required to troubleshoot data-related issues and assists in their resolution.
- Works closely with a team of frontend and backend engineers, product managers, and analysts.
- Defines company data assets (data models) and the data pipelines that populate them.
- Designs data integration solutions and data quality approaches for those solutions.
- Designs and evaluates open-source and vendor tools for data lineage.
- Works closely with all business units and engineering teams to develop a strategy for the long-term data platform architecture.
- Maintains technical and operational metadata during data solution development.
- Has experience in advanced query performance tuning and query optimisation.
- Develops complex SQL to implement business logic.
- Translates functional requirements into high-level designs and builds technical specification documents.
- Works with internal and external stakeholders to assist with data-related technical issues and to support data infrastructure needs.
Minimum Requirements:
Qualifications:
• Diploma in an IT-related field or equivalent qualification; or
• BTech in an IT-related field or equivalent qualification
• 4 – 10 years' practical experience as a Data Engineer / ETL Developer
• Experience in the following technologies:
- 3 years’ experience in Data Factory
- 2 years’ experience in Azure Synapse
- 3 years’ experience in Azure SQL
- 2 years’ experience in PowerBI
- 1 year’s experience in Databricks
- 5 years’ experience in SQL
• Experience in developing and building data pipelines
• Experience in troubleshooting and debugging of data pipelines
Advantageous:
• Experience in Agile and DevOps squads
• Experience in Informatica
• Experience in Teradata
• Experience in SAP Data Services
• Knowledge of Data Warehousing principles
• Working knowledge of Data Integration technologies
• Knowledge of scripting languages
• Working knowledge of data visualisation libraries
• Solid understanding of the underlying infrastructure, including data integration tools, ETL/ELT processes, data formats and warehouse architecture
• Good knowledge of databases
COVID-19 vaccination certificate required.
Desired Skills:
- Data Engineer / ETL Developer
- Data Factory
- SQL
- Azure SQL
- PowerBI
- Data Warehousing
- ITIL