Our client is seeking an experienced Intermediate Data Engineer to join their team. This role is key to building and maintaining scalable data infrastructure to support analytics, business intelligence, and operational systems. The successful candidate will have solid technical expertise in Python and SQL, with proven experience working in cloud environments and designing efficient data pipelines.
Key Requirements:
- Bachelor’s degree in Computer Science, IT, Engineering, or a related field (or equivalent practical experience)
- 2-4 years in a similar data engineering role, with hands-on experience in pipeline development and optimization
- Strong proficiency in Python and SQL for data manipulation and engineering
- Experience with ETL/ELT tools (e.g., Apache Airflow, Talend)
- Familiarity with cloud platforms (e.g., AWS, Azure, or Google Cloud)
- Knowledge of Big Data frameworks such as Hadoop, Spark, or ClickHouse
- Solid understanding of data modeling and schema design
Should you meet the requirements for this position, please email your CV to [Email Address Removed]. You can also contact the IT team on [Phone Number Removed] or visit our website at [URL Removed]. NOTE: When replying to the advert, please include the reference number in the subject line. Correspondence will only be conducted with shortlisted candidates. Should you not hear from us within 3 days, please consider your application unsuccessful.
Desired Skills:
- Python
- SQL
- ETL
- Hadoop