2061 – AWS Data Engineer (Expert)
Contract term: Immediate – 31 December 2025
ESSENTIAL SKILLS REQUIREMENTS:
Above-average experience/understanding of the following technical skills/technologies (in order of importance):
- Terraform
- Python 3.x
- SQL – Oracle/PostgreSQL
- PySpark
- Boto3
- ETL
- Docker
- Linux / Unix
- Big Data
- PowerShell / Bash
- Cloud Data Hub (CDH)
- CDEC Blueprint
Basic experience/understanding of AWS Components (in order of importance):
- Glue
- CloudWatch
- SNS
- Athena
- S3
- Kinesis (Kinesis Data Streams, Kinesis Data Firehose)
- Lambda
- DynamoDB
- Step Functions
- Parameter Store
- Secrets Manager
- CodeBuild / CodePipeline
- CloudFormation
- Business Intelligence (BI) Experience
- Technical data modelling and schema design (“not drag and drop”)
- Kafka
- AWS EMR
- Redshift
Experience working with enterprise collaboration tools such as Confluence, JIRA, etc.
Experience developing technical documentation and artefacts.
Knowledge of data formats such as Parquet, Avro, JSON, XML, CSV, etc.
Experience working with Data Quality Tools such as Great Expectations.
Experience developing and working with REST APIs is a bonus.
Basic experience in Networking and troubleshooting network issues.
Knowledge of the Agile Working Model.
WHICH QUALIFICATIONS/EXPERIENCE DO WE NEED FOR THE ROLE?
Relevant IT / Business / Engineering Degree
Certifications:
Candidates with one or more of the following certifications are preferred:
- AWS Certified Cloud Practitioner
- AWS Certified SysOps Administrator – Associate
- AWS Certified Developer – Associate
- AWS Certified Solutions Architect – Associate
- AWS Certified Solutions Architect – Professional
- HashiCorp Certified: Terraform Associate
Desired Skills:
- Terraform
- Python 3.x
- Boto3
- PySpark
- CDEC
- SQL – Oracle/PostgreSQL