We have a fantastic opportunity for a Senior Data Engineer with strong AWS experience to join a brilliant team of engineers who work for one of the biggest and best automotive giants in the world.

The ideal candidate should have:

  • At least 6-8 years of cloud architecture and reporting technology experience
  • Extensive experience in implementing and monitoring solutions
  • Experience in testing (manual or automated testing)
  • Web and digital project experience advantageous
  • Agile working experience advantageous
  • IT/Business Degree

Technical Skills Include:

  • Expertise in ETL optimization, designing, coding, and tuning big data processes using Apache Spark.
  • Experience with building data pipelines and applications to stream and process datasets at low latencies.
  • Efficiency in handling data: tracking data lineage, ensuring data quality, and improving the discoverability of data.
  • Sound knowledge of distributed systems and data architecture (e.g. the Lambda architecture): designing and implementing batch and stream data processing pipelines, and optimising the distribution and partitioning of high-level data structures.
  • Experience designing and supporting large-scale distributed systems in a production environment.
  • Solid understanding of core AWS components (VPC | IAM), and above-average experience with/understanding of the following AWS services:
  • Lambda
  • DynamoDB
  • Parameter Store
  • Secrets Manager
  • Athena
  • Glue
  • CloudWatch
  • Step Functions
  • SNS
  • CodeBuild / CodePipeline
  • CloudFormation
  • S3

Strong experience/understanding of:

  • Python 3.x
  • SQL
  • PySpark
  • Boto3
  • Terraform
  • ETL
  • Docker
  • Linux / Unix
  • Big Data
  • Oracle/PostgreSQL
  • PowerShell / Bash
  • Experience working in Agile SDLC methodology.
  • Working experience building data/ETL pipelines and data warehouses.
  • Demonstrated expertise in data modelling for SQL and NoSQL databases.
  • Exceptional analytical skills for analysing large and complex data sets.
  • Ability to perform thorough testing and data validation to ensure the accuracy of data transformations.
  • Strong written and verbal communication skills, with precise documentation.
  • Self-driven team player with the ability to work independently and multi-task.
  • Must be an analytical and creative thinker and an innovative problem solver.
  • Experience building data pipelines using AWS Glue, AWS Data Pipeline, or similar platforms.
  • Familiarity with data stores such as AWS S3, and AWS RDS or DynamoDB.
  • Experience and solid understanding of various software design patterns.
  • Experience preparing specifications from which programs will be written, designed, coded, tested and debugged.
  • Experience working with a distributed team.
  • Strong organizational skills.


  • Certification: AWS Certified Developer Associate / Solutions Architect
  • Bitbucket / Git
  • Jira / Confluence
  • Familiar with data streaming services such as Apache Kafka, Amazon Kinesis, or similar tools
  • CI/CD tools (Nexus / Jenkins).

Apply today for more information and let's get those applications across!

Desired Skills:

  • CI
  • CD
  • Data Engineer
  • AWS
  • Git
  • Bitbucket
  • Kafka
  • Glue
  • S3
  • Athena
  • Lambda
  • SNS
  • CloudFormation
  • Step Functions
  • Secrets Manager
  • Parameter Store
  • DynamoDB
  • SQL
  • Linux
  • ETL
  • Terraform
  • Big Data
  • Boto3
  • PySpark
  • Oracle

Desired Qualification Level:

  • Diploma

Learn more/Apply for this position