We are looking for a Senior Data Engineer / Cloud Architect with expertise in AWS development and ETL to join the Mobility Analytics team of cloud developers [URL Removed]. This is a contract until Dec 2024 (3 years), renewable annually thereafter.

  • Applies advanced knowledge of the area
  • Manages projects / processes
  • Ability to develop in a specific sought-after programming language
  • Strong working knowledge of software development tools, techniques and approaches used to build application solutions
  • Knowledge of cloud computing technologies, business drivers and emerging computing trends
  • Understanding of integration between different technologies
  • Coordination between development and support environments
  • Assisting with the business case
  • Planning and monitoring
  • Eliciting requirements
  • Requirements organisation
  • Translating and simplifying requirements
  • Requirements management and communication
  • Requirements analysis
  • Documenting requirements in the appropriate format, depending on the methodology followed
  • Assisting with the identification and management of risks

Minimum Requirements

  • At least 10-12 years of cloud architecture and reporting technology experience
  • Extensive experience in implementing and monitoring solutions
  • Experience in testing (manual or automated testing)
  • Web and digital project experience advantageous
  • Agile working experience advantageous

We are looking for a Senior AWS Developer Associate to join our Mobility Analytics team of cloud developers

  • Expertise in ETL optimization, designing, coding, and tuning big data processes using Apache Spark.
  • Experience with building data pipelines and applications to stream and process datasets at low latencies.
  • Efficiency in handling data: tracking data lineage, ensuring data quality, and improving the discoverability of data.
  • Sound knowledge of distributed systems and data architecture (Lambda architecture): able to design and implement batch and stream data processing pipelines, and to optimise the distribution and partitioning of high-level data structures.
  • Experience designing and supporting large-scale distributed systems in a production AWS environment.
  • Solid understanding of core components (VPC, IAM), and above-average experience with / understanding of the following AWS components:
  • Lambda
  • DynamoDB
  • Systems Manager Parameter Store
  • Secrets Manager
  • Athena
  • Glue
  • CloudWatch
  • Step Functions
  • SNS
  • CodeBuild / CodePipeline
  • CloudFormation
  • S3

Strong experience/understanding of:

  • Python 3.x
  • SQL
  • PySpark
  • Boto3
  • Terraform
  • ETL
  • Docker
  • Linux / Unix
  • Big Data
  • Oracle/PostgreSQL
  • PowerShell / Bash
  • Experience working in an Agile SDLC methodology.
  • Working experience building data/ETL pipelines and data warehouses.
  • Demonstrated expertise in data modelling for SQL and NoSQL databases.
  • Exceptional analytical skills in analysing large and complex data sets.
  • Ability to perform thorough testing and data validation to ensure the accuracy of data transformations.
  • Experience building data pipelines using AWS Glue, AWS Data Pipeline, or similar platforms.
  • Familiarity with data stores such as AWS S3, and AWS RDS or DynamoDB.
  • Experience with, and a solid understanding of, various software design patterns.

Beneficial:

  • Certification: AWS Certified Developer - Associate / AWS Certified Solutions Architect
  • Bitbucket / Git
  • Jira / Confluence
  • Familiarity with data streaming services such as Apache Kafka, Amazon Kinesis, or similar tools
  • CI/CD tooling (Nexus / Jenkins)

Learn more/Apply for this position