We’re looking for an Expert Data Engineer to architect, build, and optimize large-scale data pipelines that power mission-critical analytics across multiple business domains.

You’ll work with a modern cloud stack, automate everything possible, and keep data flowing smoothly from ingestion to production systems.

If AWS services, Terraform scripts, Spark jobs, and streaming architectures excite you, buckle up.

Requirements

  • Expert-level experience with: Terraform • Python 3.x • SQL (Oracle/PostgreSQL) • PySpark • Kafka • Docker • Linux/Unix • Big Data • PowerShell/Bash • ETL • Glue • S3 • Kinesis Data Streams • Lambda • DynamoDB • CloudWatch • Athena • CodePipeline • Data Hubs
  • Strong BI and data modelling experience
  • Experience with REST APIs and data formats (Parquet, JSON, Avro, XML, CSV)
  • Ability to analyze complex datasets and perform deep validation
  • Strong documentation and communication skills
  • Experience in Agile environments (Jira, Confluence)
  • Bonus: AWS certifications or Terraform Associate
  • Bonus: Experience with AWS Data Pipeline, Great Expectations, EMR, RDS

Desired Skills:

  • Terraform
  • Python
  • SQL
  • ETL
  • Linux
  • Unix
  • CloudWatch

Desired Qualification Level:

  • Degree
