• Demonstrate expertise in data modeling and Oracle SQL.
  • Exceptional analytical skills for working with large and complex data sets.
  • Perform thorough testing and data validation to ensure the accuracy of data transformations.
  • Experience in working with Enterprise Collaboration tools such as Confluence, JIRA, etc.
  • Experience developing technical documentation and artifacts.
  • Knowledge of data formats such as Parquet, Avro, JSON, XML, CSV, etc.
  • Experience working with Data Quality Tools such as Great Expectations.
  • Experience developing and working with REST APIs is a bonus.
  • Experience building data pipelines using AWS Glue, AWS Data Pipeline, or similar platforms.
  • Familiarity with data stores such as AWS S3, AWS RDS, and DynamoDB.
  • Experience with and a solid understanding of various software design patterns.
  • Experience preparing specifications from which programs will be written, designed, coded, tested, and debugged.

Minimum Requirements:

Education:

  • Relevant IT / Business / Engineering Degree
  • Candidates with one or more of the following certifications are preferred:
  • AWS Certified Cloud Practitioner / AWS Certified SysOps Administrator Associate / AWS Certified Developer Associate / AWS Certified Solutions Architect Associate / AWS Certified Solutions Architect Professional / HashiCorp Certified Terraform Associate

Role-specific knowledge:

Senior Knowledge:

  • Terraform
  • Python 3.x
  • SQL – Oracle/PostgreSQL
  • PySpark
  • Boto3
  • ETL
  • Docker
  • Linux / Unix
  • Big Data
  • PowerShell / Bash
  • AWS Cloud Data Hub & Blueprint

Basic Knowledge:

  • Glue
  • CloudWatch
  • SNS
  • Athena
  • S3
  • Kinesis (Kinesis Data Streams, Kinesis Data Firehose)
  • Lambda
  • DynamoDB

Desired Skills:

  • Data Engineer
  • Python
  • AWS
