Are you a skilled Data Engineer? A giant in the consulting space requires an AWS Data Engineer to design, implement, and maintain scalable data solutions on the AWS cloud platform. Work with diverse clients and gain valuable experience to take your career to the next level.

Key Responsibilities:

  • Build and maintain robust ETL/ELT pipelines using AWS Glue, Lambda, or Step Functions.
  • Automate and orchestrate data workflows to handle large-scale data processing.
  • Design and manage data warehouses with Amazon Redshift or similar tools.
  • Optimize data models and queries for performance and scalability.
  • Integrate structured and unstructured data from various sources into a unified platform.
  • Implement data transformation and cleansing processes to ensure data quality.
  • Deploy and monitor AWS data services such as S3, DynamoDB, RDS, and EMR.
  • Optimize AWS resources for performance and cost-efficiency.
  • Partner with data scientists, analysts, and business teams to understand data requirements.
  • Provide support for real-time and batch processing use cases.
  • Implement and monitor data security measures to ensure compliance with industry standards (e.g., GDPR, HIPAA).
  • Manage IAM roles, policies, and permissions for data access.

Experience:

  • Relevant IT Diploma or equivalent qualification.
  • 4+ years of hands-on experience in data engineering or related fields.
  • Proven expertise with AWS cloud services, including Redshift, Glue, Lambda, S3, and CloudWatch.
  • Proficiency in programming languages like Python, Java, or Scala.
  • Strong SQL skills for querying and optimizing relational databases.
  • Experience with distributed data processing frameworks such as Apache Spark or Hadoop.
  • Familiarity with data modeling, schema design, and data governance principles.
  • Knowledge of DevOps tools and practices, including CI/CD pipelines and version control (e.g., Git).
  • AWS certifications (e.g., AWS Certified Data Analytics – Specialty, AWS Certified Solutions Architect).
  • Experience with streaming data technologies like AWS Kinesis or Kafka.
  • Hands-on experience with Infrastructure as Code (IaC) tools like Terraform or CloudFormation.
  • Knowledge of machine learning frameworks and integration with AWS SageMaker.

Interested? Apply Now!

Desired Skills:

  • AWS
  • ETL
  • CI/CD