We have an opportunity for a mid-level AWS Data Engineer to join our client's Global Markets division. The role involves collaborating with leading financial institutions and contributing to the design and implementation of data pipelines that support the global markets business. You will work closely with traders, quantitative analysts, and data scientists to design, develop, and maintain data infrastructure and pipelines. The client is an Information Technology and Business Consulting company that provides highly specialised solutions to large and small enterprises in both the private and public sectors.
Requirements:
- A strong background in C# or Python programming and building data pipelines on AWS, especially with AWS Glue Jobs using PySpark or AWS Glue Spark.
- 5 years' Python/C# development experience.
- 3 years' AWS data engineering experience.
- Bachelor’s degree in Computer Science, Information Systems, or related field.
- Advantageous: AWS Certified Machine Learning – Specialty certification, or experience in developing machine learning (ML) models and automating ML pipelines.
Responsibilities:
- Creating data models that can be used to extract information from various sources and store it in a usable format.
- Lead the design, implementation, and successful delivery of large-scale, critical, or difficult data solutions involving a significant amount of data.
- Utilize expertise in SQL and have a strong understanding of ETL and data modelling.
- Ingest data into Amazon S3 and perform ETL into RDS or Redshift.
- Use AWS Lambda (C# or Python) for event-driven data transformations.
- Designing and implementing security measures to protect data from unauthorized access or misuse.
- Maintaining the integrity of data by designing backup and recovery procedures.
- Work on automating the migration process in AWS from development to production.
- You will deliver digestible, contemporary, and immediate data content to support and drive business decisions.
- You will be involved in all aspects of data engineering from delivery planning, estimating and analysis, all the way through to data architecture and pipeline design, delivery, and production implementation.
- From day one, you will be involved in the design and implementation of complex data solutions ranging from batch to streaming and event-driven architectures, across cloud, on-premises, and hybrid client technology landscapes.
- Optimize cloud workloads for cost, scalability, availability, governance, compliance, etc.
- Must have experience with AWS Glue Jobs using PySpark or AWS Glue Spark.
- Real-time ingestion using Kafka is an added advantage.
- Strong SQL and C# or Python programming knowledge.
- Object-oriented principles in C# or Python: classes and inheritance.
- Expert knowledge of data engineering packages and libraries and related functions in C# or Python.
- AWS technical certifications (Developer Associate or Solutions Architect).
- Experience with development and delivery of microservices using serverless AWS services (S3, RDS, Aurora, DynamoDB, Lambda, SNS, SQS, Kinesis, IAM).
- Ability to understand and articulate requirements to technical and non-technical audiences.
- Experience working with relational databases such as Postgres, SQL Server, and MySQL.
- Apply knowledge of scripting and automation using tools like PowerShell, Python, Bash, Ruby, Perl, etc.
- Stakeholder management and communication skills, including prioritising, problem solving and interpersonal relationship building.
- Effectively and efficiently troubleshoot data issues and errors.
- Strong experience in SDLC delivery, including waterfall, hybrid, and Agile methodologies.
- Experience delivering in an agile environment.
- Experience in implementing and delivering data solutions and pipelines on AWS Cloud Platform.
- A strong understanding of data modelling, data structures, databases, and ETL processes.
- Knowledge and experience in delivering CI/CD and DevOps capabilities in a data environment.
- Ability to clearly communicate complex technical ideas.
- Experience in the financial industry is a plus.
NOTE – We ONLY accept online applications. We do not consider direct applications via WhatsApp or email. SALARY DISCLAIMER: The advertised salary range is merely a guideline to attract a range of potentially suitable candidates to the advertised position. This does not automatically mean that a successful candidate can claim an offer for the maximum advertised salary. It is the prerogative of the future employer to offer a candidate a market-related remuneration package in line with the candidate's qualifications, skills, and level of experience.
Desired Skills:
- AWS
- Data Engineer
- Lambda
- Python
- C#