We are seeking a highly motivated and experienced AWS Data Engineer to join our Global Markets division on a 3-month contract with potential to renew. In this role, you will work closely with our traders, quantitative analysts, and data scientists to design, develop, and maintain data infrastructure and pipelines for our global markets business.
Job Description / Responsibilities:
- Create data models that extract information from various sources and store it in a usable format.
- Lead the design, implementation, and delivery of large-scale, business-critical data solutions involving significant volumes of data.
- Apply expertise in SQL and a strong understanding of ETL and data modelling.
- Ingest data into AWS S3 and perform ETL into RDS or Redshift.
- Use AWS Lambda (C# or Python) for event-driven data transformations.
- Design and implement security measures to protect data from unauthorized access or misuse.
- Maintain data integrity by designing backup and recovery procedures.
- Automate the migration process in AWS from development to production.
- Deliver timely, accessible data content to support and drive business decisions.
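To illustrate the event-driven transformation work described above, here is a minimal sketch of an S3-triggered Lambda handler in Python. The column names (`symbol`, `quantity`, `price`) are hypothetical, and the boto3 object fetch and the RDS/Redshift write that a real pipeline would perform are deliberately omitted:

```python
import csv
import io
import json


def transform_csv(body: str) -> list[dict]:
    """Parse a raw CSV payload (e.g. a trade file landed in S3) into
    records ready to load into RDS or Redshift. Column names here are
    illustrative placeholders, not a real schema."""
    reader = csv.DictReader(io.StringIO(body))
    records = []
    for row in reader:
        records.append({
            "symbol": row["symbol"].upper(),
            "quantity": int(row["quantity"]),
            "price": float(row["price"]),
        })
    return records


def lambda_handler(event, context):
    """Entry point for an S3-triggered Lambda. In a real deployment the
    object body would be fetched with boto3, passed through
    transform_csv, and written to the target database; those steps are
    omitted in this sketch."""
    keys = [r["s3"]["object"]["key"] for r in event.get("Records", [])]
    return {"statusCode": 200, "body": json.dumps({"objects": keys})}
```

The separation of pure transformation logic from the handler keeps the ETL step unit-testable without AWS access.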
Background and experience required (Years of Experience/Industry Experience):
- 5+ years of Python/C# development experience
- 3+ years of AWS data engineering experience
Global Markets experience is preferred, along with exposure to the following AWS services:
- Amazon CloudFront
- Route 53 Public DNS
- AWS WAF Classic for CloudFront
- AWS WAF for CloudFront
- AWS Certificate Manager (ACM) for CloudFront
- AWS Global Accelerator (AGA)
- AWS Shield Advanced
- Creation of data products
- 5+ years of data analysis experience
Must-have Skills:
- AWS Glue Jobs using PySpark or AWS Glue Spark
- Strong SQL and C# or Python programming knowledge
- Expert knowledge of data engineering packages, libraries, and related functions in C# or Python
- AWS technical certifications
- Experience with development and delivery of microservices using serverless AWS services (S3, RDS, Aurora, DynamoDB, Lambda, SNS, SQS, Kinesis, IAM).
- SQL Server and MySQL
- Scripting and automation using tools such as PowerShell, Python, Bash, Ruby, or Perl
- Effectively and efficiently troubleshoot data issues and errors.
- Strong experience in SDLC delivery, including waterfall, hybrid, and Agile methodologies.
- Delivering data solutions and pipelines on AWS Cloud Platform.
- Understanding of data modelling, data structures, databases, and ETL processes
- Delivering CI/CD and DevOps capabilities in a data environment
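As a flavour of the combined SQL-and-Python skill set listed above, here is a minimal sketch of an idempotent load step. The `prices` table and its columns are hypothetical, and `sqlite3` stands in for the SQL Server/MySQL/Redshift targets named in the posting:

```python
import sqlite3


def load_prices(conn: sqlite3.Connection, rows: list[tuple]) -> int:
    """Idempotently load (symbol, price) rows into a hypothetical
    prices table, returning the resulting row count."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS prices ("
        " symbol TEXT PRIMARY KEY, price REAL)"
    )
    # Upsert so that reruns of the pipeline update rather than
    # duplicate data -- a common requirement in ETL delivery.
    conn.executemany(
        "INSERT INTO prices (symbol, price) VALUES (?, ?) "
        "ON CONFLICT(symbol) DO UPDATE SET price = excluded.price",
        rows,
    )
    conn.commit()
    return conn.execute("SELECT COUNT(*) FROM prices").fetchone()[0]
```

Making load steps safe to re-run is what lets a pipeline recover cleanly after a failure, which is the troubleshooting discipline the role calls for.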
Desired Skills:
- AWS
- Data Engineering
- C#
- Python