ENVIRONMENT:

A fast-growing provider of cutting-edge Hardware Solutions seeks a highly skilled Data Engineer to help develop and refine its existing IoT platform, policy management system, and agent app. To support these efforts, the company needs to migrate and improve its BI infrastructure. You will take ownership of this migration, focusing on designing and building scalable data pipelines that support its BI and analytics needs. Applicants need 4+ years’ work experience in a similar role, along with experience in Data Warehousing solutions, Python and its libraries, MySQL, PostgreSQL, SQL, and AWS (S3, EC2, serverless, AWS Glue).

DUTIES:

  • Evaluate architectures and technologies, build Proofs of Concept (POCs), and drive the development of ETL processes that enhance the data infrastructure.
  • Work closely with other team members to ensure the systems you develop are cohesive, maintainable, and scalable, aligning with the broader goals of the organisation.
  • Design, deploy, and maintain features within the data infrastructure that support critical functions like payment processing, BI reporting, and agent management.
  • Build ETL pipelines, data models, and storage solutions that enable the business to make sense of the data generated by the platform.

REQUIREMENTS:

  • 4+ years of relevant work experience in Data Engineering or a similar role.
  • Experience with Data Warehousing solutions.
  • Strong Data Modelling skills, ensuring data structures are efficient and well-optimised.
  • Experience designing and implementing ETL processes and data pipelines.
  • Experience with Python and its libraries commonly used in Data Engineering.
  • A strong understanding of relational databases (e.g. MySQL, PostgreSQL) and SQL.
  • Experience with AWS (S3, EC2, serverless, AWS Glue) or similar cloud platforms.
  • A drive to create clean, maintainable code, with an emphasis on scalability and performance.
  • An emphasis on testability, including proper unit and integration tests in data pipelines.
  • Strong research skills to stay up to date with new technologies.
  • The ability to demonstrate high-quality code you are proud of.

Advantageous:

  • Understanding of DataOps methodologies to manage the entire data lifecycle, from collection to delivery.
  • Familiarity with data governance practices, including data quality, metadata management, and data security.
  • Familiarity with data visualisation tools such as Tableau, Power BI, or QuickSight, to work closely with BI and Analytics teams.
  • Experience with DevOps practices, including CI/CD pipelines and version control (e.g., Jenkins, Git).

Technologies they use: Amazon EC2, Amazon S3, Quasar, [URL Removed], Django, JavaScript, MySQL, Python, nginx, Celery, Android Studio, Bitbucket, Docker, Git, New Relic, Postman

ATTRIBUTES:

  • Diligent, with strong attention to detail in areas such as naming conventions, data accuracy and performance optimisation.
  • Willing to work closely in a team and contribute to the overall Development practice.
  • Excited about sharing knowledge with the team and learning from others.
  • Self-motivated, with the ability to work autonomously and deliver quality work under minimal supervision.

While we would like to respond to every application, should you not be contacted for this position within 10 working days, please consider your application unsuccessful.

COMMENTS:

When applying for jobs, ensure that you meet the minimum job requirements. Only SA Citizens will be considered for this role. If you are not in the location mentioned for a job, please note your relocation plans in all applications and correspondence. Apply here [URL Removed] or e-mail a Word copy of your CV to [Email Address Removed], mentioning the reference number of the job.

Desired Skills:

  • Data Engineer
  • CPT (Cape Town)
