Key accountabilities and decision ownership:

  • Data Pipeline Development: Developing scalable, efficient data pipelines to meet business requirements.
  • Collaboration & Stakeholder Engagement: Collaborating with functional data owners to resolve data-related queries and providing support for data-driven initiatives.
  • Infrastructure & Governance: Managing data integrity, security, quality, and accessibility.
  • Reporting Solutions: Assisting with the development of reporting solutions.

Core competencies, knowledge and experience:

  • Experience analysing data.
  • In-depth knowledge and practical experience of data lifecycle management.
  • Experience working with cross-functional teams and engaging with business stakeholders.
  • Ability to manage multiple priorities independently.
  • Strong communication and presentation skills.

Must have technical / professional qualifications:

  • 3-5 years of experience in data engineering or a similar role.
  • Proficiency in SQL and Python.
  • Experience with data pipeline tools (e.g., Apache Airflow, dbt).
  • Hands-on experience with cloud platforms (AWS, Azure).
  • Strong knowledge of data warehousing solutions (e.g., Snowflake, Redshift, BigQuery).
  • Familiarity with containerization tools (Docker, Kubernetes) is a plus.
  • Experience developing reporting solutions using visualization tools (e.g., Power BI).
  • Desirable experience with real-time data processing frameworks (e.g., Kafka, Spark Streaming).
  • Experience with Smartsheet is a plus.

Desired Skills:

  • Data engineering
  • SQL
  • Python
  • AWS
  • Azure
  • Oracle Linux
  • Power BI

Desired Work Experience:

  • 2 to 5 years

Desired Qualification Level:

  • Degree
