5-YEAR CONTRACT (RENEWABLE)

These skills specifically support achieving scalable data flows within the batch and
real-time pipelines of the SDP.

Essential

  • Experience developing distributed, data-driven processing algorithms.
  • Ability to rapidly learn, and contribute to the development of, new techniques and technologies related to execution engines, e.g. distribution patterns and abstractions.
  • Knowledge of high-level graph-based execution engines such as Dask and Spark, or of streaming execution engines such as Storm or Kafka Streams (see the sketch after this list).
  • Experience with software development in Python.
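
For context, here is a minimal sketch of the graph-based execution style that engines such as Dask offer. The function names and chunking below are illustrative placeholders only, not part of the role description:

    # Illustrative only: a tiny Dask task graph built with the delayed API.
    import dask

    @dask.delayed
    def load(chunk_id):
        # Placeholder for reading one chunk of pipeline data.
        return list(range(chunk_id * 10, (chunk_id + 1) * 10))

    @dask.delayed
    def process(chunk):
        # Placeholder per-chunk reduction.
        return sum(chunk)

    @dask.delayed
    def combine(partials):
        return sum(partials)

    # Graph construction is lazy; nothing runs until compute() is called,
    # at which point the scheduler executes independent tasks in parallel.
    total = combine([process(load(i)) for i in range(4)])
    print(total.compute())  # 780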

Desirable

  • Experience with low-level execution engines such as MPI (see the sketch after this list).
  • Experience with in-memory data exchange within high-performance data analysis pipelines.
  • Experience with high-throughput, data-aware task scheduling technologies.
  • Experience with data flow performance modelling.
  • Knowledge of radio astronomy data processing is an advantage.
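
By contrast, low-level engines such as MPI leave the distribution pattern to the programmer. A minimal sketch using the mpi4py binding (chosen here purely for illustration; the posting does not name a specific binding):

    # Illustrative only: explicit rank-based work splitting with mpi4py.
    # Run with: mpiexec -n 4 python example.py
    from mpi4py import MPI

    comm = MPI.COMM_WORLD
    rank = comm.Get_rank()
    size = comm.Get_size()

    # Each rank sums its own strided slice of the work.
    local = sum(range(rank, 100, size))

    # Combine the partial sums onto rank 0.
    total = comm.reduce(local, op=MPI.SUM, root=0)
    if rank == 0:
        print(total)  # 4950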

Desired Skills:

  • Python
  • Git
  • Agile

Desired Work Experience:

  • 5 to 10 years

Desired Qualification Level:

  • Degree
