Are you ready to fuel innovation in the motor industry through cutting-edge data solutions? Our client is seeking a dynamic Data Engineer to accelerate their enterprise collaboration and data quality initiatives.
Responsibilities:

  • Use enterprise collaboration tools such as Confluence and Jira to facilitate communication and collaboration within the team.
  • Develop technical documentation and artifacts to support data engineering processes and initiatives.
  • Work with various data formats, including Parquet, Avro, JSON, XML, and CSV.
  • Utilize Data Quality Tools such as Great Expectations to ensure data accuracy and integrity.
  • Apply knowledge of the Agile Working Model to adapt to changing requirements and priorities.
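Data quality tooling such as Great Expectations works by declaring named expectations about a dataset and validating every record against them. A minimal hand-rolled sketch of that idea in plain Python (the column names and rules are illustrative assumptions, not from this posting):

```python
# Data-quality sketch: each "expectation" is a named predicate applied to
# every record, loosely mirroring the declarative style of tools like
# Great Expectations. Field names ("vin", "mileage") are illustrative.

def validate(records, expectations):
    """Return a list of (row_index, expectation_name) failures."""
    failures = []
    for i, row in enumerate(records):
        for name, check in expectations.items():
            if not check(row):
                failures.append((i, name))
    return failures

expectations = {
    "vin_not_null": lambda r: r.get("vin") is not None,
    "mileage_non_negative": lambda r: r.get("mileage", 0) >= 0,
}

records = [
    {"vin": "WBA123", "mileage": 42000},
    {"vin": None, "mileage": -5},
]

failures = validate(records, expectations)
# failures -> [(1, "vin_not_null"), (1, "mileage_non_negative")]
```

A real Great Expectations suite adds profiling, persistence, and reporting on top of this core validate-against-expectations loop.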

Technical Skills:

  • Proficiency in Terraform for infrastructure as code implementation.
  • Strong programming skills in Python 3.x for data manipulation and automation tasks.
  • Experience with SQL, including Oracle and PostgreSQL, for data querying and manipulation.
  • Familiarity with PySpark for big data processing and analysis.
  • Knowledge of Boto3 for interacting with AWS services programmatically.
  • Experience with ETL (Extract, Transform, Load) processes and tools.
  • Proficiency in Docker for containerization and deployment.
  • Familiarity with Linux/Unix operating systems.
  • Understanding of Big Data technologies and concepts.
  • Proficiency in PowerShell/Bash scripting for automation tasks.
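The ETL experience asked for above can be illustrated with a tiny extract-transform-load round trip across two of the formats mentioned (CSV in, JSON out), using only the standard library; the field names and sample data are assumptions for the sketch:

```python
import csv
import io
import json

# Tiny ETL sketch: extract rows from CSV, transform (normalise types and
# drop malformed rows), load as JSON. Field names are illustrative.

RAW_CSV = """vin,price,currency
WBA123,19999.50,EUR
WDD456,not_a_number,EUR
WVW789,31000,EUR
"""

def extract(text):
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    out = []
    for row in rows:
        try:
            price = float(row["price"])
        except ValueError:
            continue  # drop rows whose price cannot be parsed
        out.append({"vin": row["vin"], "price": price, "currency": row["currency"]})
    return out

def load(rows):
    return json.dumps(rows)

result = json.loads(load(transform(extract(RAW_CSV))))
# result keeps the two valid rows; the malformed one is dropped
```

In a production pipeline the same extract/transform/load shape would typically be expressed with PySpark or AWS Glue jobs rather than in-memory lists.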

Qualifications and Experience:

  • Bachelor’s degree in IT, Business, Engineering, or related field.
  • Relevant certifications preferred, including AWS Certified Cloud Practitioner, AWS Certified SysOps Administrator – Associate, AWS Certified Developer – Associate, AWS Certified Solutions Architect – Associate, AWS Certified Solutions Architect – Professional, and HashiCorp Certified: Terraform Associate.

Advantageous Skills:

  • Expertise in data modeling with Oracle SQL.
  • Exceptional analytical skills for analyzing large and complex datasets.
  • Experience with thorough testing and data validation processes.
  • Strong written and verbal communication skills, with precise documentation abilities.
  • Self-driven team player with the ability to work independently and multitask effectively.
  • Experience building data pipelines using AWS Glue, Data Pipeline, or similar platforms.
  • Familiarity with data stores such as AWS S3, AWS RDS, or DynamoDB.
  • Solid understanding of various software design patterns.
  • Experience preparing specifications for software development projects.
  • Strong organizational skills.
  • Basic experience in networking and troubleshooting network issues.
  • Bonus: Experience developing and working with REST APIs.
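One concrete place the "software design patterns" and "data formats" items above intersect is a registry/factory that selects a parser by format name, keeping callers decoupled from individual format libraries. A sketch covering only JSON and CSV (Parquet and Avro would need third-party libraries), with assumed sample data:

```python
import csv
import io
import json

# Registry/factory pattern sketch: map a format name to a parser function.
PARSERS = {}

def register(fmt):
    """Decorator that registers a parser under a format name."""
    def deco(fn):
        PARSERS[fmt] = fn
        return fn
    return deco

@register("json")
def parse_json(text):
    return json.loads(text)

@register("csv")
def parse_csv(text):
    return list(csv.DictReader(io.StringIO(text)))

def parse(fmt, text):
    parser = PARSERS.get(fmt)
    if parser is None:
        raise ValueError(f"unsupported format: {fmt}")
    return parser(text)

rows = parse("csv", "vin,mileage\nWBA123,42000\n")
# rows -> [{"vin": "WBA123", "mileage": "42000"}]
```

Adding a new format is then a single registered function, with no changes to calling code.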

Desired Skills:

  • Data formats
  • Agile
  • XML
  • Oracle SQL
  • AWS
  • REST APIs
  • Jira
