Senior Data Engineer

  • Location: Rotterdam, Zuid-Holland, Netherlands
  • Remote: Yes
  • Type: Contract
  • Job #26057

Our client is a fast-growing B2B data services start-up developing a next-generation lead generation platform. With a strong emphasis on innovation, scalability, and quality, they are building a powerful data collection pipeline deployed on AWS and developed using modern technologies. This is a unique opportunity to join a pioneering team and contribute to a product that is set to reshape the data services landscape.

The ideal candidate will be a seasoned data engineer with deep expertise in Python, AWS, and large-scale data processing. You’ll be working on building and optimizing data pipelines, transforming raw data into structured formats, and collaborating across teams to deliver clean, usable datasets for development and analysis.

NOTE – While the role is remote, the candidate must be based in a time zone between UTC+0 and UTC+4.

Role Description

This role involves designing and maintaining scalable data pipelines, automating workflows, and integrating data solutions into a serverless architecture. You’ll be solving complex business problems, working with structured and unstructured data, and contributing to the development of a data lakehouse infrastructure.

Technical Must-Have Skills:

• Python Programming – Minimum 5 years of experience with strong understanding of data structures and algorithms.
• SQL Expertise – Advanced SQL skills including analytical functions and performance tuning.
• Data Engineering Experience – At least 3 years of hands-on experience building and deploying data pipelines.
• Apache Iceberg – Minimum 1 year of experience working with this data lakehouse table format.
• AWS Services – Proven experience in automating workflows and deploying pipelines using AWS tools.

Technical Nice-to-Have Skills:

• Cloud Ecosystems – Familiarity with AWS (S3, Glue, EMR, Redshift, Athena) or other cloud platforms like Azure or GCP.
• Orchestration Tools – Experience with Airflow or Kubernetes for managing workflows.
• Machine Learning Integration – Exposure to integrating ML models into scalable data pipelines.
• GraphQL and TypeScript – Understanding of frontend technologies used in customer-facing applications.
• Serverless Architecture – Experience working with AWS Lambda and serverless deployment models.

Additional Functional Requirements:

• Problem Solving – Ability to tackle complex data challenges and create order from chaos.
• Collaboration – Comfortable working with cross-functional teams and business stakeholders.
• Proactive Mindset – Hands-on approach with a desire to contribute across different product features.
• Adaptability – Ability to quickly understand new products and generate innovative ideas.
• Communication – Business-level proficiency in English is essential.

Educational and Certification Requirements:

• Bachelor’s or Master’s Degree – In Computer Science or a related field.

Job Ref: BBBH26057

Next Steps

If you’re excited by the opportunity to work in a startup environment that values quality, innovation, and collaboration, we’d love to hear from you. This is your chance to help shape a product from the ground up and celebrate its success with a passionate team.

Interested in this Role? Submit your CV today!

Please don’t hesitate to contact our team with any questions via email: [email protected]