A fast-growing SaaS company in the mobility and transportation space is looking for a Senior Data Engineer to join our team and take ownership of our core data infrastructure.
This is a high-impact role for someone who’s excited by large-scale data challenges, cares deeply about performance and reliability, and wants to shape the foundation of a data-driven product.
Responsibilities
- Design, build, and maintain robust data pipelines for batch and real-time processing using Spark and modern data tooling
- Own end-to-end backend data infrastructure: ingestion, transformation, validation, and orchestration
- Architect scalable, secure, and cost-effective solutions using Google Cloud Platform (GCP)
- Optimize ETL/ELT workflows for analytics and machine learning use cases
- Build high-quality data models and data marts with strong performance and maintainability
- Collaborate with developers, product teams, and stakeholders to translate data needs into scalable systems
- Drive architectural decisions on distributed processing, reliability, and pipeline scalability
Requirements
- 4+ years of experience in data engineering or backend infrastructure roles
- Strong programming skills in Python and solid command of SQL
- Proven track record designing production-grade, low-latency pipelines (batch & streaming)
- Experience building modern data platforms, including data lakes, pipelines, and internal tooling
- Familiarity with orchestration tools (e.g., Airflow) and modern CI/CD workflows
- Comfortable working in cloud-native environments (GCP preferred, AWS also relevant)
- Hands-on experience with containerization and container orchestration tools such as Docker and Kubernetes