Sylndr was born to build trust in the Egyptian pre-owned car market by bringing transparency and reliability, and by delivering best-in-class service to all stakeholders. We are on a mission to impact millions of Egyptians by becoming the go-to online platform to sell their cars directly in just a few hours, or to buy high-quality, affordable cars seamlessly.
About the role
We’re looking for a Senior Data Engineer to help shape the foundation of Sylndr’s data ecosystem. You will be responsible for building and maintaining scalable data pipelines, warehouses, and systems that power analytics, reporting, and AI-driven decision-making across the company. You’ll work closely with data scientists, analysts, and product/engineering teams to ensure our data infrastructure is high-performing, reliable, and secure.
What you'll do
- Design, build, and maintain robust ETL/ELT pipelines to collect, process, and transform data from multiple internal and external sources.
- Develop and manage data models, schemas, and warehouse/lakehouse solutions (e.g., Snowflake, BigQuery, Redshift, Databricks).
- Ensure high standards of data quality, governance, and lineage across all systems.
- Implement workflow orchestration and automation using tools such as Airflow, dbt, or Dagster.
- Collaborate with analytics, product, and engineering teams to define and enforce data architecture best practices.
- Enable real-time and batch data processing pipelines using technologies like Kafka, Spark, or Flink.
- Optimize data storage, query performance, and cost efficiency across Sylndr’s data platforms.
- Champion modern data engineering principles — versioning, CI/CD, observability, and infrastructure-as-code.
- Work closely with the AI and Data Science teams to support model development and deployment pipelines.
Who you are
- Bachelor’s or Master’s degree in Computer Science, Data Engineering, or a related field.
- 5+ years of experience in data engineering or a similar role.
- Strong proficiency with SQL and at least one programming language (Python, Scala, or Java).
- Hands-on experience with modern data stacks — Airflow, Snowflake/BigQuery/Redshift.
- Experience working with cloud environments such as AWS or GCP (preferably GCP).
- Solid understanding of data modeling, partitioning, and performance optimization techniques.
- Familiarity with containerization and orchestration (Docker, Kubernetes) and CI/CD for data systems.
- Excellent problem-solving, communication, and collaboration skills.