Weekday AI

Data Engineer

This role is for one of Weekday's clients.

Min Experience: 4 years

Location: India

Job Type: Full-time

We are seeking an experienced Data Engineer with strong expertise in Databricks and modern data engineering practices. The ideal candidate will have 4+ years of hands-on experience in developing scalable data pipelines, managing distributed data systems, and supporting end-to-end CI/CD processes. This role involves architecting and optimizing data workflows that enable seamless data-driven decision-making across the organization.

Responsibilities

  • Design, build, and maintain scalable ETL/ELT pipelines for large-scale datasets using Spark, Hive, or Glue.
  • Develop and optimize data integration workflows using ETL tools such as Informatica, Talend, or SSIS.
  • Write, optimize, and maintain complex SQL queries for data transformation and analytics.
  • Collaborate with cross-functional teams including data scientists, analysts, and product stakeholders to translate requirements into technical solutions.
  • Deploy data workflows using CI/CD pipelines and ensure smooth automated releases.
  • Monitor and optimize data workflows for performance, scalability, and reliability.
  • Ensure data accuracy, governance, security, and compliance across pipelines.
  • Work with cloud-based data platforms such as Azure (ADF, Synapse, Databricks) or AWS (EMR, Glue, S3, Athena).
  • Maintain clear documentation of data systems, architectures, and processes.
  • Provide mentorship and technical guidance to junior team members.
  • Stay current with emerging data engineering tools, technologies, and best practices.

What You’ll Bring

  • Bachelor’s degree in IT, Computer Science, or a related field.
  • 4+ years of experience in data engineering and distributed data processing.
  • Strong hands-on experience with Databricks or equivalent technologies (Spark, EMR, Hadoop).
  • Proficiency in Python or Scala.
  • Experience with modern data warehouses (Snowflake, Redshift, Oracle).
  • Solid understanding of distributed storage systems (HDFS, ADLS, S3) and formats such as Parquet and ORC.
  • Familiarity with orchestration tools such as ADF, Airflow, or Step Functions.
  • Databricks Data Engineering Professional certification preferred.
  • Experience in multi-cloud or migration-based projects is a plus.
