Sutherland

DES2 - Snowflake Developer

Sutherland Hyderabad
Responsibilities
  • Develop and maintain data pipelines using Python, Airflow (DAGs), and AWS/Snowflake components.
  • Build and automate data ingestion, transformation, and scheduling workflows.
  • Develop Airflow DAGs with custom operators, sensors, and hooks, and manage pipeline monitoring (see the sketch after this list).
  • Work on Snowflake-based ELT solutions including data loads, stored procedures, and queries.
  • Write efficient SQL queries and optimize performance for data transformations.
  • Collaborate with cross-functional teams to understand requirements and deliver scalable data solutions.
  • Troubleshoot pipeline failures and ensure high availability of production workflows.
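To illustrate the day-to-day work, below is a minimal sketch of the kind of pipeline this role covers: a daily Airflow DAG that copies staged S3 data into Snowflake and runs a simple validation step. It assumes Airflow 2.x with the Snowflake provider installed; the DAG id, connection id ("snowflake_default"), stage, and table names are illustrative assumptions, not part of the posting.

    # Minimal sketch, assuming Airflow 2.x and the apache-airflow-providers-snowflake package.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator
    from airflow.providers.snowflake.operators.snowflake import SnowflakeOperator


    def validate_load(**context):
        # Placeholder data-quality check; a real check would query row counts, nulls, etc.
        print("Validating loaded data for run date:", context["ds"])


    with DAG(
        dag_id="s3_to_snowflake_daily",        # hypothetical pipeline name
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        # COPY INTO pulls staged S3 files into a Snowflake table (stage and table names are illustrative).
        load_raw = SnowflakeOperator(
            task_id="load_raw_events",
            snowflake_conn_id="snowflake_default",
            sql="COPY INTO raw.events FROM @raw_stage/events/ FILE_FORMAT = (TYPE = 'JSON');",
        )

        validate = PythonOperator(
            task_id="validate_load",
            python_callable=validate_load,
        )

        # Load first, then validate.
        load_raw >> validate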

Qualifications

  • 5–8 years of experience in Python development (advanced scripting and automation).
  • 3+ years of experience with Apache Airflow (DAG design, orchestration, scheduling).
  • Experience with Snowflake or any cloud data warehouse (Redshift / BigQuery / Databricks).
  • Experience with AWS services (S3, Glue, Lambda, Athena) or equivalent cloud technologies.
  • Strong hands-on experience with SQL (advanced querying, optimization).
  • Experience with ETL/ELT data workflows, data validation, data quality checks.
  • Familiarity with Git / CI-CD, JIRA, or similar tools.
  • Good communication skills and ability to work independently.
  • Bachelor’s degree in Computer Science, Engineering, or a related field (or equivalent experience).

Additional Information

All your information will be kept confidential according to EEO guidelines.

About the Role

Sutherland is looking for a skilled Python Data Engineer with strong experience in Apache Airflow, data pipeline development, and cloud data platforms (Snowflake / AWS). The role involves building and orchestrating scalable ETL/ELT workflows and automating data processes across multiple systems.
