This role is for one of Weekday's clients.
Min Experience: 8 years
Location: Bangalore
Job Type: Full-time
We are seeking an experienced and highly skilled Senior Data Engineer to design, build, and optimize scalable data platforms that power analytics, reporting, and data-driven decision-making across the organization. The ideal candidate will have deep expertise in Snowflake, AWS, data lake architectures, Python, PySpark, SQL, and Oracle Database, along with a strong understanding of modern data engineering best practices. This role requires hands-on technical leadership, ownership of complex data pipelines, and close collaboration with analytics, product, and business teams.
Key Responsibilities
- Design, develop, and maintain scalable and high-performance data pipelines using Python, PySpark, and SQL to support batch and near real-time data processing.
- Build and manage enterprise-grade data lake architectures on AWS, ensuring reliability, scalability, and cost efficiency.
- Implement and optimize Snowflake data warehouse solutions, including schema design, data modeling, performance tuning, and query optimization.
- Integrate data from multiple structured and unstructured sources, including Oracle Database, into centralized data platforms.
- Develop robust ETL/ELT frameworks to ensure high data quality, consistency, and availability across systems.
- Collaborate with data analysts, data scientists, and business stakeholders to understand data requirements and deliver optimized data solutions.
- Apply best practices for data security, governance, and compliance, including access controls, encryption, and auditing.
- Monitor, troubleshoot, and optimize data workflows to ensure system reliability and performance at scale.
- Provide technical mentorship and guidance to junior data engineers and contribute to architectural and design decisions.
- Participate in code reviews, documentation, and continuous improvement of data engineering standards and processes.
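The pipeline responsibilities above center on batch extraction, transformation, and data-quality enforcement. A minimal sketch of that pattern in plain Python follows; all names (`extract_orders`, `validate`, the row format) are hypothetical illustrations, not part of this role's actual stack, which would typically use PySpark at scale.

```python
# Minimal batch ETL sketch: extract, then apply a data-quality gate.
# Hypothetical example; a production pipeline would use PySpark.

def extract_orders(rows):
    """Parse raw CSV-like rows into records; skip malformed lines."""
    records = []
    for line in rows:
        parts = line.strip().split(",")
        if len(parts) != 3:
            continue  # drop malformed rows rather than fail the batch
        order_id, customer, amount = parts
        records.append({"order_id": order_id,
                        "customer": customer,
                        "amount": float(amount)})
    return records

def validate(records):
    """Simple quality check: non-negative amounts, non-empty IDs."""
    return [r for r in records if r["amount"] >= 0 and r["order_id"]]

raw = ["1001,acme,250.0", "1002,globex,-5.0", "bad row"]
clean = validate(extract_orders(raw))  # only the first row survives
```

The same structure (extract, validate, load) carries over to distributed frameworks; in PySpark the filter and parse steps would become DataFrame transformations.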
Required Skills & Qualifications
- 8–12 years of experience in data engineering, building large-scale data platforms and pipelines.
- Strong hands-on experience with Snowflake, including data modeling, performance optimization, and cost management.
- Proven expertise in designing and implementing data lakes on AWS (e.g., S3, Glue, Lambda, EMR, Redshift, or related services).
- Advanced programming skills in Python and PySpark for distributed data processing.
- Excellent command of SQL for complex data transformations and analytics use cases.
- Solid experience working with Oracle Database, including data extraction, integration, and performance considerations.
- Strong understanding of data warehousing concepts, dimensional modeling, and big data processing frameworks.
- Experience with CI/CD pipelines, version control, and automated testing for data workflows.
- Strong problem-solving skills and the ability to work independently in a fast-paced environment.
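The SQL and dimensional-modeling skills listed above typically show up as star-schema joins and aggregations. As a rough illustration, the sketch below uses in-memory SQLite purely as a stand-in for Snowflake or Oracle; the `fact_sales` and `dim_customer` tables and their columns are hypothetical.

```python
import sqlite3

# Illustrative only: SQLite stands in for a real warehouse engine.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_customer (customer_id INTEGER PRIMARY KEY, region TEXT);
    CREATE TABLE fact_sales (customer_id INTEGER, amount REAL);
    INSERT INTO dim_customer VALUES (1, 'APAC'), (2, 'EMEA');
    INSERT INTO fact_sales VALUES (1, 100.0), (1, 50.0), (2, 75.0);
""")

# Warehouse-style aggregation: revenue by region via a star join.
rows = conn.execute("""
    SELECT d.region, SUM(f.amount) AS revenue
    FROM fact_sales f
    JOIN dim_customer d ON d.customer_id = f.customer_id
    GROUP BY d.region
    ORDER BY d.region
""").fetchall()
```

In Snowflake the same query shape applies, with performance work shifting to clustering keys, warehouse sizing, and pruning rather than indexes.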
Nice to Have
- Experience with real-time data processing or streaming technologies.
- Exposure to data governance, metadata management, or data catalog tools.
- Prior experience in mentoring or leading small engineering teams.
About Weekday AI
At Weekday (backed by YC; also Product Hunt #1 product of the day), we are building the next frontier in hiring. We have built the largest database of white collar talent in India and have built outreach tools on top of it to generate highest response ...