Weekday AI

Data Integration Engineer

Bengaluru, Karnataka, India

This role is for one of Weekday's clients.

Minimum Experience: 4 years

Location: Bengaluru

Job Type: Full-time

As a Data Integration Engineer, you will play a key role in developing and managing the data integration frameworks and processes that connect internal and external data systems. You will collaborate closely with data engineers, analysts, product managers, and business stakeholders to design integration solutions that meet organizational data requirements. Your work will ensure seamless data movement, high data quality, and efficient data delivery across platforms.

Key Responsibilities

Data Integration & Pipeline Development

  • Design, develop, and maintain scalable ETL/ELT pipelines to support analytics, reporting, and operational workflows.
  • Integrate data from multiple sources, including databases, APIs, third-party tools, streaming platforms, and cloud services.
  • Implement automation to streamline data ingestion, transformation, and delivery processes.

Systems & Architecture

  • Build and optimize data integration architectures aligned with business needs and long-term scalability.
  • Collaborate with Data Engineering and Platform teams to ensure integrations align with enterprise data models and governance standards.
  • Work with cloud platforms (AWS/GCP/Azure) to implement secure and reliable data transfer mechanisms.

Data Quality & Governance

  • Develop validation mechanisms to ensure accuracy, completeness, and reliability of data.
  • Troubleshoot data discrepancies, resolve pipeline failures, and maintain data integrity across systems.
  • Implement best practices for data governance, including metadata management, logging, and documentation.

Cross-Functional Collaboration

  • Work alongside engineering, BI, analytics, and product teams to understand data needs and translate them into integration solutions.
  • Participate in requirement gathering, technical design discussions, and code reviews.
  • Provide support to teams by resolving data flow issues and implementing enhancements.

Required Skills & Experience

  • 4–10 years of experience in data integration, ETL development, or data engineering.
  • Strong expertise with ETL/ELT tools such as Informatica, Talend, SSIS, Pentaho, Matillion, or similar.
  • Proficiency in SQL, stored procedures, and performance optimization for large datasets.
  • Hands-on experience with cloud platforms (AWS/GCP/Azure) and cloud-native data integration tools.
  • Solid understanding of data warehousing concepts, data modeling, and relational databases (e.g., Snowflake, Redshift, BigQuery, PostgreSQL).
  • Experience integrating with REST/SOAP APIs, message queues (Kafka, RabbitMQ), or streaming data platforms.
  • Strong scripting skills in Python, Bash, or PowerShell for automation.
  • Familiarity with version control (Git), CI/CD, and modern DevOps practices.
  • Strong analytical skills with the ability to troubleshoot complex data issues.

Preferred Qualifications

  • Experience with workflow orchestration tools such as Airflow, Prefect, or Dagster.
  • Background in building real-time or near real-time integrations.
  • Understanding of security standards, data privacy, and compliance requirements.
