Globaldev Group

Data Engineer

Armenia

We’re hiring a Data Engineer to design, build, and operate reliable end-to-end ETL/ELT pipelines and data models on Snowflake + Airflow. You’ll own how data is structured, stored, and transformed, consolidating data from databases, REST APIs, and files into trusted, well-modeled datasets with clear SLAs and ownership.

Responsibilities:

  • Design scalable data architectures and data models in Snowflake (staging, integration, marts).
  • Build robust ETL/ELT processes and pipelines: ingest from RDBMS/APIs/files (batch/incremental; CDC when applicable).
  • Develop modular SQL and Python transformations; handle semi-structured JSON; create consumer-ready tables/views.
  • Orchestrate pipelines with Airflow: DAG dependencies, retries, backfills, SLAs, monitoring, and alerting (see the sketch after this list).
  • Ensure idempotent re-runs/backfills; maintain runbooks and perform RCA for incidents.
  • Optimize performance and cost in Snowflake (warehouse sizing, pruning; clustering when needed).
  • Collaborate with BI/Analytics to refine data definitions, contracts, and SLAs.
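
To give a concrete flavor of the orchestration work, here is a minimal sketch of an Airflow DAG with retries, an SLA, and backfill-friendly (idempotent) runs. It assumes Airflow 2.x; the DAG, task, and table names are hypothetical illustrations, not part of this project's actual codebase.

    from datetime import datetime, timedelta

    from airflow import DAG
    from airflow.operators.python import PythonOperator


    def load_orders(ds: str, **_):
        # Idempotent by design: each run (re)loads exactly one logical date,
        # so retries and backfills overwrite rather than duplicate data.
        print(f"Loading orders partition for {ds}")


    with DAG(
        dag_id="orders_daily",          # hypothetical pipeline name
        start_date=datetime(2024, 1, 1),
        schedule="@daily",              # Airflow >= 2.4; older versions use schedule_interval
        catchup=True,                   # enables historical backfills
        default_args={
            "retries": 3,
            "retry_delay": timedelta(minutes=5),
            "sla": timedelta(hours=2),  # flag runs that miss the 2-hour SLA
        },
    ):
        PythonOperator(task_id="load_orders", python_callable=load_orders)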

Requirements:

  • Strong SQL as the core skill: designing and implementing production ETL/ELT processes and data models.
  • Python proficiency (must-have) for building data tooling, transformations, and integrations.
  • Experience as an ETL developer and data modeler: dimensional modeling, schema evolution, best practices for data storage.
  • Hands-on Snowflake experience (must-have; deep expertise preferred): Streams/Tasks/Time Travel, performance tuning, JSON handling.
  • API integrations (auth, pagination, rate limits, idempotency); see the sketch after this list.
  • Advanced level of English.
  • Git-based CI/CD; privacy/GDPR basics.
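
For the API-integration requirement, here is a minimal Python sketch of a paginated REST extraction with bearer auth, rate-limit backoff, and a stable key for idempotent downstream loads. The endpoint, field names, and pagination scheme are hypothetical assumptions, not a real partner API.

    import time

    import requests


    def fetch_events(token: str, since: str):
        """Yield (id, event) pairs from a cursor-paginated endpoint."""
        session = requests.Session()
        session.headers["Authorization"] = f"Bearer {token}"   # bearer auth
        params = {"since": since, "limit": 100}
        while True:
            resp = session.get("https://api.example.com/v1/events",
                               params=params, timeout=30)
            if resp.status_code == 429:                        # rate limited
                time.sleep(int(resp.headers.get("Retry-After", "5")))
                continue
            resp.raise_for_status()
            payload = resp.json()
            for event in payload["data"]:
                # A stable natural key keeps downstream MERGEs idempotent
                # across retries and re-runs.
                yield event["id"], event
            cursor = payload.get("next_cursor")
            if cursor is None:                                 # last page
                break
            params["cursor"] = cursor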

Would be a plus:

  • iGaming familiarity: stakes, wins, GGR/NGR, RTP, retention/ARPDAU, funnels; responsible gaming/regulatory awareness.
  • Interest or experience in AI/automation: Snowflake Cortex for auto-documentation, semantic search over logs/runbooks, parsing partner PDFs (with guardrails).
  • Exposure to cloud storage (GCS/S3/ADLS), Terraform/Docker, and BI consumption patterns (Tableau/Looker/Power BI).
  • Airflow proficiency: reliable DAGs, retries/backfills, monitoring, alert routing.

What we offer:

  • Direct cooperation on an already successful, long-term, and growing project.
  • Flexible work arrangements.
  • 20 days of vacation.
  • Truly competitive salary.
  • Help and support from our caring HR team.
