Qualysoft

Senior Data Engineer (Banking)

About Qualysoft
 
·        25 years of experience in software engineering, established in Vienna, Austria
·        Active in Romania since 2007, with office in central Bucharest (Bd. Iancu de Hunedoara 54B)
·        Delivering end-to-end IT consulting services – from team augmentation and dedicated teams to custom software development
·        We deliver scalable enterprise systems, intelligent automation frameworks, and digital transformation platforms
·        Cross-industry experience supporting global players in BFSI (banking, financial services, and insurance), Telecom, Retail & E-commerce, Energy & Utilities, Automotive, Manufacturing, Logistics, and High Tech
·        Global Presence: Switzerland, Germany, Austria, Sweden, Hungary, Slovakia, Serbia, Romania, and Indonesia
·        International team of 500+ software engineers
·        Strategic partnerships: Microsoft Cloud Certified Partner, Tricentis Solutions Partner in Test Automation and Test Management, Creatio Exclusive Partner, Doxee Implementation Partner
·        Powered by cutting-edge technologies: AI, Data & Analytics, Cloud, DevOps, IoT, and Test Automation.
·        Project beneficiaries ranging from large-scale enterprises to startups
·        Stable year-over-year growth in revenue – a resilient organisation in volatile IT market conditions
·        Quality-first mindset, culture of innovation, and long-term client partnerships
·        Global and local reach – trusted by key industry players in Europe and the US
 

Responsibilities

  • Design and implementation of robust, scalable, and high-performance ETL/ELT data pipelines using PySpark/Scala and Databricks SQL on the Databricks platform;
  • Implementation and optimization of the Medallion Architecture (Bronze, Silver, Gold) using Delta Lake, ensuring data quality, consistency, and historical tracking (see the sketch after this list);
  • Efficient implementation of the Lakehouse architecture on Databricks, combining best practices from traditional Data Warehousing and Data Lake paradigms;
  • Optimization of Databricks clusters, Spark operations, and Delta tables (e.g. Z-Ordering, compaction, query tuning) to reduce latency and compute costs;
  • Design and implementation of real-time and near-real-time data processing solutions using Spark Structured Streaming and Delta Live Tables (DLT);
  • Implementation and administration of Unity Catalog for centralized data governance, fine-grained security (row- and column-level security), and end-to-end data lineage;
  • Definition and implementation of data quality standards and validation rules (e.g. using DLT or Great Expectations) to ensure data integrity and reliability;
  • Development and management of complex workflows using Databricks Workflows (Jobs) or external orchestration tools such as Azure Data Factory or Airflow to automate data pipelines;
  • Integration of Databricks pipelines into CI/CD processes using Git, Databricks Repos, and Databricks Asset Bundles;
  • Close collaboration with Data Scientists, Analysts, and Architects to translate business requirements into optimal technical solutions;
  • Providing technical mentorship to junior engineers and promoting engineering best practices across the team.
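
For illustration, a minimal sketch of the kind of Bronze-to-Silver Medallion step this role involves, assuming a Databricks runtime with Delta Lake available; the table and column names (bronze.transactions, silver.transactions, transaction_id, amount, booked_at) are hypothetical:

    # Minimal sketch of a Bronze -> Silver Medallion step on Databricks.
    # Table and column names are illustrative assumptions, not a real schema.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.getOrCreate()  # provided by the Databricks runtime

    # Read raw ingested records from the Bronze layer.
    bronze = spark.read.table("bronze.transactions")

    # Cleanse and conform: deduplicate, enforce types, drop invalid rows.
    silver = (
        bronze
        .dropDuplicates(["transaction_id"])
        .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
        .withColumn("booked_at", F.to_timestamp("booked_at"))
        .filter(F.col("transaction_id").isNotNull())
    )

    # Write the curated result to the Silver layer as a Delta table.
    silver.write.format("delta").mode("overwrite").saveAsTable("silver.transactions")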

Qualifications

  • Proven, expert-level experience across the full Databricks ecosystem, including Workspace management, cluster configuration, notebooks, and Databricks SQL;
  • In-depth knowledge of Spark architecture (RDDs, DataFrames, Spark SQL) and advanced performance optimization techniques;
  • Strong expertise in implementing and managing Delta Lake features, including ACID transactions, Time Travel, MERGE operations, OPTIMIZE, and VACUUM (illustrated in the sketch after this list);
  • Advanced/expert proficiency in Python (PySpark) and/or Scala (Spark);
  • Expert-level SQL skills and strong experience with data modeling approaches (Dimensional Modeling, 3NF, Data Vault);
  • Solid hands-on experience with a major cloud platform (AWS, Azure, or GCP), with a strong focus on cloud storage services (S3, ADLS Gen2, GCS) and networking fundamentals.
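
As a pointer to the Delta Lake features named above, a minimal sketch of MERGE, OPTIMIZE with Z-Ordering, VACUUM, and Time Travel issued through Databricks SQL; the table names (silver.transactions, updates) and the customer_id column are hypothetical assumptions:

    # Sketch of common Delta Lake maintenance operations on Databricks.
    # Assumes a Delta table silver.transactions and an existing updates table/view.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()

    # Upsert incoming changes into the Delta table (ACID MERGE).
    spark.sql("""
        MERGE INTO silver.transactions AS t
        USING updates AS u
        ON t.transaction_id = u.transaction_id
        WHEN MATCHED THEN UPDATE SET *
        WHEN NOT MATCHED THEN INSERT *
    """)

    # Compact small files and co-locate rows for faster selective queries.
    spark.sql("OPTIMIZE silver.transactions ZORDER BY (customer_id)")

    # Remove data files no longer referenced by the table (default retention applies).
    spark.sql("VACUUM silver.transactions")

    # Time Travel: query an earlier version of the table.
    previous = spark.sql("SELECT * FROM silver.transactions VERSION AS OF 1")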

Nice to have

  • Practical experience implementing and administering Unity Catalog for centralized governance and fine-grained access control;
  • Hands-on experience with Delta Live Tables (DLT) and Databricks Workflows for building and orchestrating data pipelines (see the DLT sketch after this list);
  • Basic understanding of MLOps concepts and hands-on experience with MLflow to support collaboration with Data Science teams;
  • Experience with Terraform or equivalent Infrastructure as Code (IaC) tools;
  • Databricks certifications (e.g. Databricks Certified Data Engineer Professional) are considered a significant advantage;
  • Bachelor’s degree in Computer Science, Engineering, Mathematics, or a related technical field;
  • 5+ years of experience in Data Engineering, including 3+ years working with Databricks and Apache Spark at scale.
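
For the Delta Live Tables item above, a minimal sketch of a DLT table definition with a declarative data-quality expectation; the dataset and column names (bronze_transactions, silver_transactions, transaction_id, amount) are hypothetical, and the code assumes execution inside a DLT pipeline:

    # Illustrative Delta Live Tables (DLT) definition with a quality rule.
    # Dataset and column names are assumptions for the sake of the example.
    import dlt
    from pyspark.sql import functions as F

    @dlt.table(comment="Cleansed transactions for the Silver layer")
    @dlt.expect_or_drop("valid_transaction_id", "transaction_id IS NOT NULL")
    def silver_transactions():
        return (
            dlt.read_stream("bronze_transactions")
            .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
        )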

What We Offer
     
    ·        Premium medical package
    ·        Lunch Tickets & Pluxee Card
    ·        Bookster subscription
    ·        13th salary and yearly bonuses
    ·        Enterprise stability with a startup mentality – a diverse and engaging environment, international exposure, and a flat hierarchy within a secure multinational
    ·        A supportive culture (we value ownership, autonomy, and healthy work-life balance) with great colleagues, team events and activities
    ·        Flexible working program and openness to remote work
    ·        Collaborative mindset – employees shape their own benefits, tools, team events and internal practices
    ·        Diverse opportunities in Software Development with international exposure
    ·        Flexibility to choose projects aligned with your career path and technical goals
    ·        Access to leading learning platforms, courses, and certifications (Pluralsight, Udemy, Microsoft, Google Cloud)
    ·        Career growth & learning – mentorship programs, certifications, professional development opportunities, and above-market salary
     
