Position: Data Engineer
Location: Cambridge / Luton, UK (Hybrid, 2-3 days onsite per week)
Duration: Long Term B2B Contract
Job Description:
The ideal candidate has a minimum of 5+ years of experience, with strong hands-on experience in Snowflake, DBT, Python, and AWS, delivering ETL/ELT pipelines across a variety of sources.
• Proficiency in Snowflake data warehouse architecture; design, build, and optimize ETL/ELT pipelines using DBT (Data Build Tool) and Snowflake.
• Experience with DBT for data transformation and modelling; implement data transformation workflows using DBT (Core/Cloud).
• Strong Python programming skills for automation and data processing; leverage Python to create automation scripts and optimize data processing tasks (a minimal loading sketch follows this list).
• Proficiency in SQL performance tuning and query optimization techniques in Snowflake (see the query-plan sketch after this list).
• Troubleshoot and optimize DBT models and Snowflake performance.
• Knowledge of CI/CD and version control (Git) tools; experience with orchestration tools such as Airflow (a minimal DAG sketch follows this list).
• Strong analytical and problem-solving skills, with the ability to work independently in an agile development environment.
• Ensure data quality, reliability, and consistency across different environments.
• Collaborate with other data engineers, data analysts, and business stakeholders to understand data needs and translate them into engineering solutions.
• Certification in AWS, Snowflake, or DBT is a plus.
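As a rough illustration of the Python automation and Snowflake loading work described above, the sketch below stages a local CSV into a Snowflake table with snowflake-connector-python. The account, credentials, database, table, and file path are placeholders for illustration, not details from this posting.

```python
# Minimal sketch: load a CSV into Snowflake with snowflake-connector-python.
# Account, credentials, database, and table names below are placeholders.
import snowflake.connector

def load_csv_to_snowflake(csv_path: str) -> None:
    conn = snowflake.connector.connect(
        account="MY_ACCOUNT",        # hypothetical account identifier
        user="MY_USER",
        password="MY_PASSWORD",
        warehouse="LOAD_WH",
        database="ANALYTICS",
        schema="RAW",
    )
    try:
        cur = conn.cursor()
        # Stage the file to the table's internal stage, then copy it in.
        cur.execute(f"PUT file://{csv_path} @%ORDERS AUTO_COMPRESS=TRUE")
        cur.execute(
            "COPY INTO ORDERS "
            "FILE_FORMAT = (TYPE = CSV FIELD_OPTIONALLY_ENCLOSED_BY = '\"' SKIP_HEADER = 1)"
        )
    finally:
        conn.close()

if __name__ == "__main__":
    load_csv_to_snowflake("/tmp/orders.csv")
```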
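The DBT workflow and Airflow orchestration bullets could look roughly like the DAG sketched below, which shells out to dbt Core with BashOperator on Airflow 2.x. The dbt project path, target, schedule, and DAG id are assumptions made for the example only.

```python
# Minimal sketch: an Airflow DAG that runs and tests dbt Core models daily.
# The dbt project directory, target, and schedule below are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="dbt_daily_transform",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="cd /opt/dbt/analytics && dbt run --target prod",
    )
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="cd /opt/dbt/analytics && dbt test --target prod",
    )

    # Only test the models after they have been built successfully.
    dbt_run >> dbt_test
```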
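For the SQL performance tuning bullet, one common first step is inspecting Snowflake's query plan before rewriting a query or adding a clustering key. The snippet below is a small sketch using the same connector; the table, predicate, and credentials are made-up examples.

```python
# Minimal sketch: print Snowflake's plan for a query to spot full scans or
# exploding joins before tuning. Table and predicate are hypothetical.
import snowflake.connector

QUERY = """
    SELECT customer_id, SUM(amount) AS total
    FROM ANALYTICS.RAW.ORDERS
    WHERE order_date >= DATEADD(day, -30, CURRENT_DATE)
    GROUP BY customer_id
"""

conn = snowflake.connector.connect(
    account="MY_ACCOUNT",  # placeholder credentials
    user="MY_USER",
    password="MY_PASSWORD",
    warehouse="ANALYTICS_WH",
)
try:
    cur = conn.cursor()
    # EXPLAIN returns the query plan without executing the statement.
    cur.execute(f"EXPLAIN USING TEXT {QUERY}")
    for row in cur.fetchall():
        print(row[0])
finally:
    conn.close()
```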
About Axiom Software Solutions Limited
Axiom is a global information technology, consulting, and outsourcing services provider. Our IT solutions empower organizations and individuals throughout the world to maximize value and quality and to succeed in today's business environment.