GT was founded in 2019 by a former Apple, Nest, and Google executive. Its mission is to connect the world's best talent with product careers at high-growth companies in the UK, USA, Canada, Germany, and the Netherlands.
On behalf of Metris Energy, GT is looking for a Data Engineer (Contract) who would enjoy a hands-on, fast-moving project focused on rebuilding and modernizing the data platform powering one of the most innovative solar energy analytics products on the market.
About the Client
Metris Energy is the UK’s leading solar-asset-management platform, helping solar funds and O&M installers maximise return on investment by turning real-time data into action. Their platform enables clients to monitor every watt, automate operations and maintenance, run instant PPA billing, and generate clear insights, all in one place.
Metris believes the future of energy is distributed — and they’re building the tools so any company can become a power provider.
About the Role
Metris is transitioning from a functional MVP to a scalable, structured data platform, and is looking for a Data Engineer to support a 2–3 month migration effort.
Your focus will be to move existing Python/SQL ETL jobs into a new framework and rebuild core pipelines—particularly those handling inverter and performance data.
You’ll work closely with the Senior Data Engineer, who provides architectural direction and reviews, while you take ownership of day-to-day execution.
The environment is fast-moving, high-trust, and pragmatic, with an emphasis on clean implementation, speed, and autonomy. The tech stack includes Python, SQL, cloud-native tools, and an internal Airflow-like orchestrator (Airflow experience is helpful but not required).
Responsibilities
Migrate existing Python/SQL ETL jobs into the new data platform.
Rebuild and refactor pipeline logic following established architectural patterns.
Work with inverter, sensor, and performance data pipelines to ensure accurate ingestion.
Deliver high-quality implementations across multiple independent, parallelizable tasks.
Produce clean, consistent, production-ready code.
Collaborate with the Senior Data Engineer for reviews and technical alignment.
Troubleshoot issues and validate data correctness after migration.
Document changes clearly and maintain logical clarity across pipelines.
Communicate progress proactively and manage your work independently.
Essential Knowledge, Skills & Experience
Strong proficiency in Python for data engineering tasks.
Hands-on experience with ETL pipelines, data ingestion, or similar workflow-driven systems.
Ability to refactor, migrate, and clean up existing codebases.
Solid knowledge of SQL for data querying and transformations.
Experience delivering work in fast-paced, execution-focused environments.
Comfortable working with minimal supervision and owning tasks end-to-end.
Ability to ensure accuracy, reliability, and consistency across data pipelines.
Nice-to-have
Experience with Airflow or similar orchestration systems.
Exposure to cloud-native data environments or data platform migrations.
Familiarity with IoT, sensor data, or time-series data processing.
Experience working in early-stage or high-growth tech environments.
Interview Steps
GT interview with Recruiter
Technical interview
Offer