About Smart Working
At Smart Working, we believe your job should not only look right on paper but also feel right every day. This isn’t just another remote opportunity - it’s about finding where you truly belong, no matter where you are. From day one, you’re welcomed into a genuine community that values your growth and well-being.
Our mission is simple: to break down geographic barriers and connect skilled professionals with outstanding global teams and products for full-time, long-term roles. We help you discover meaningful work with teams that invest in your success, where you’re empowered to grow personally and professionally.
Join one of the highest-rated workplaces on Glassdoor and experience what it means to thrive in a truly remote-first world.
About the Role
As a Data Engineer, you will be responsible for building and maintaining scalable data pipelines that connect multiple systems to a central data platform. The role focuses on developing reliable data integrations, transforming raw data into structured models, and enabling the business to derive insights from its data ecosystem.
You will work closely with a collaborative team of engineers to support the organisation’s data platform, ensuring systems are effectively integrated with the Delta Lake architecture and that data models are designed to support advanced analytics and reporting. This is a mid-level role, well suited to an engineer who enjoys working with large datasets, building robust data pipelines, and supporting scalable data infrastructure.
Responsibilities
- Design, build, and maintain data pipelines connecting internal systems to the organisation’s Delta Lake environment
- Develop and optimise SQL-based data transformations and relational data models to support analytics and reporting
- Integrate new data sources and systems into the data platform as the organisation expands its technology landscape
- Work with engineering teams to ensure reliable, scalable, and well-structured data flows across systems
- Support the development of data architecture and data models that enable new business metrics and analytical capabilities
- Monitor, troubleshoot, and improve existing pipelines to ensure data accuracy, performance, and reliability
- Contribute to documentation, data standards, and best practices across the engineering team
- Collaborate with stakeholders to ensure data pipelines support evolving business requirements
Requirements
- Strong experience working with SQL for querying, transformation, and data modelling
- Hands-on experience with Databricks for building and managing data pipelines and analytics workflows
- Experience working with large-scale data platforms or data lake architectures
- Familiarity with Apache Spark for distributed data processing
- Experience designing and maintaining data pipelines and ETL/ELT processes
- Ability to work collaboratively within cross-functional engineering teams
- Strong attention to detail when building reliable and maintainable data infrastructure
Nice to Have
- Experience using Python for data processing, scripting, or pipeline development
- Familiarity with API development or integrating external data sources through APIs
- Experience working with Delta Lake or similar modern data lake architectures
- Exposure to building analytics-ready data models for reporting and business intelligence
- Interest in improving data engineering practices within a growing data platform environment
Benefits
- Fixed Shifts: 12:00 PM - 9:30 PM IST (Summer) | 1:00 PM - 10:30 PM IST (Winter)
- No Weekend Work: Real work-life balance, not just words
- Day 1 Benefits: Laptop and full medical insurance provided
- Support That Matters: Mentorship, community, and forums where ideas are shared
- True Belonging: A long-term career where your contributions are valued
At Smart Working, you’ll never be just another remote hire.
Be a Smart Worker - valued, empowered, and part of a culture that celebrates integrity, excellence, and ambition.
If that sounds like your kind of place, we’d love to hear your story.