Distro

Position: Data Engineer
Location: Argentina
Role Overview:
• Collaborate with Architects to develop high-quality data pipelines for research, operations, and public web applications.
• Support current and future reporting and analytics solutions.
• Apply expertise in data architecture, data engineering, and reporting.
Responsibilities:
• Architect, design, and build scalable, robust, and efficient data pipelines using Azure Data Factory and other cloud-native tools.
• Establish best practices for data ingestion, transformation, storage, and access, optimizing for performance and cost-efficiency.
• Design and implement complex data models, schemas, and partitioning strategies for analytical and operational workloads.
• Champion modern data modeling strategies such as medallion architecture.
• Define and enforce frameworks for data quality, lineage, and observability.
• Enhance data platform infrastructure including CI/CD, monitoring, cost optimization, and failover strategies.
• Optimize performance of SQL Server and PostgreSQL databases and cloud services by identifying and resolving bottlenecks.
• Collaborate with product, analytics, and business teams to translate business goals into data architecture and pipeline solutions.
• Serve as the technical subject matter expert for data platforms and initiatives.
• Identify automation opportunities, streamline operations, and promote data democratization.
• Maintain comprehensive technical documentation, architecture diagrams, and operational playbooks.
• Ensure architectural consistency and enforce data security across all services and platforms.
• Work with Architects to design analytical data models, data warehouses, pipelines, and ETL processes.
Skills & Qualifications:
• Bachelor’s degree in Computer Science, Information Systems, Information Technology, or equivalent experience.
• 8+ years of hands-on experience in data engineering or data platform development with a record of delivering production-grade solutions.
• Expertise in Azure Data Factory, SQL Server, and PostgreSQL for data ingestion, transformation, and storage.
• Strong knowledge of data warehousing principles, dimensional modeling, and scalable ETL/ELT pipeline development.
• Experience with cloud services, preferably Azure, and data security protocols.
• Proficiency in Python and/or Scala focused on building maintainable and testable data solutions.
• Excellent problem-solving skills and attention to detail.
• Proven leadership ability and experience delivering quality data solutions.
• Strong communication and collaboration skills.
• Familiarity with machine learning methodologies and data science models.
• Azure or other cloud services certifications are a plus.
