Flow

Senior Data Engineer

Miami, FL
About the Company

At Flow, we’re reimagining what it means to live, work, and connect. More than just a real estate company, Flow is a brand, a technology platform, and an operations ecosystem spanning condominiums, hotels, multifamily residences, and office spaces. We’re building a new kind of living experience: one that’s flexible, connected, and designed to create genuine community and real value for the people who call Flow home.

Our mission is oneness: prioritizing our residents and their experiences, and fostering connection with themselves, their neighbors, and the natural world. By putting people at the center of everything we do, we’re creating vibrant, human-centered communities where life, work, creativity, and play all come together in one place.

About the Role

As a member of Flow's data team, you will shape the data architecture and strategy that powers decision-making across our multifamily real estate portfolio. We're seeking a Senior Data Engineer who excels at designing robust data solutions while working directly with the people who use them daily—property managers, leasing directors, and marketing teams.

This isn't a role where you build pipelines in isolation and hand them off. You'll be embedded with operations teams, translating their real-world problems into elegant technical solutions. When a property manager says "tours are down this week," you'll investigate data quality, understand operational context, and deliver actionable insights—not just a dashboard. Your work will directly impact leasing decisions, marketing budget allocation, and portfolio strategy.

At Flow, we're building sophisticated data infrastructure for the modern real estate industry. We leverage cloud-native technologies and AI-assisted development to move fast while maintaining production-grade quality. You'll use tools like Claude Code and GitHub Copilot as force multipliers, which means strong written communication isn't optional—it's how you work effectively. If you can write clear prompts, documentation, and stakeholder updates, AI amplifies your impact. If you can't, it becomes a liability.

We value engineers who think critically, work independently, and own problems end-to-end. You'll have significant autonomy to scope projects, make architectural decisions, and challenge assumptions when data doesn't support stakeholder hypotheses.

Finally, we believe employees are better together. Every position at Flow has an onsite or in-office requirement.

Responsibilities:

  • Design, build, and maintain data pipelines and architecture using Snowflake, dbt, and AWS
  • Implement scalable ETL processes and data models that power analytics across the organization  
  • Build data solutions for complex challenges like entity resolution, multi-system attribution, and household matching
  • Work directly with operations teams (leasing, marketing, property management) to translate daily workflow challenges into data solutions
  • Own problems end-to-end: from stakeholder conversations through implementation to dashboard deployment and adoption
  • Document architectural decisions and explain technical tradeoffs to non-technical stakeholders in clear, jargon-free language
  • Use AI coding assistants (Claude Code, GitHub Copilot) to accelerate development while maintaining production-quality code
  • Write effective prompts for AI tools and review generated solutions critically—knowing when to trust AI and when to take over

Ideal Background:

  • A minimum of 7 years of data engineering experience building scalable data systems in operations-heavy, data-forward organizations (not big tech)
  • Experience working in product & operations-heavy environments where data directly impacts daily business decisions
  • Deep expertise in data modeling with dbt—you understand how to model around the deficiencies and quirks of upstream operational systems
  • Strong written and verbal communication skills—comfortable writing clear documentation, effective AI prompts, and explaining technical concepts to non-technical stakeholders
  • High proficiency in SQL, Python, and modern data stack technologies (Snowflake, Fivetran, Dagster/Airflow, dbt)
  • Deep understanding of the principles and challenges of ensuring high availability, fault tolerance, and efficiency in distributed data systems
  • A keen ability to strike a balance between elegant design and pragmatic tradeoffs, all while prioritizing continuous delivery of value to the business
  • Experience building data solutions in fast-moving startup environments
  • Strong analytical skills with an inherent curiosity to understand and solve complex business problems through data

Benefits:

  • Comprehensive Benefits Package (Medical / Dental / Vision / Disability / Life)
  • Paid time off and 13 paid holidays
  • 401(k) retirement plan
  • Healthcare and Dependent Care Flexible Spending Accounts (FSAs)
  • Access to HSA-compatible plans
  • Pre-tax commuter benefits
  • Employee Assistance Program (EAP), free therapy through SpringHealth, acupuncture, and other wellness offerings

Flow is proud to be an equal opportunity workplace and hires regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender identity and/or expression, pregnancy, Veteran status, or any other characteristic protected by federal, state, or local law. In addition, we provide reasonable accommodation for qualified individuals with disabilities.
