Are you a senior Kafka/Java engineer who loves designing and building robust streaming systems? Do you enjoy owning application code end-to-end—from design to production—including how it's run and observed?
If so, we have an exciting opportunity for you to help PropertyRadar build our next-generation event-driven platform.
About PropertyRadar
PropertyRadar is the hyperlocal lead generation platform that property-centric small local businesses use to discover, understand, and connect with new opportunities. We turn complex public record data into actionable insights for real estate investors, agents, lenders, home services, and more.
We've been doing it since 2007, and we're now investing in a modern streaming architecture to power the next decade of growth.
About the Role
We are in the early stages of Kafka adoption. You will be the in-house expert for Apache Kafka (including AWS MSK or similar managed offerings) and for Java services built with Spring Boot that consume and produce streaming data.
You will design, build, and run mission-critical Spring Boot-based Kafka services (including stream processors and producers/consumers), and establish the patterns, tooling, and best practices the rest of engineering will follow.
This is a hands-on senior role with significant influence on architecture, design, and team capability.
You'll Excel At
- Java & Spring — Building and maintaining production services with Spring Boot (our primary framework) or similar (Micronaut, Quarkus).
- Building and tuning Kafka applications — Designing topics, partitions, and consumer groups for high-throughput, low-latency workloads; implementing producers, consumers, and stream processors.
- Distributed systems — Reasoning about eventual consistency, backpressure, failure modes, and observability.
- Mentoring — Explaining streaming concepts clearly and leveling up engineers around you.
- Owning outcomes — From design to production: coding, testing, CI/CD, incident response, and continuous improvement.
What You'll Do
- Design and build Kafka-based services — Java Spring Boot stream processors (e.g., Kafka Streams), producers, and consumers with clear contracts and observability for core product workflows.
- Shape Kafka and data architecture — Define topic strategy (naming, partitioning, retention, compaction); inform decisions on hosting (e.g., AWS MSK) and configuration so applications are reliable and observable; implement monitoring for producer/consumer lag, throughput, and error rates.
- Own schema and data contracts — Work with Product and Data Engineering to define and evolve JSON message schemas; ensure compatibility and safe evolution across services. (Experience with Avro, Protobuf, or schema registries is a plus but not required.)
- Establish best practices and tooling — Create libraries, templates, and documentation for Java-based Kafka clients; help define coding standards, testing strategies, and resiliency patterns for streaming workloads.
- Collaborate across teams — Partner with backend engineers, data engineers, DevOps, and product to design end-to-end streaming solutions; participate in architecture reviews, planning, and technical decision-making.
- Support what you build in production — Lead incident response for Kafka and streaming-related issues; perform root cause analysis and drive preventive improvements.
What You'll Need
- 3+ years of hands-on Kafka experience, including:
◦ Implementing and tuning producers, consumers, and stream processors for production workloads.
◦ Familiarity with configuring and operating Kafka clusters, whether self-managed or hosted (e.g., AWS MSK, Confluent Cloud).
- Comfort with cloud environments (AWS preferred) and Linux-based operations.
- Strong understanding of topics, partitions, consumer groups, offsets, and delivery semantics (at-least-once and exactly-once), including error handling, retries, and idempotency.
- Experience with CI/CD and infrastructure-as-code (e.g., Terraform, CloudFormation, Helm).
- 5+ years building backend services, with strong expertise in Java (10+ years Java experience preferred).
- Solid experience with Spring Boot for services and/or stream processing (Kafka Streams with Spring is ideal).
- Comfort using AI-assisted development tools (e.g., Copilot, Cursor) to accelerate coding and testing, and the judgment to review and validate AI-generated output.
- Strong communication and documentation skills; able to drive decisions and align stakeholders.
Bonus Points
- Experience with AWS MSK and related AWS services (RDS, S3, Lambda, ECS/EKS, CloudWatch).
- Experience with event modeling or event-driven architecture.
- Familiarity with CDC tooling (e.g., Debezium) and Kafka Connect.
- Experience with schema registries, Avro, or Protobuf.
- Experience with Flink or other stream processing frameworks.
- Experience with other languages in our stack (e.g., PHP, Python) or willingness to learn.
- Background in B2B SaaS, analytics, or data-intensive products.
Why Join Us
- High impact — You will directly shape our Kafka-based applications and streaming architecture from the ground up.
- Ownership — Be the primary Kafka and Java expert in a product team committed to event-driven architecture.
- Growth — Work on high-scale data problems and help the team adopt modern streaming patterns.
- Culture — Small, collaborative team where your ideas are heard and your work quickly reaches customers.
- Remote — Work from wherever you are most effective (US time zones preferred).
If this sounds like you, we'd love to talk.
PropertyRadar provides equal employment opportunities to all employees and applicants for employment and prohibits discrimination and harassment of any type without regard to race, color, religion, age, sex, national origin, disability status, genetics, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state, or local laws.