Homebot

83 Total Employees
Year Founded: 2016

Homebot Innovation & Technology Culture

Homebot Employee Perspectives

Tell us about a recent product your team launched. How does it drive Homebot’s mission forward?

Homebot’s mission is to empower people to make informed homeownership decisions together. Buying or selling a home is a major financial milestone for most Americans, and loan officers and real estate agents are key players in this process. Homebot’s core product helps these professionals guide their clients through every stage of homeownership, promoting financial literacy and wealth-building.

Our latest innovation, Partner Intel, supports our mission by strengthening the collaboration between loan officers and real estate agents. This business intelligence product helps real estate professionals build stronger relationships and broader referral networks, ultimately creating a more cohesive team of advisors who give consumers comprehensive guidance throughout their homeownership journey.

 

What obstacles and challenges did your team encounter — and overcome — while launching Partner Intel?

One of the major challenges we faced was handling the complexity and inconsistency of the historical real estate data essential for launching Partner Intel. Data standards are rarely applied consistently across the industry, and there are thousands of source systems that have evolved independently, each with unique characteristics. We had to normalize and account for these variations in both our data pipeline and end-user experience.
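To make the normalization challenge concrete, here is a minimal sketch of the kind of field-level canonicalization such a pipeline needs. Every field name and alias below is hypothetical, for illustration only; it is not Homebot's actual schema or mapping.

```python
from typing import Optional

# Hypothetical aliases: different source systems name the same field
# differently. A real mapping would cover thousands of systems.
FIELD_ALIASES = {
    "sale_price": {"sale_price", "SalePrice", "close_price", "SOLD_PRICE"},
    "close_date": {"close_date", "CloseDate", "sold_on"},
}

def canonical_field(name: str) -> Optional[str]:
    """Map a source-specific field name to its canonical name, if known."""
    for canonical, aliases in FIELD_ALIASES.items():
        if name in aliases:
            return canonical
    return None

def normalize_record(raw: dict) -> dict:
    """Keep only fields we can map, re-keyed under canonical names."""
    out = {}
    for key, value in raw.items():
        canonical = canonical_field(key)
        if canonical is not None and canonical not in out:
            out[canonical] = value
    return out
```

For example, `normalize_record({"SalePrice": 450000, "sold_on": "2024-03-01", "mls_id": "X1"})` yields a record keyed by `sale_price` and `close_date`, dropping the field it cannot map. The real work, of course, is in building and maintaining the mapping itself.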

We typically aim to deliver value to our users through iterative development, but this project demanded a substantial initial investment in our core data infrastructure. We needed to provide our application teams with a stable and adaptable system to ensure long-term success. This required prioritizing quality over speed initially, which is a tough tradeoff to make, especially with the challenging real estate market we’ve seen in the past two years. However, this approach has already paid substantial dividends in recent months. Our development velocity has never been higher, and our data quality is quickly becoming best-in-class.

 

What’s the biggest lesson your team has taken away from this launch, and how has it changed the way your team operates? 

We’ve gained numerous insights on the go-to-market side. Launching a data product from scratch and evolving Homebot from a single-product to a multi-product company involved many moving pieces and was especially challenging given the turbulence in the current real estate market.

One key lesson we’ve learned is the unparalleled value of customer relationships, particularly for a B2B product company. While product development principles and frameworks will always evolve, nothing can replace the insights gained from direct customer interaction. Our deep customer relationships provided a significant edge in spotting market gaps and validating our product hypotheses. Establishing a closed customer beta program is a prime example of how we leveraged these relationships to fast-track Partner Intel’s market readiness.

What practices does your team employ to foster innovation, and how have these practices led to more creative, out-of-the-box thinking?

Our team fosters innovation through a combination of structured and informal practices designed to encourage continuous learning and cross-pollination of ideas. We benefit from dedicated personal learning time, supported by a company-sponsored stipend for courses, books and other materials. Regular brainstorming sessions and design reviews provide space for open, creative problem-solving. These sessions are particularly valuable when we’re making significant architectural decisions.

One recent initiative that’s sparked cross-team curiosity is an engineering book club focused on machine learning systems and AI. What began as a learning opportunity for a few interested engineers has grown into broader discussions about applying ML in new areas of our platform, expanding our toolkit and mindset across engineering. Personally, I find the design reviews especially impactful. My teammates often raise thoughtful questions or suggest alternate approaches I hadn’t considered, which leads to more robust, scalable designs and fewer blind spots. These conversations consistently challenge assumptions, reduce siloing and ultimately lead to more innovative, well-rounded solutions.

 

How has a focus on innovation increased the quality of your team’s work? 

Focusing on innovation, particularly in how we collaborate, has significantly improved the quality and sustainability of our team’s work. Our data platform team is composed entirely of senior engineers, each with deep expertise in different areas. Rather than defaulting to assigning work based on existing expertise, we’ve embraced a more innovative approach — encouraging domain experts to mentor others and guide contributions to avoid further knowledge-siloing.

A recent example is our migration to a new orchestration platform aimed at improving reliability, stability and enabling functionality not available in the previous system. One engineer brought deep experience with the new platform, while the rest of the team was relatively unfamiliar. By intentionally building in time for others to ramp up and experiment, we uncovered new and more efficient patterns for building data pipelines and developed approaches we may have overlooked had we relied solely on prior expertise. This not only improved the quality of our implementation but also broadened the team’s skill set and deepened shared ownership of the system.

 

How has a focus on innovation bolstered your team’s culture? 

A focus on collaborative innovation has deeply enriched our team culture by creating opportunities for shared growth and genuine connection, which is especially important for a hybrid-remote team like ours. Practices such as regular design reviews, impromptu pair-programming sessions and learning-focused initiatives like our engineering book club have become more than just vehicles for technical growth — they’re also ways for the team to stay connected and engaged.

We also make space for regular team bonding outside of work, where the focus is purely on having fun together. Recently, we nearly set a record at a local escape room. After all, what better way is there for engineers to bond than solving a technical problem together? Experiences like that reinforce the camaraderie we build through our day-to-day collaboration. Ultimately, the way we innovate and collaborate has created a team culture that’s both high-performing and human — a place where people feel energized, included and excited to collaborate.

Dave Hogue, Senior Data Engineer

What’s your rule for fast, safe releases — and what KPI proves it works?

Our rule is pretty basic: Did something break immediately after we shipped? Six months ago, we had zero production monitors in Datadog. Today, we have over 70, covering critical endpoints, databases, latency, throughput, background jobs and AWS plus GCP infrastructure. A release is considered “safe” if no monitor fires post-deploy. If it does, we catch it before a customer does. The primary KPI is our Datadog monitoring coverage. 

In other words, are we watching the parts of the system that matter? We validate that with a few outcome signals: Are we detecting issues from dashboards, monitors and notebooks instead of customer reports? How many revert pull requests do we need after deployments? Are error rates trending up? We also have predefined skills, combined with Datadog’s MCP tooling, to analyze issues and surface anomalies faster. Our speed comes from having enough visibility to move confidently.
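The rule itself reduces to a simple predicate over monitor states. The sketch below illustrates it; the monitor dicts mimic the shape of Datadog's monitor API responses (each monitor reports an `overall_state` such as "OK", "Alert" or "No Data"), but the gating logic is an assumed implementation of the rule described here, not a Datadog feature.

```python
# States that mean a watched monitor is firing (or has gone dark).
FAILING_STATES = {"Alert", "No Data"}

def release_is_safe(monitors: list) -> bool:
    """A deploy counts as safe only if no watched monitor is firing."""
    return not any(m.get("overall_state") in FAILING_STATES for m in monitors)

def firing_monitors(monitors: list) -> list:
    """Names of the monitors to investigate (or auto-revert on)."""
    return [m["name"] for m in monitors
            if m.get("overall_state") in FAILING_STATES]
```

In a pipeline, a post-deploy step would fetch the relevant monitors, wait a soak period, and fail the release if `release_is_safe` returns false, surfacing `firing_monitors` in the alert.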

 

Which standard or metric defines “quality” in your stack?

We don’t have a single metric that defines “quality,” and I’d be skeptical of any team that says they do. For us, quality shows up as a combination of signals. First and foremost, did anything break after we shipped? Are error rates trending up? Did a Datadog monitor fire? Are we getting angry emails from customers? On the product side, we also look at Amplitude to see if people are actually engaging with the feature the way we expect. On top of that, we still do manual QA, especially for high-risk or client-facing changes. If those signals are quiet and the feature is being used as intended, we consider that high quality. Taken together, this is our definition of quality.

 

Name one recent AI or automation that shipped and its impact on the team or business.

How can I name just one? We’re building GitHub Actions that, as part of our CI/CD pipeline, automatically create Datadog notebooks and use Datadog’s MCP tooling to monitor deployments in real time, with the goal of opening pull requests to fix bugs as they are detected. Ideally, every feature is observable from the moment it ships, with monitors and runbooks generated automatically alongside the code, watched mostly by bots but with direction from humans. This lets us ship more often and rely more on automated PR reviews, with humans making high-level decisions. It also helps us stay confident we didn’t break anything in this fast-moving, Opus-induced agentic world.

On the review side, we built Rampaging Raccoons, a multi-perspective PR review system for Claude Code. It dispatches multiple agents (security, performance, testing, etc.), merges their feedback and posts a single, human-style review that only flags issues the diff actually introduced. We also trigger targeted “skills” during CI, such as Sidekiq patterns and database migration best practices, so the review is grounded in how we actually build rather than in generic and often incorrect advice.
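The merge step of a system like this can be sketched simply: collect each agent's findings, drop anything outside the lines the diff touched, and dedupe findings that multiple agents raised. This is an illustrative reconstruction of the described behavior, not Rampaging Raccoons' actual code; the finding shape (`file`, `line`, `message`) is assumed.

```python
def merge_reviews(agent_findings: dict, changed_lines: dict) -> list:
    """Merge per-agent findings into one review.

    agent_findings: agent name -> list of {"file", "line", "message"} dicts.
    changed_lines:  file path  -> set of line numbers the diff touched.
    """
    merged, seen = [], set()
    for agent, findings in agent_findings.items():
        for f in findings:
            # Only flag issues the diff actually introduced.
            if f["line"] not in changed_lines.get(f["file"], set()):
                continue
            key = (f["file"], f["line"], f["message"])
            if key in seen:
                continue  # two agents flagged the same thing; keep one
            seen.add(key)
            merged.append({**f, "agent": agent})
    return merged
```

A real system would also need fuzzier dedup (agents rarely phrase the same issue identically) and mapping of diff hunks to line numbers, but the filter-then-dedupe shape is the core of posting one coherent review instead of several overlapping ones.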

Bret Doucette, Senior Software Engineer

Homebot's Tech Stack

LANGUAGES: JavaScript, Python, Ruby, SQL, TypeScript
FRAMEWORKS: CircleCI, Docker, GraphQL, Kubernetes, Next.js, Node.js, React Native, Ruby on Rails, Terraform
LIBRARIES: Pandas, React, Redux
DATABASES: Elasticsearch, Firebase, MongoDB, PostgreSQL, Redis, Snowflake
SERVICES: AWS (Amazon Web Services), GitHub, Google Cloud
ANALYTICS: Amplitude, Sigma, Tableau
DESIGN: Figma, Miro, User Testing
PROJECT MANAGEMENT: Asana, Confluence, Jira
CRM: DocuSign, HubSpot, Intercom, LinkedIn SalesNavigator, Outreach, Salesforce
EMAIL: SendGrid
LEAD GEN: ZoomInfo
COLLABORATION: Slack, Zoom