Elementary is the “dbt-native” data observability solution. There are two common questions I get on this from data leaders:
- What does it mean to be “dbt-native”?
- What about datasets not in dbt (legacy systems, sources, etc.)?
The first question has a short answer - our technology, architecture, and product decisions are designed to be the best observability solution for dbt pipelines.
This post will focus on the second question.
If you are considering Elementary as a solution, you probably use dbt to some extent. Your goal in adopting dbt was likely to scale data transformations, make collaboration easy, deliver data products faster, and lower maintenance complexity.
If you migrated to dbt you’ve likely focused your migration on the most critical data first. These are the assets that drive decisions and power your core workflows.
However, migrations take time. For some, it’s months. For others, years. In this time period, your dbt assets are the center of your data operation, while legacy systems often still serve older or less critical workflows that haven’t been migrated yet.
So, where should you focus observability efforts?
Just like migration, adopting a data observability solution is a process. Your goals are to enable your team to provide reliable data to consumers and to increase data usage across the organization. If trust in the data is ultimately low, the value of your data products, and of the resources invested in building them, is compromised.
And just like in the migration, you should start with the datasets that have the highest impact on the business and data consumers.
Why Focus on dbt Pipelines First?
Trying to monitor everything is tempting, but focusing on dbt pipelines first delivers the biggest impact. Here’s why:
- Higher Business Value: dbt pipelines are typically central to decision-making and strategy. Legacy systems, on the other hand, often support ad-hoc or lower-priority use cases. Focusing observability on dbt ensures your efforts target the data that drives your business forward and delivers the most value.
- Quick Wins with Minimal Effort: Adding observability to dbt workflows is straightforward with Elementary. Our seamless integration makes onboarding fast, allowing you to see results and benefits almost immediately.
- Building Trust in Your Investment: Implementing dbt is a big investment. Adding observability helps your organization see the results of that effort right away. Reliable, high-quality data builds confidence and shows the value of your modernized stack.
- Better ROI: Just as legacy systems are harder to maintain, they are harder to monitor. They often lack the scalability and compatibility of modern workflows, so monitoring them can be more complex and resource-intensive, and it’s harder to see meaningful returns.
- Effective Coverage: In observability, a key to success is monitoring what actually matters. It is counterintuitive, but more coverage won’t always lead to better outcomes. Optimal coverage will. Focusing on critical assets reduces noise and creates an effective operation.
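To illustrate the “quick wins” point above: with the Elementary dbt package installed, adding anomaly monitoring to a dbt model is a few lines of YAML in the model’s properties file. This is a minimal sketch; the model name `orders`, the column name `order_amount`, and the chosen monitors are illustrative, not a prescription:

```yaml
# models/schema.yml -- model and column names here are illustrative
models:
  - name: orders
    tests:
      # flag unexpected drops or spikes in the table's row count
      - elementary.volume_anomalies
    columns:
      - name: order_amount
        tests:
          # track column-level metrics and alert on anomalies
          - elementary.column_anomalies:
              column_anomalies:
                - null_count
                - average
```

Because these are regular dbt tests, they run as part of your existing `dbt test` or `dbt build` invocations, which is why onboarding on dbt pipelines is fast relative to instrumenting a legacy system from scratch.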
Once dbt workflows are stable and monitored, you can decide on your approach regarding the rest of the datasets. Starting with dbt ensures your foundation is strong before expanding further.
Should you invest in monitoring legacy pipelines?
Legacy pipelines are often part of the “pre-migration” state. They sit in older systems and may no longer be central to your strategy. What we’ve often seen is that these datasets are used less frequently for critical analytics. They often support more ad-hoc, one-off use cases or workflows, which is why they haven’t been migrated to dbt.
By adopting dbt, you’re making decisions about which data matters most to you right now. It’s about prioritizing the data that drives the biggest impact and ensuring it’s reliable. If you have legacy pipelines that meet these criteria, their migration should be prioritized so they are handled with the highest standards and best technology.
As a forward-thinking team, consolidating your critical data assets on your most advanced and well-maintained infrastructure is the right path.
That doesn’t mean legacy data will never matter. For many organizations, the path forward involves addressing both modern and legacy systems over time. To align with this, we plan to add support in Elementary for additional data pipelines and systems.
A Practical Approach to Adopting Observability
In summary, here’s what we recommend:
- Selective, ROI-Driven Monitoring: Focus enhanced monitoring on pipelines that are considered critical assets. This allows you to maintain visibility and detect essential issues without over-investing resources in systems that are likely to be deprecated.
- Plan for Gradual Coverage: Just like a migration project, this is a process. Start with where you can make the most impact, and aim to consolidate how you monitor your critical pipelines, just like you aim to consolidate how you build and maintain them.
By taking a thoughtful and gradual approach, you can increase data reliability effectively while prioritizing the modernization efforts that deliver the most impact today.
Will Elementary expand its coverage beyond dbt?
Certainly!
We already started with supporting non-dbt tables in data warehouses and BI tools. Beyond that, our future expansion will be guided by the needs of our customers. We’re committed to ensuring data reliability across diverse ecosystems, and we’ll continue to explore how we can best support legacy systems based on demand. Our goal is to provide seamless observability wherever our users need it most.