# Data & Analytics Roadmap

> Need to navigate somewhere specific? Just click the links below.

![](https://i.imgur.com/Q91Vn8V.png)

# Table of Contents

- [Current State of Edgio](#current-state-of-edgio)
  - [Data Layer](#data-layer)
  - [Existing Entity & Technologies](#existing-entity--technologies)
- [Audit Recommendations](#audit-recommendations)
- [Edg.io Data Layer Roadmap](#edgio-data-layer-roadmap)
  - [Departments Involved](#departments-involved)
  - [Goals](#goals)
  - [Foundation Technologies](#foundation-technologies)
  - [Requested Priorities](#requested-priorities)
- [Requirements Roadmap](#requirements-roadmap)
  - [90 Day Back-to-Basics](#90-day-back-to-basics)
  - [Preface **IMPORTANT**](#preface-important)
  - [The Era of Validation](#the-era-of-validation)
  - [Summary TL;DR](#summary-tldr)
  - [Wrap-up](#wrap-up)
  - [Stage 1: Define Business Objectives](#stage-1-define-business-objectives)
  - [Stage 2: Getting Started: Core Events & Philosophy](#stage-2-getting-started-core-events--philosophy)
  - [Stage 3: Event Naming Conventions](#stage-3-event-naming-conventions)
  - [Stage 4: Develop a Tracking Plan](#stage-4-develop-a-tracking-plan)
  - [Stage 5: What is the Segment Destination Catalog?](#stage-5-what-is-the-segment-destination-catalog)

| Section | Summary |
| --- | --- |
| Current State of Edgio | An overview of the current state of Edg.io's data layer, including existing entity and technologies. |
| Audit Recommendations | A list of audit recommendations for Edg.io's data layer. |
| Edg.io Data Layer Roadmap | The Edg.io Data Layer Roadmap, including departments involved, goals, foundation technologies, and requested priorities. |
| 90 Day Back-to-Basics | An overview of the 90 Day Back-to-Basics approach to creating a data layer that works for the entire organization. |
| Stage 1: Define Business Objectives | The steps for defining business objectives. |
| Stage 2: Getting Started: Core Events & Philosophy | The core events and philosophy for tracking data. |
| Stage 3: Event Naming Conventions | Best practices for event naming conventions. |
| Stage 4: Develop a Tracking Plan | The steps for developing a tracking plan. |
| Stage 5: What is the Segment Destination Catalog? | An overview of the Segment Destination Catalog. |

## Companion Documents:

- [3 Phases: Big Picture | Shared January 2023](https://hackmd.io/@lDItpNg_SbyWB8xQKDnwgQ/B1C1zqRqj)
- [Programmatic SEO Strategy | Shared January 2023](https://hackmd.io/@lDItpNg_SbyWB8xQKDnwgQ/BkMRpy1oj)

# Current State of Edgio

Edgio is a company that provides data-driven solutions to businesses of all sizes. Unfortunately, its current data offering lags behind comparable companies such as Cloudflare and Vercel.

## Data Layer

Several criteria were considered when evaluating Edgio's data layer:

- **Customer Data Platform** (5/5): Edgio has a well-defined customer data platform that provides a strong foundation for data analysis.
- **Integration Strategy** (1/5): The integration strategy is not comprehensive enough to support end-to-end data analysis.
- **Upkeep** (1/5): The data layer requires heavy maintenance to ensure that data is collected and stored accurately.
- **Schema Conventions** (1/5): Data schemas are not up to par with industry standards.
- **Redundancy** (1/5): The data layer is prone to redundant data collection.
- **Event Naming Conventions** (1/5): Events in the data layer are not named consistently.
- **Property Passthrough** (1/5): Edgio does not have a robust property passthrough system.
- **Context Passthrough** (1/5): The context of collected data is not easily identifiable.
- **Attribute Passthrough** (1/5): The attributes of collected data are not easily identifiable.
- **Identity Events** (3/5): Identity events exist but are not easily identifiable.
- **Group Events** (1/5): Group events are not easily identifiable.
- **Engineering Patterns** (2/5): Engineering patterns are not up to par with industry standards.

## Existing Entity & Technologies

Several criteria were considered when evaluating the existing entity and technologies of Edgio:

- **Technology Selection** (5/5): Edgio has committed to investing in solid technologies.
- **Technology Adoption** (2/5): The team lacks the skills and training to make the most of those technologies.
- **Technology Strategy** (1/5): The technology strategy is not comprehensive enough to support a complete data solution.
- **Technology Upkeep** (1/5): The technologies require heavy maintenance to ensure that data is collected and stored accurately.
- **Technology Productivity** (1/5): The team cannot make the most of the technologies due to a lack of training and enforcement.

Overall, Edgio's data-driven solutions are not up to par with other companies. Several areas must be improved, from the data layer and integration strategy to the existing entity and technologies. With more focus on training and enforcement, and with more comprehensive strategies, Edgio can improve its data-driven solution and provide better customer value.

# Audit Recommendations

To improve the quality of Edgio's data-driven solutions, the following audit recommendations should be implemented:

**Data Layer**

- Create a more comprehensive integration strategy to ensure all data sources are connected and accurately reported.
- Implement regular maintenance on the data layer to ensure that data is collected and stored accurately.
- Establish standards for data schemas, event naming conventions, property passthrough, context passthrough, and attribute passthrough to ensure data accuracy.
- Establish standards for identity events and group events to ensure data accuracy.
- Adopt engineering patterns that are up to par with industry standards.

**Existing Entity & Technologies**

- Increase team training and enforcement to ensure the technologies are used correctly and to their fullest potential.
- Establish a more comprehensive technology strategy so the data solution is used to its fullest potential.
- Increase technology upkeep to ensure that data is collected and stored accurately.
- Increase organizational competency with current technologies so the team can make the most of them.
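One way to enforce the naming and schema standards above is a small automated check in the pipeline. The sketch below is illustrative only: the event shape and the Title Case / snake_case rules follow the conventions this document recommends later, and the function name is a hypothetical example.

```javascript
// Illustrative sketch of an automated standards check for incoming
// events, assuming events arrive as { event, properties } objects.
const TITLE_CASE = /^[A-Z][a-z0-9]*( [A-Z][a-z0-9]*)*$/;
const SNAKE_CASE = /^[a-z][a-z0-9]*(_[a-z0-9]+)*$/;

function auditEvent({ event, properties = {} }) {
  const issues = [];
  if (!TITLE_CASE.test(event)) {
    issues.push(`event name "${event}" is not Title Case`);
  }
  for (const key of Object.keys(properties)) {
    if (!SNAKE_CASE.test(key)) {
      issues.push(`property key "${key}" is not snake_case`);
    }
  }
  return issues;
}

// A well-formed event passes; a malformed one is flagged.
console.log(auditEvent({ event: 'Blog Post Read', properties: { post_title: 'Roadmap' } }));
console.log(auditEvent({ event: 'blogPostRead', properties: { postTitle: 'Roadmap' } }));
```

A check like this could run as part of QA on every stream, turning the standards above from a style guide into something enforceable.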
# Edg.io Data Layer Roadmap

```markmap
# Edg.io Data Layer
## Goals
- Establish a Source of Truth
- Ensure Data Redundancy
- Establish a Validation Era
- Improve Data Visualization
- Automate Recurring Reporting
- Automate Data Enrichment and Aggregation
- Automate Recurring Workflows
- Increase Data Accessibility
- Improve Data Flexibility
- Take Advantage of Growth Signals
- Reduce Technology
- Adopt New Technology
- Standardize Technology Ratings and Reviews
- Improve Intelligent Activity Scoring
## Foundation Technologies
- Segment.com
- NodeJS
- GROQ
- MxGraph
- MermaidJS
- D3js
- Slack
- Looker
## Requested Priorities
- Reports
- SDR/Sales Activity Tracking
- Campaign Tracking
- Consolidated View of Marketing Efforts
- Account Engagement
- Data
- Immediate Requests
- API Integrations
- Integrations & QA
```

At Edg.io, we're leveraging the latest in data and analytics technology to create a data layer that works for the entire organization. Our goal is to ensure that our data layer is comprehensive, reliable, and actionable. This document outlines our current priorities, order of operations, processes, goals, and details around building out our data layer.

## Departments Involved

To reach our goals, we'll need to deploy resources in the following departments:

**Product/Engineering**:
- Product event refactoring and documentation
- Automation, workflows, and routing management
- Custom APIs and webhooks

**Community/Product**:
- Product data schema feedback and review
- Reporting and visualization feedback and review

**Marketing**:
- Product and funnel event schema feedback and review
- Reporting and visualization feedback and review

**Sales**:
- Enrichment and process feedback and review
- Account event schema feedback and review
- Reporting and visualization feedback and review

**Customer Success**:
- Product event feedback and review
- Reporting and analytics review

## Goals

**Data Layer Automation**

```sequence
Note over Segment: Segment API, NodeJS, Webhooks/Sockets
Segment->NodeJS: Stream and Query Data with GROQ
NodeJS->Segment: Custom APIs and Webhooks
Segment->NodeJS: Automation, Workflows, and Routing Management
NodeJS->Segment: Data Enrichment and Aggregation
Segment->NodeJS: Automate Recurring Workflows
NodeJS->Segment: Automate Recurring Reporting
Segment->NodeJS: Increase Data Accessibility
NodeJS->Segment: Improve Data Flexibility
Segment->NodeJS: Take Advantage of Growth Signals
NodeJS->Segment: Reduce Technology
Segment->NodeJS: Adopt New Technology
NodeJS->Segment: Standardize Technology Ratings and Reviews
Segment->NodeJS: Improve Intelligent Activity Scoring
```

To ensure that our data layer is successful, we've identified the following goals:

- **Establish a Source of Truth**: Using Segment's API, NodeJS, and webhooks/sockets, stream and query data with GROQ in and out of every technology without a complicated integration strategy.
- **Ensure Data Redundancy**: Use GROQ to query data from any source and build sophisticated queries, ensuring that all data is properly stored and backed up.
- **Establish a Validation Era**: Use NodeJS to create the custom APIs and webhooks that build an automated data layer.
- **Improve Data Visualization**: Use MxGraph, MermaidJS, and D3js to create visual representations of data that help us understand and analyze it.
- **Automate Recurring Reporting**: Use Looker to create custom dashboards and metrics that track performance.
- **Automate Data Enrichment and Aggregation**: Use NodeJS to automate data enrichment and aggregation.
- **Automate Recurring Workflows**: Use NodeJS to create automated workflow processes.
- **Increase Data Accessibility**: Use Segment's API, NodeJS, and webhooks/sockets to easily access and query data from any source.
- **Improve Data Flexibility**: Use NodeJS to customize our data layer to meet the organization's needs.
- **Take Advantage of Growth Signals**: Use NodeJS to create automated workflows that act on growth signals.
- **Reduce Technology**: Use only the necessary technologies to reduce the cost and complexity of our data layer.
- **Adopt New Technology**: Integrate new technologies into our data layer as they become available.
- **Standardize Technology Ratings and Reviews**: Use NodeJS to automate the standardization of technology ratings and reviews.
- **Improve Intelligent Activity Scoring**: Use NodeJS to automate improvements to intelligent activity scoring.
## Foundation Technologies

```mermaid
graph TD
A[Segment] --> B[Foundation Technologies]
B --> C[Generated Reports]
B --> D[Workflow Automation]
B --> E[Slack Assistant Knowledge Base]
B --> F[Report Generator]
B --> G[Enrichment of Data]
```

**Automating Recurring Reporting**

```sequence
Segment.com->NodeJS: Stream and Query Data with GROQ
NodeJS->Looker: Create Custom Dashboards and Metrics
Looker->NodeJS: Automate Recurring Reporting
NodeJS->Segment.com: Automate Data Enrichment and Aggregation
```

To make our data layer a success, we're utilizing the following technologies:

**Segment.com**: A customer data platform that helps us collect, explore, and act on customer data. It also provides a universal way for our data flow to stay consistent across applications. By combining Segment's API, NodeJS, and webhooks/sockets, we can stream and query data with GROQ in and out of every technology without a complicated integration strategy, giving us a fully agnostic approach to consuming, analyzing, and acting on data signals.

**NodeJS**: Used for automation, workflows, and routing management. NodeJS provides an easy-to-use programming environment for building applications that handle data operations reliably and efficiently, and it allows us to develop the custom APIs and webhooks that power an automated data layer.

**GROQ**: A data query language used to query data from any source and to build the sophisticated queries we need to analyze and interpret data.

**MxGraph**: Creates diagrams from data using NodeJS, allowing developers to build visual representations of data that help us understand and analyze it.

**MermaidJS (Mx + others)**: Used for data visualization. MermaidJS provides an easy-to-use framework for creating data visualizations.

**D3js**: A powerful data visualization library for building detailed visualizations that work across presentation platforms such as Slack, the web, and exportable documents.

```graphviz
digraph {
  Person1 -> "Needs data about a specific feature usage for Edg.io Sites" -> RequestData
  Person2 -> "Looking for information on the Walmart Account" -> RequestData
  Person3 -> "Is an executive trying to find recent Performance data of Edg.io vs Vercel vs Cloudflare" -> RequestData
  RequestData -> "/edgio" -> "Instant Access"
  "Instant Access" -> "Completely Customizable" -> "Real-Time Collaboration"
  "Real-Time Collaboration" -> "Produce Results" -> "When You Need Them"
  "When You Need Them" -> "Not Weeks After You Request Them"
}
```

**Slack**: Provides an easily accessible way for teams and leaders to access data from the warehouse, as well as to generate reports, visualizations, and any other utilities the team needs. Used in conjunction with the other foundation technologies, Slack becomes a powerful tool for a rapid, real-time data framework that regularly turns the organization's data into actionable results.

**Looker**: A BI tool with an easy-to-use, easy-to-understand interface for data visualization and analysis. It lets users quickly create and explore data visualizations with minimal technical knowledge, making it accessible to all team members, and it provides powerful features for building insights, analyses, and custom dashboards and metrics to track performance.
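To illustrate how NodeJS and MermaidJS can work together here, the sketch below builds a Mermaid pie-chart definition from warehouse counts; the rendered diagram could then be embedded in a document or posted to Slack. The function name and input shape are assumptions for illustration.

```javascript
// Illustrative: generate a MermaidJS pie-chart definition from a map
// of labels to counts (the data shape here is hypothetical).
function toMermaidPie(title, counts) {
  const lines = [`pie title ${title}`];
  for (const [label, value] of Object.entries(counts)) {
    lines.push(`  "${label}" : ${value}`);
  }
  return lines.join('\n');
}

console.log(toMermaidPie('Signups by Source', { organic: 120, paid: 45, referral: 18 }));
```

Because the output is plain text, the same generator works anywhere Mermaid renders, which is the appeal of text-based diagram tooling in this stack.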
## Requested Priorities

The requested priorities outlined above are the top-of-mind priorities of Edg.io's internal leadership and individual contributors. These areas are the most pressing for the team and must be addressed as soon as possible.

To address them, Edg.io must start by going back to basics and establishing an organizational data strategy. This strategy will not only allow the team to make the most of its technologies, it will also allow them to act in more effective ways. By creating a comprehensive data strategy, Edg.io can ensure that its data is organized, reliable, and actionable.

- **Reports**: Reports are the top priority for Edg.io, as they allow the team to quickly access data from the warehouse and generate reports, visualizations, and any other utilities the team may need.
- **SDR/Sales Activity Tracking**: An essential part of Edg.io's data layer, helping the team track customer engagement and sales performance.
- **Campaign Tracking**: Also essential, helping the team track the success of marketing campaigns.
- **Consolidated View of Marketing Efforts**: Allows the team to measure the impact of marketing efforts in real time.
- **Account Engagement**: Helps the team measure customer engagement and loyalty.
- **Data**: Anything that would help identify someone as a potential prospect, such as technographics (CDN, etc.), firmographics, demographics, market awareness, SEO, and sales enablement.
- **Immediate Requests**: Edg.io has identified a number of requests, including a data layer strategy, data layer validation, data layer modeling, Hubspot integrations & validation, account dedupe & reconciliation automation, data warehouse management, enrichment & aggregation, Similarweb automation, and 6Sense automation.
- **API Integrations**: Edg.io also utilizes API integrations, such as 6Sense and SimilarWeb.
- **Integrations & QA**: Edg.io also maintains integrations & QA for tools such as BuiltWith, HighSpot, Klue, Metadata.io, MixPanel, Qualified, SalesLoft, Segment, Sprout Social, Vidyard, Wappalyzer, and Wiza.

# Requirements Roadmap

```mermaid
gantt
    title Analyzing 2023 Projects
    dateFormat YYYY-MM-DD
    section Establish a Source of Truth
    Utilize Segment's API, NodeJS, and Webhooks/Sockets :a1, 2023-02-01, 5d
    Stream and query data with GROQ :after a1, 5d
    section Ensure Data Redundancy
    GROQ query data from any source :a2, 2023-02-08, 5d
    Create sophisticated queries :after a2, 5d
    section Establish a Validation Era
    Create custom APIs and webhooks :a3, 2023-02-15, 5d
    section Improve Data Visualization
    Utilize MxGraph, MermaidJS, and D3js :a4, 2023-02-22, 5d
    section Automate Recurring Reporting
    Utilize Looker :a5, 2023-03-01, 5d
    section Automate Data Enrichment and Aggregation
    Utilize NodeJS :a6, 2023-03-08, 5d
    section Automate Recurring Workflows
    Utilize NodeJS :a7, 2023-03-15, 5d
    section Increase Data Accessibility
    Utilize Segment's API, NodeJS, and Webhooks/Sockets :a8, 2023-03-22, 5d
    section Improve Data Flexibility
    Utilize NodeJS :a9, 2023-03-29, 5d
    section Take Advantage of Growth Signals
    Utilize NodeJS :a10, 2023-04-05, 5d
    section Reduce Technology
    Utilize only necessary technologies :a11, 2023-04-12, 5d
    section Adopt New Technology
    Integrate new technologies :a12, 2023-04-19, 5d
    section Standardize Technology Ratings and Reviews
    Utilize NodeJS :a13, 2023-04-26, 5d
    section Improve Intelligent Activity Scoring
    Utilize NodeJS :a14, 2023-05-03, 5d
    section Review and Refine
    Review and Refine :a15, 2023-05-10, 5d
    section Finalize
    Finalize :a16, 2023-05-17, 5d
    section Implement
    Implement :a17, 2023-05-24, 5d
    section Monitor
    Monitor :a18, 2023-05-31, 5d
    section Refine
    Refine :a19, 2023-06-07, 5d
    section Evaluate
    Evaluate :a20, 2023-06-14, 5d
    section Refine
    Refine :a21, 2023-06-21, 5d
    section Monitor
    Monitor :a22, 2023-06-28, 5d
```

## 90 Day Back-to-Basics

### Preface **IMPORTANT**

Data is the lifeblood of any successful organization. Without accurate and reliable data, it is difficult to make informed decisions, track progress, and measure success. That's why it's so important to have a roadmap for data requirements.

At Edg.io, we understand the importance of having a data layer that works for the entire organization. That's why we've created the Requirements Roadmap: a 90-day back-to-basics approach to creating a data layer that works for the entire organization.

The Requirements Roadmap is designed to help organizations define their business objectives, get started with core events and tracking, create a tracking plan, and understand the Segment Destination Catalog. It also covers custom destinations, which can be used to create custom workflows, validate data, and create unique signals and reports. By following the Requirements Roadmap, organizations can ensure that their data is organized, reliable, and actionable, which helps them make better decisions, track progress, and measure success.

#### **The Era of Validation**

> The Era of Validation is the moment at which we declare that the **active warehouse** and all streams are producing accurate and consistent data. We're using a back-to-basics approach that will be conducted in tandem with existing priorities.
>
> All existing data will go through validation and scrubbing via automation. Data that requires further checking will be flagged by the automation for manual review.

The goal is to process the **"Legacy" events** that pass data into Segment while also passing the new data. The main difference is that the two warehouses will no longer collide, and data that isn't emitted from the new warehouse will be excluded from existing reports unless actively added via a Report Request.

#### Summary TL;DR

The Requirements Roadmap is a 90-day back-to-basics approach to creating a data layer that works for the entire organization. It helps organizations define their business objectives, get started with core events and tracking, create a tracking plan, and understand the Segment Destination Catalog. It also covers custom destinations, which can be used to create custom workflows, validate data, and create unique signals and reports. The Era of Validation is the moment at which we declare that the active warehouse and all streams are producing accurate and consistent data.

#### Wrap-up

- Define business objectives
- Get started with core events and tracking
- Create a tracking plan
- Understand the Segment Destination Catalog
- Utilize custom destinations
- Ensure data is organized, reliable, and actionable
- Validate and scrub data via automation
- Flag data for manual review

### Stage 1: Define Business Objectives

At Edg.io, our goal is to create a data layer that works for the entire organization. To do this, we need to define our business objectives so that we can track and measure progress toward our goals.

#### **What kind of events or data best illustrate or explain how your customers use your product?**

> At Edg.io, we want to track events related to customer usage, such as page views, clicks, downloads, and sign-ups. Additionally, we want to track customer engagement metrics, such as time spent on page, session length, and page visits.
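The usage events above could be captured as Segment-style `track` calls. The payloads below are a hedged sketch: the event and property names are illustrative examples (following the Object + Action and snake_case conventions recommended in Stage 3), and in production they would be sent with Segment's analytics SDK rather than built by hand.

```javascript
// Illustrative Segment-style track payloads for the Stage 1 events
// (names and values are hypothetical examples, not the live schema).
function trackPayload(userId, event, properties) {
  return {
    type: 'track',
    userId,
    event,
    properties,
    timestamp: new Date().toISOString(),
  };
}

const pageViewed = trackPayload('user_123', 'Page Viewed', {
  page_path: '/pricing',
  session_length_seconds: 42,
});

const signupCompleted = trackPayload('user_123', 'Signup Completed', {
  plan: 'free',
  referrer: 'blog',
});

// In production, something like: analytics.track(pageViewed)
```

Writing out payloads like this for each business objective is also a useful first draft of the tracking plan developed in Stage 4.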
#### **How do people discover, start using, and pay for your product?**

> At Edg.io, we want to track data related to how customers discover, start using, and pay for our product. This includes tracking data related to ad campaigns, link clicks, articles completed, campaigns opened, forms submitted, and user signups.

#### **What are the most important steps in a customer's journey?**

> At Edg.io, we want to track data related to the most important steps in a customer's journey. This includes data related to ad campaigns, link clicks, articles completed, campaigns opened, forms submitted, and user signups. Additionally, we want to track customer engagement metrics, such as time spent on page, session length, and page visits.

### Stage 2: Getting Started: Core Events & Philosophy

#### What are the core events that should be tracked to get started?

> The core events to track at the start include page or screen views, user actions, and any other way users can interact with the site or app. This should be limited to a few core events, with properties added to provide context about them.

#### What is the "less is more" philosophy of tracking data?

> The "less is more" philosophy of tracking data is the idea that tracking fewer events, with more properties to provide context about them, leads to better data accuracy. This approach allows for more focused analysis and helps the team better understand the data and its implications.

#### What are the pros and cons of the "less is more" and "track more and analyze later" approaches?

> The pros of the "less is more" approach are that it allows for more focused analysis and helps the team better understand the data and its implications. The cons are that it requires a lot of upfront planning, and the team may miss out on data that could be valuable.
>
> The pros of the "track more and analyze later" approach are that it allows for more flexibility and lets the team collect more data. The cons are that the data can be more difficult to analyze and interpret, as well as more costly to store and maintain.

### Stage 3: Event Naming Conventions

![](https://i.imgur.com/KCqY4kk.png)

> I recommend using Title Case for event names and snake_case for property names. This ensures that event names and properties are easily readable and understood by the team and organization.

#### What Event Name Structure Should be Used?

> Use the Object (Blog Post) + Action (Read) framework for event names. This structure is easy to read and understand, allowing for better data analysis and understanding.

#### What Should be Avoided When Creating Event Names?

> When creating event names, it is crucial to avoid pulling a dynamic value into the event name (for example, `User Signed Up (11-01-2019)`). This can lead to huge numbers of tables and schema bloat. It is also important to avoid adding values to event names when they could be a property, and to avoid creating property keys dynamically.

### Stage 4: Develop a Tracking Plan

#### What is a tracking plan?

> A tracking plan is a document that outlines the events that should be tracked, where those events live in the code base, and why they are being tracked (from a business perspective). It serves as a project management tool to get the organization in agreement about what data to use to make decisions, and it helps build a shared understanding of the data among marketers, product managers, engineers, analysts, and any other data users.

#### What are the benefits of a tracking plan?

> A tracking plan provides a single source of truth about what data is being collected and why. It also helps to ensure that data is organized, reliable, and actionable. Additionally, a tracking plan can help identify what information should be collected as traits to group users together, and plan how to collect that information.

#### What should be included in a tracking plan?

> A tracking plan should include the following:
> - Identify and Group calls
> - Track events
> - Track event properties
> - Data
> - Immediate requests
> - API integrations
> - Integrations & QA

#### How do I create a tracking plan?

> To create a tracking plan, start by identifying the events that are directly tied to one of your business objectives.
>
> Then, plan the Identify and Group calls, and the Track events and properties. Additionally, plan for data, immediate requests, API integrations, and integrations & QA.
>
> Finally, create a spreadsheet to maintain the tracking plan, and use it as a project management tool to get the organization in agreement about what data to use to make decisions.

### Stage 5: What is the Segment Destination Catalog?

The Segment Destination Catalog is a comprehensive list of hundreds of tools that can be integrated with Segment, including analytics tools, marketing automation tools, customer data platforms, and more. The catalog includes detailed information about each tool, such as how Segment transforms data for the destination tool, troubleshooting tips, set-up instructions, and implementation considerations.

#### What information can I find in the Segment Destination Catalog?

The catalog includes detailed information about each tool, including how Segment transforms data for the destination, troubleshooting tips, set-up instructions, and implementation considerations. You can also find information about which Segment methods each tool accepts, which connection modes each tool supports, and which destinations may need to be bundled.

#### What is a custom destination?

A custom destination is a destination that is not included in the Segment Destination Catalog. Custom destinations allow you to send data to any endpoint, such as a webhook, an API, or a database, and can be used to create custom workflows, such as sending data to a custom analytics tool or a custom marketing automation tool.

#### What are the Benefits of Custom Destinations?

> Custom destinations offer a number of benefits, including:
> - **Automated enrichment workflows**: Create automated workflows that enrich data in real time, adding additional data points to existing data sets, such as customer demographics or technographic data.
> - **Data validation**: Validate data before it is sent to other systems, ensuring the data is accurate and up to date and reducing errors.
> - **Unique signals and reports**: Create unique signals and reports tailored to the needs of the organization, useful for gaining insights into customer behavior or tracking performance metrics.
> - **Customized reporting**: Create customized reports that can be accessed from anywhere, making it easy to track performance metrics and gain insights into customer behavior.
> - **Easy integration of additional tools**: Integrate additional tools into the data layer, enabling a more comprehensive view of customer behavior and performance metrics.
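As a sketch of the data-validation benefit above, a custom webhook destination might run incoming events through checks like these before forwarding them, flagging failures for the manual review described in the Era of Validation. The field names and rules are illustrative assumptions.

```javascript
// Illustrative validation step for a custom destination: events that
// fail any check are flagged rather than forwarded (rules here are
// hypothetical examples).
function validateEvent(event) {
  const errors = [];
  if (!event.userId && !event.anonymousId) {
    errors.push('missing userId or anonymousId');
  }
  if (event.type === 'track' && !event.event) {
    errors.push('track call is missing an event name');
  }
  if (!event.timestamp || Number.isNaN(Date.parse(event.timestamp))) {
    errors.push('missing or unparseable timestamp');
  }
  return { ok: errors.length === 0, errors };
}

console.log(validateEvent({
  type: 'track',
  userId: 'user_123',
  event: 'Form Submitted',
  timestamp: '2023-02-01T12:00:00Z',
}));
```

Gating every downstream system behind a step like this is what makes the "accurate and consistent data" declaration of the Era of Validation enforceable rather than aspirational.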