Ryan Goodman has been in the business of data and analytics for 20 years as a practitioner, executive, and technology entrepreneur. Ryan recently created DataTools Pro after 4 years working in small business lending as VP of Analytics and BI, where he implemented an analytics strategy and competency center for the modern data stack, data science, and governance. Drawing on his recent experience as a customer and now running DataTools Pro full time, Ryan writes regularly for Salesforce Ben and Pact on the topics of Salesforce, Snowflake, analytics, and AI.
We built DataTools Pro first and foremost for individual contributors who understand the impact of turning the treasure trove of Salesforce metadata into real time savings. The by-product of DataTools Pro's new Salesforce metadata analysis and generation tools is a set of information assets that will help business users, and soon AI agents, learn and understand the relationships between your data, analytics, Salesforce, and other business applications.
The same way “metadata” connects and explains the relationships and meaning of data, we want to transform explanation into true understanding between data, Salesforce, and analytics teams.
The key to Salesforce Data Cloud success is mastery of metadata. In the most recent Salesforce earnings call, metadata was a hot topic.
“But the AI is not going to work because it needs to have the seamless, amalgamated data experience of data and metadata. And that’s why our data cloud is like a rocketship.”
Marc Benioff
We are excited to share some new DataTools Pro features that put Salesforce metadata to work for you to accelerate onboarding for data workers and soon for new AI agents!
A smarter Metrics Analyst AI – Contextual metrics recommendations
We are rolling out the second release of our Metrics Analyst AI to round out our original vision: recommendations get smarter as your library of metrics grows. We have improved results and automated relating dashboards and reports to metrics as one step.
Look out for updates and announcements, where we will showcase how Metrics Analyst AI takes common change management challenges head on!
Visualize your metrics influence – Metrics map data visualization
There is no better way to conceptualize and understand complex relationships than to visualize them. We have built the first node of our metrics map vision, allowing you to see, at a glance, how a single metric relates to data, analytics, and business topics!
As your metrics library grows, like any data set, so does your need to manage that data over time. While Metrics Analyst will help make good recommendations, DataTools Pro is getting enhanced merging functions that make it easier to enrich metrics and prevent duplicates. We are actively working with our first power users to continue expanding our merge functions to balance fine-grained control with automated recommendations!
Expanding metrics ingest and management with Tableau Pulse
We love the new Tableau Pulse advancements, which make it fast and easy to build powerful metrics-based analytics. As you implement and grow your metrics library, it will quickly require the same metrics management and relationship management that we are performing for Salesforce. Document and manage all of your Salesforce and Tableau based metrics in one place!
One of the best tools for visualizing and conceptualizing relationships between any topics is a mind map. We started with a mind map when we launched DataTools Pro in late 2023. A mind map is easy to conceptualize visually as we connect the dots between people, process, metrics, and data. This is something all enterprises struggle with while transitioning from service and product based businesses to information based businesses. Businesses are not static, so managing complex relationships that change regularly requires building and understanding those relationships at the speed business happens!
As we started turning our mind map concept into reality, we knew that relating metrics, topics, data, and analytics assets like reports, and understanding the changes that occur across them, is hard enough.
That is where data visualization delivers immense value to bring data to life. The same way data professionals understand “Entity Relationships”, business professionals should have “Metrics Relationships” to understand how business initiatives, operations, and strategy connect.
That is why we created our Metrics Map visualization, powered by DataTools Pro, to systemize this process. The first iteration makes each metric the center of the universe (in our visualization). From a single metric, we want to know what influences a metric or KPI and what the metric has influence over. With this starting point to discover, understand, and relate metrics, we can work backwards to data and forwards to outcomes!
Many analytics industry tech companies have focused on solving problems of accelerating data acquisition, transformation, and delivery. Generative AI without contextual metrics glossaries jam packed with metadata will produce negligible results. It is the equivalent of hiring a data analyst and not explaining how goals, metrics, and analytics relate to the decisions and outcomes they influence.
We are excited to work with a number of like-minded partners in the realm of AI and data management to demonstrate the profound improvements we are seeing when feeding our soon-to-be-released metrics map API into generative AI analyst agents!
Create your first Metrics Mind Map from Salesforce and Tableau Pulse!
DataTools Pro is freely available for individuals and supports Salesforce and Tableau Pulse to build metrics glossaries and metrics maps.
March Madness is our favorite time of year, when the top college basketball programs face off on their road to the Final Four. March Madness earned its name from intense competition and exciting buzzer beater finishes!
In the spirit of March Madness, we have our own road to AI, where we are looking at 4 important factors that will directly influence near term AI adoption and success. Our team reviewed a list of 15 topics and narrowed it down to our own Final Four for 2024!
1. AI Ethics and Privacy: AI ethics and privacy tackle the moral principles and data protection measures critical to maintaining user trust and upholding human rights in the digital age.
2. Large Language Models: Large language models, like GPT, have transformed natural language understanding and generation, enabling more sophisticated and nuanced human-AI interactions.
3. Data Governance: Data governance ensures the proper management, quality, and security of data assets, serving as the backbone for trustworthy AI systems.
4. AI Chatbots and Co-Pilots: AI chatbots and co-pilots are enhancing work productivity and knowledge experience through large language models.
In March, we are going to deep dive into these topics and let them face off head-to-head. We are set for an exhilarating journey of discovery and debate while enjoying a few weeks of exciting basketball at the same time! Join our LinkedIn newsletter for updates!
AI Powered Picks for the 2024 March Madness
We created a GPT March Madness bracket bot, available in the OpenAI GPT Store, to help anyone wanting to make picks based purely on season stats. The beauty of March Madness is that the stats don't matter once teams face off. We expect our stats-driven bracket to be busted by the end of the first weekend!
We were thrilled to extend an invitation to the unveiling of DataTools Pro Metrics Analyst for Salesforce – your key to transforming your Salesforce organization into a beacon of metrics and KPI excellence.
Webinar Date: March 13 2024 9:30 AM PST / 12:30 PM EST
Register to get access to the recording – Week of 3-18-2024
In just 40 minutes, discover how to revolutionize the way you align and agree on KPIs, all with the speed and precision that only AI-aided automation can offer. This is more than a webinar; it’s a doorway to enhancing productivity and insights within your Salesforce org.
What You Will Learn
Plug and Play Salesforce Connected App: Seamless ways to incorporate DataTools Pro into your existing Salesforce org with 1 click.
AI-Aided KPI Alignment: How our batch, AI-enhanced metadata analysis fast-tracks consensus on crucial KPIs, making your team more unified and focused.
Real-World Applications: Insightful demonstrations on leveraging DataTools Pro to elevate your organization’s data analysis and decision-making tools.
Interactive Q&A: Have your questions answered in real-time.
Recent AI advancements with large language models have broken through and forever changed how we think about information access and retrieval. Metrics and AI are top of mind as AI agents today provide universal translation and curation of information. Here in our data tools niche, where we wrangle and transform data into information, we are seeing incredible results using AI to write code and deliver the same results that traditionally required an analyst. We don't believe AI will replace analysts, but we already know that AI-augmented retrieval for researching large bodies of information is a job better suited for machines. Enterprises need to re-frame documentation as context data generation for people and AI. You will likely see “knowledge graphs” rise as a hot topic. Unstructured data has always been deemed “untapped” gold, so now the race down the yellow brick road is on!
Behind the Curtain: Unveiling the Reality of Modern Bots
Chatbots have propelled large language models into the forefront. The benefit of these AI chatbots to individuals is the way an LLM can break down knowledge and experience into personalized, bite-sized information chunks. We are not too far from this being a shared experience in a collaborative setting, which is where we can see early adopters supercharge team productivity. The big opportunity is how enterprises will use AI to curate and deliver information using the vast collections of empirical knowledge created over time. It goes without saying that there are lots of smart people working tirelessly on these problems. Here at DataTools Pro, we are a small, scrappy team obsessed with this problem.
Does self-service analytics help?
Analysts, data scientists, and data professionals have always been required to distill complex business concepts into quantifiable analytics. Business Intelligence (management information systems) and Analytics disciplines still have the same problems today as 15 years ago. AI, LLMs, and data platforms will not solve these problems without radical changes in how people work.
Age old “multiple versions of the truth” problems still exist; they have simply moved from spreadsheets to self-service reports and dashboards.
Empirical knowledge gained from pulling data together becomes scattered across spreadsheets, documents, PowerPoints, and email.
Methods to build live, connected semantic layers that categorize and measure quantitative performance remain siloed and technology oriented.
Reports, dashboards, and data remain the primary delivery mechanism for performance metrics and KPIs. The need for speed to prepare and deliver self-service analytics has shortcut the slow moving BI platforms of yesterday. Similarly, modern cloud data platforms have helped democratize working with structured and unstructured data that historically required database administrators, software engineers, and expensive technology components. “Deluge” is the best word to describe the state of most enterprises with regard to the number of data and analytics assets flowing, which has spawned newer “data mesh” and “data fabric” methodologies to help design systems and processes that tackle the deluge problem. We are experimenting with this ourselves with Azure Co-Pilot while our team, data, and metrics library is small.
What about Self Service via Natural Language Requests?
Natural language querying is a feature, not a solution. Many professionals simply do not know what data and metrics are available to start asking questions. This is where AI agents armed with a glossary, semantics, and a large body of context data will be transformational. AI co-pilots are still very new, so we are experimenting ourselves with what is real versus the art of the possible. The keystone is aligning AI and business professionals with a common taxonomy and language, and that is where we are working to build a common thread between business, data, analytics, and soon the auto-pilots aiding these teams.
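To make the idea of "an AI agent armed with a glossary" concrete, here is a minimal sketch in Python. The metric names, fields, and the `build_agent_context` helper are all illustrative assumptions, not the DataTools Pro API; the point is simply that a structured metrics glossary can be flattened into plain-text context for an LLM prompt.

```python
# Hypothetical sketch: a tiny metrics glossary serialized as plain-text
# context for an AI agent prompt. All names and fields are illustrative.

METRICS_GLOSSARY = [
    {
        "name": "Lead Conversion Rate",
        "definition": "Converted leads divided by total leads in the period.",
        "owner": "Sales Ops",
        "related_reports": ["Lead Funnel Dashboard"],
    },
    {
        "name": "Customer Lifetime Value",
        "definition": "Average revenue per customer over the full relationship.",
        "owner": "Finance",
        "related_reports": ["Revenue by Lead Source"],
    },
]

def build_agent_context(glossary):
    """Flatten the glossary into one line per metric for an LLM prompt."""
    lines = []
    for metric in glossary:
        lines.append(
            f"Metric: {metric['name']} | Definition: {metric['definition']} "
            f"| Owner: {metric['owner']} "
            f"| Reports: {', '.join(metric['related_reports'])}"
        )
    return "\n".join(lines)

print(build_agent_context(METRICS_GLOSSARY))
```

With context like this prepended to a user's question, the agent knows which metrics exist before anyone has to ask "what data do we have?"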
What about data?
Many enterprises do not have enough resources behind data governance and management. I still think this is a massive area of opportunity: somehow democratize and distribute data management in a way that is non-intrusive. Otherwise, our point of arrival for AI automation will be autopilots and agents spitting out useless information. Incorrect information leads to mistrust and failed adoption of “co-pilots”.
Data storage is dirt cheap and modern data platforms make it fast and easy to analyze and model data into sophisticated analytics. How do you create universal focus?
All roads to AI metrics and analytics mastery lead back to goals and data governance
Every company has a set of metrics that indicate the health of the business. The financial metrics in your income statement and balance sheet don't get a lot of love on social media, but they are the bedrock of your performance (assuming you are a for-profit business). Highly sophisticated metrics, or amassing hundreds of them, won't translate to good performance. Universal understanding, consistency, correctness, and execution against a finite set of metrics will!
Quick metrics maturity quiz:
Do you have an inventory of all of the metrics your operational team is using?
Who are the owners, stakeholders, and oracles (keepers of institutional knowledge)?
Are you certain those metrics are calculated and deployed consistently across teams and individuals?
Are your business, technology, data, and analytics teams aligned on how to implement these metrics into analytics?
Fact Finding Process:
Traditionally, this is the process most consultants utilize to thoughtfully acquire and organize your metrics glossary. Many enterprises already have documents, presentations, or spreadsheets with this information formally gathered. Rarely are they universally understood and up to date.
Getting consensus and universal understanding is slow and cumbersome. It is one of the challenges we wanted to tackle at DataTools Pro.
From Metrics to Key Performance Indicators and OKRs
There are a number of different organizing principles and methodologies to translate your organizational goals into metrics. A KPI differs from a metric in that it has a specific target, timeline, and direct impact on your organizational goals and objectives. You may have dozens of metrics without targets, and that is okay. There are a number of widely adopted models to help you formally structure and organize your KPIs, starting with SMART goals:
Specific – target a specific area for improvement.
Measurable – quantify or at least suggest an indicator of progress.
Assignable – specify who will do it.
Realistic – state which results can realistically be achieved, given available resources.
Time-related – specify when the result(s) can be achieved.
OKR – Objectives and Key Results
An objective is a clearly defined, inspirational goal aimed at driving motivation and direction. Key results are specific, measurable outcomes used to track the achievement of the objective.
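The metric-versus-KPI distinction above can be captured in a small data model. This is a hypothetical Python sketch, not DataTools Pro's internal schema: a KPI is modeled as a metric that additionally carries a target and a timeline, which mirrors the definition given in the text.

```python
from dataclasses import dataclass

@dataclass
class Metric:
    """A measurable quantity with an agreed definition."""
    name: str
    definition: str

@dataclass
class KPI(Metric):
    """A metric promoted to a KPI by adding a target and a timeline."""
    target: float
    deadline: str  # e.g. "2024-Q4"

def is_kpi(m: Metric) -> bool:
    """A metric qualifies as a KPI only when it carries a target and timeline."""
    return isinstance(m, KPI)

# Illustrative examples: the same definition, with and without a target.
win_rate = Metric("Win Rate", "Won opportunities / closed opportunities")
win_rate_kpi = KPI("Win Rate", "Won opportunities / closed opportunities",
                   target=0.30, deadline="2024-Q4")
```

Keeping the target and deadline on the KPI type, rather than as optional fields on every metric, makes "dozens of metrics without targets" a perfectly valid state rather than a data-quality problem.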
As you get deeper into performance management and process improvement, you will discover what works best for your corporate culture.
How metrics and KPIs are evolving with AI
No article is complete in 2024 without a hot take on AI. A lot of the focus in technology and analytics is centered on amassing and feeding large volumes of quantitative data into machine learning models to predict outcomes. Now with generative AI, we are vectorizing large bodies of data to train, fine-tune, or simply retrieve data using natural language requests. Unfortunately, without the right semantics, definitions, governance, and context data, your AI investment won't feel magical. Our team is racing ahead, knowing the path to meaningful dialogue and results with AI co-pilots requires context. We have taken a novel approach: run alongside enterprises on their journey down the yellow brick road with our upcoming DataTools Pro Metrics Analyst!
Join us for a webinar March 13 where we will formally introduce our Metrics Analyst AI.
We have pooled together Salesforce Data Cloud resources from top experts to help get you up to speed. Salesforce Data Cloud is designed to streamline the organization and unification of data across Salesforce's extensive ecosystem and beyond. Integrating external data sources and Salesforce data seamlessly, Data Cloud was designed to ingest, share, manage, and operationalize data, enabling a deeper connection with customers through personalized experiences and targeted engagement.
At the heart of Data Cloud's capabilities is the creation of unified customer profiles: the holy grail of a single, unified view of a customer that drives understanding and hyper-personalized engagement. Data Cloud isn't a simple Salesforce feature. It is a suite of capabilities that requires a cross-section of skills to succeed.
One of the most useful tools in the admin or data professional's toolkit is the Salesforce entity relationship diagram (ERD). Understanding conceptual and physical data models is difficult enough, and a business stakeholder responsible for sales, marketing, and revenue typically has little interest in the Salesforce data model. But when information coming out of Salesforce is incorrect, sometimes you need to revisit your existing data model.
When bringing Salesforce admin, data, and business professionals together, a conceptual entity relationship diagram is often very useful to align everyone to the same level of understanding and make the right decision moving forward. To help explain and prioritize data work for a client, I recently used our entity relationship diagram to pinpoint and explain the root cause of reporting problems.
Real World Lead Attribution Use Case with Salesforce ERD
Lead attribution is one of the most important and challenging aspects of running your go-to-market stack, and doing it well requires attention to data consistency and quality. One of our customers had an ambitious and practical approach: connect Leads, Accounts, and Opportunities with a junction object called "Vintage". The ability to automatically track a lead vintage (when the lead enters the funnel) is very useful for reporting funnel conversion and lifetime value. Reporting revenue and lifetime value by lead source is important for planning and budgeting independent of campaign activity.
To communicate the issue, I used the following DataTools Pro ERD diagram to demonstrate the additional data relationships that had to be maintained. I also explained how the existing reporting requirements could easily be achieved without the Vintage object. The following is the exact picture I painted to describe the specific linkage that was effectively broken in the lead attribution funnel.
Resolution with Empirical Proof
There were some objections to removing the Vintage object. During the meeting, I clicked through to demonstrate where those data relationships are maintained, which satisfied most objections in real time.
There was one objection we had to clear before deprecating the Vintage object. Using historical data analysis, I discovered the Vintage object's use case occurred in 1 of every 500 opportunities, making it a true edge case. Sometimes you engineer a solution to account for anticipated scenarios that rarely occur in real life; this was one of those cases.
The consensus was that the Vintage object and all of the processes needed to maintain it could be deprecated. Rather than trying to accomplish detailed lead attribution from the Lead object, Campaign and Campaign Member records are used to capture clients that enter the funnel multiple times from multiple channels.
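The core of the argument was that a lead's "vintage" (its first funnel entry) can be derived from campaign member records rather than maintained in a separate junction object. Here is a minimal, hypothetical Python sketch of that derivation; the record shapes are illustrative stand-ins for CampaignMember rows, not the customer's actual schema.

```python
from datetime import date

# Illustrative stand-ins for CampaignMember rows:
# (lead_id, campaign_name, first_touch_date).
campaign_members = [
    ("L1", "Webinar-Jan", date(2024, 1, 10)),
    ("L1", "Email-Mar", date(2024, 3, 2)),   # later touch, same lead
    ("L2", "Paid-Search", date(2024, 2, 5)),
]

def lead_vintage(members):
    """Derive each lead's vintage from its earliest campaign touch.

    Returns {lead_id: (campaign_name, touch_date)} for the first touch,
    so re-entries through later campaigns never overwrite the vintage.
    """
    vintage = {}
    for lead_id, campaign, touch_date in members:
        current = vintage.get(lead_id)
        if current is None or touch_date < current[1]:
            vintage[lead_id] = (campaign, touch_date)
    return vintage

print(lead_vintage(campaign_members))
```

Because the vintage is computed from data that Salesforce already maintains, there is no junction object to keep in sync, which is exactly why deprecating it was safe.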
How to Build Salesforce Entity Relationship Diagrams for Free
Salesforce provides an out-of-the-box entity diagram for Salesforce administrators to visualize and manage the Salesforce data model. I find it useful for administration but not for sharing and distribution.
Build better, easier-to-visualize ERDs with DataTools Pro: Our desire to build a better ERD for Salesforce led us to create our own. Here are some of the reasons you may want to check out the free diagramming capabilities we offer:
Simpler, minimal design
Exportable to single page document (SVG)
Connected directly to Salesforce
Custom views aligned to business topics and tech modules.
New Azure Copilot Studio custom actions have opened the door for us to connect live, connected Salesforce metric and data dictionaries into the MS Copilot experience. Over the weekend, I jumped into Azure and set up a functioning Azure Copilot, trained on our website data, that is available for you to try out below. A little bit of work and reading landed us in the same place we found ourselves a few weeks ago while testing OpenAI's GPT actions for the first time. In a similar process, I embedded our DataTools Pro app as an action in the time it took to finish a cup of tea.
Unlike OpenAI, Azure OpenAI and now Azure Copilot are designed with the enterprise in mind, with the full suite of Azure services behind them.
Microsoft Copilot Bot Live Demo
This weekend, I dug in and, with only a few clicks, built a co-pilot based on the DataTools Pro website. With a little more work, we were securely connecting in real time to the DataTools Pro API and surfacing Salesforce metrics as context for Copilot to reason about our own business. We will continue to update the live demo below with our DataTools API demo account connected to Salesforce Essential Metrics.
Ask Questions about DataTools
Azure OpenAI and Copilot won’t fix your data
We are in unprecedented times given the speed at which these AI advancements are rolling out and evolving. The real benefits of a Copilot are:
Increasing speed and ease for consuming large bodies of information
Improving the level and depth of understanding (for people who are inquisitive)
Translating and communicating information (text and visual).
While the innovation and art of the possible are very exciting, the sobering reality is that you still need to double down on the same data and metadata management and governance.
The path is clear with Azure AI services.
Microsoft has done an incredible job weaving AI into its existing suite of data services and tools.
In this guide, we will walk you through the process of setting up Tableau Cloud with Salesforce using the latest and greatest native integrations. Tableau Cloud natively integrates with Salesforce for enhanced security and access, as the two clouds have become tightly knit together. In addition to the nuts and bolts, we will focus on key use cases where Tableau can provide valuable insights beyond standard Salesforce reports and dashboards. Tableau's capabilities for deeper analysis, data manipulation, end-user ad-hoc analysis, and access to diverse data sources make it a powerful complement to Salesforce's offerings.
Tableau Cloud Setup
Setting up Tableau Cloud is as simple as signing up and provisioning an account through the online setup form. Once provisioned, you can immediately start connecting and building.
Salesforce SSO for Tableau: Security and Access
Tableau Cloud natively supports Salesforce for user access and authentication. This allows you to extend your user management and access into Tableau so you don't duplicate work.
Simply check “Salesforce” so that when you invite users, they will need to utilize their Salesforce username and password. If you use Multi-Factor Authentication (MFA) with the Salesforce Authenticator app, you do not need to perform any additional configuration for it to work.
Embedding Tableau inside of Salesforce
For Salesforce organizations, Tableau should be a seamless experience that resides side by side with standard Salesforce.com dashboards. To accomplish this goal, we typically utilize the Tableau Lightning component. With Tableau Cloud, you can utilize the “Default Authentication type for Embedded Views”, ensuring a secure and seamless experience for end users.
The best user experience is one that reduces friction. We typically embed dashboards inside of Lightning pages and also make use of tabs to isolate Salesforce dashboards and Tableau dashboards side by side based on topic.
To allow embedding of Tableau inside of Salesforce as of Winter '24, simply go to Setup and enable Tableau embedding.
Tableau for Salesforce Use Cases
Before embarking on a Tableau Salesforce implementation, it's important to understand key use cases where implementing Tableau makes sense above and beyond standard Salesforce reports and dashboards.
Deeper analysis
When we refer to “depth of analysis”, we mean taking a single subject and exploring the history, relationships, and patterns that impact that subject.
For example, if you see that your lead-to-opportunity conversion rate is lower than expected, you may ask questions related to sales rep activity, including speed to lead, number of calls, number of reps to leads, and other ratios. When building Tableau dashboards and supporting reports, you can drill into and explore these relationships over time with greater ease and relate them together.
More flexibility to slice and dice data
Slicing and dicing data in many cases requires analysts or, in the world of Salesforce reports, saving data to Excel. Tableau was born and designed for visual exploration of data, where you can filter, drill, and modify the subject of your analysis with ease.
End User Ad-hoc analysis
Salesforce provides an amazing ad-hoc reporting capability, granting business professionals the power to produce powerful reports. But while the report developer has a full-fledged reporting solution, end consumers of the report are limited to basic filtering. Tableau, on the other hand, provides end-user ad-hoc analysis for changing dimensions, drilling, and constraining information.
Access to more data sources for analysis
Salesforce reports and dashboards are limited to the data available inside of Salesforce. Tableau on the other hand opens the door to connect more data sources with Salesforce.
Connecting Tableau to Salesforce Data
Tableau provides a native Salesforce data connector, allowing direct access to Salesforce data objects. This is quite useful for real-time access to Salesforce data, or for static extracts that harness the full power of Tableau.
Native Salesforce Connector
Unfortunately, the Tableau integration with Salesforce data is imperfect. Using the standard Tableau connector for Salesforce excludes Salesforce formula fields from the results. This limitation has long been logged as an enhancement request but is not obvious.
Working with DateTime Fields
Small variances in metrics can occur when using DateTime fields, because data extractions render in UTC instead of your local time zone.
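A short Python sketch makes the variance concrete. This is an illustration of the general UTC-versus-local-date effect, assuming a Pacific-time reporting calendar: a record closed late in the evening local time lands on the next calendar day in UTC, so any metric grouped by the raw UTC date shifts it into the wrong day or month.

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

# A record closed at 11 PM Pacific on Jan 31 is stored as UTC,
# where the same instant falls on Feb 1.
closed_local = datetime(2024, 1, 31, 23, 0,
                        tzinfo=ZoneInfo("America/Los_Angeles"))
closed_utc = closed_local.astimezone(timezone.utc)

# Grouping by the raw UTC date moves this record into February,
# while local-time reporting counts it in January.
print(closed_local.date())  # local reporting date: 2024-01-31
print(closed_utc.date())    # extract (UTC) date:   2024-02-01
```

Converting DateTime fields back to the reporting time zone before truncating to a date keeps extract-based metrics consistent with what users see inside Salesforce.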
Connecting Tableau to Salesforce Data Cloud
With the recent release of Salesforce Data Cloud, Tableau has a new modern approach to data access that bypasses some of the traditional limitations. We will cover this topic in detail with an upcoming post!