Feeding Data from Facebook Ads to Snowflake

This week, I decided to take my exploration of Snowflake external access a step further and pump data from Facebook Ads into Snowflake. I took a template that I previously used for HubSpot metadata analysis and rolled those learnings into building a Facebook data tool. A little ChatGPT guidance had me up and running within 20 minutes.

Security and Permissions Guidance and Requirements

Before you jump in and start clicking through this tutorial, there are a few things you need in order to run it end to end.

1. Access to a real Facebook Ads account – When you complete the setup, you will want to test and validate against a real Facebook Ads account. The developer account you use will need direct access to a Facebook Ads account.

2. Snowflake Enterprise – You will need a full, registered version of Snowflake Enterprise to use external access and run this stored procedure.

3. Access to Snowflake ACCOUNTADMIN – This process requires an external access integration and a network rule granting access to the Facebook Graph API. Creating those requires ACCOUNTADMIN access, or you can request that your Snowflake admin enable this feature. This tutorial explains all of the components, so you may want to pass it along to your admin with a formal request.

Set up your Facebook App and Acquire a Token

Log in to the Facebook Developer Console and view “My Apps”.

Create a new app in the Facebook Developer Console.

Select “Set up” for “Marketing API” so you can access the APIs that will ultimately deliver your data into Snowflake.

To pull Facebook Ads insights (like impressions, spend, and clicks), you will need the permissions below. In my case, I am only reading and analyzing data, so I just checked the “read” permissions:

  • ads_read → Required for reading ad data
  • read_insights → Required to access ad performance metrics
  • ads_management (optional) → Only needed if you’re modifying campaigns or fetching extra account metadata (not required for read-only insights)

Click “Get Token” and store it in a secure key vault or key management tool.

Set up Snowflake for Facebook Ads data

Log in to your Snowflake org.

Create a table to store your staged data. I called my table AD_INSIGHTS, and it resides within a FACEBOOKADS schema.

-- Create a table to hold the raw Facebook Ads JSON
CREATE OR REPLACE TABLE <<YOURDB>>.FACEBOOKADS.AD_INSIGHTS (
  METADATA VARIANT,
  LAST_UPDATED TIMESTAMP,
  TABLENAME STRING
);

Create a secret to hold your Facebook access token securely. This is the same token you acquired in the steps outlined earlier in this tutorial.

CREATE OR REPLACE SECRET facebook_access_token
  TYPE = GENERIC_STRING
  SECRET_STRING = 'PASTE_YOUR_ACCESS_TOKEN_HERE';

Create a network rule that will allow Snowflake to connect to the Facebook Graph API. This requires elevated permissions in your Snowflake org.

CREATE OR REPLACE NETWORK RULE facebook_api_rule
  MODE = EGRESS
  TYPE = HOST_PORT
  VALUE_LIST = ('graph.facebook.com');

Create an external access integration. This requires elevated permissions in your Snowflake org.

CREATE OR REPLACE EXTERNAL ACCESS INTEGRATION facebook_integration
  ALLOWED_NETWORK_RULES = (facebook_api_rule)
  ALLOWED_AUTHENTICATION_SECRETS = (facebook_access_token)
  ENABLED = TRUE;
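
If you build the rest of the tutorial under a role other than ACCOUNTADMIN, that role will also need access to these objects. A minimal, hedged sketch, using a hypothetical DATA_ENGINEER role:

GRANT USAGE ON INTEGRATION facebook_integration TO ROLE DATA_ENGINEER;  -- let the role reference the integration
GRANT READ ON SECRET facebook_access_token TO ROLE DATA_ENGINEER;       -- let the role read the token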

Create a stored procedure to connect and return raw JSON data to Snowflake

CREATE OR REPLACE PROCEDURE <<YOURDB>>.FACEBOOKADS.FETCH_AD_INSIGHTS(ad_account_id STRING, date_preset STRING)
  RETURNS STRING
  LANGUAGE PYTHON
  RUNTIME_VERSION = '3.10'
  PACKAGES = ('snowflake-snowpark-python', 'requests')
  HANDLER = 'main'
  EXTERNAL_ACCESS_INTEGRATIONS = (facebook_integration)
  SECRETS = (
    'facebook_access_token' = <<YOURDB>>.FACEBOOKADS.FACEBOOK_ACCESS_TOKEN
  )
  EXECUTE AS OWNER
AS
$$
import _snowflake
import requests
import datetime
import json
from snowflake.snowpark.types import VariantType, TimestampType, StringType, StructType, StructField

def main(session, ad_account_id, date_preset):
    # Pull the access token from the secret attached to this procedure
    token = _snowflake.get_generic_secret_string('facebook_access_token')

    # Build the Graph API insights request for the given ad account
    url = (
        f"https://graph.facebook.com/v19.0/act_{ad_account_id}/insights"
        f"?fields=campaign_name,ad_name,impressions,clicks,spend"
        f"&date_preset={date_preset}&access_token={token}"
    )

    response = requests.get(url)
    if response.status_code != 200:
        return f"Error: {response.status_code} - {response.text}"

    # Keep the raw payload and timestamp the load
    raw_json = json.loads(response.text)
    now = datetime.datetime.utcnow()

    schema = StructType([
        StructField("METADATA", VariantType()),
        StructField("LAST_UPDATED", TimestampType()),
        StructField("TABLENAME", StringType())
    ])

    # Append the payload as a single staged row in the landing table
    df = session.create_dataframe([[raw_json, now, ad_account_id]], schema=schema)
    df.write.mode("append").save_as_table("<<YOURDB>>.FACEBOOKADS.AD_INSIGHTS")

    return f"Success: ad insights for account '{ad_account_id}' inserted."
$$;

Obtain an ad account ID that you have access and permissions to:

Execute and request your performance data.

CALL <<YOURDB>>.FACEBOOKADS.FETCH_AD_INSIGHTS('<<YOURADACCOUNTID>>', 'last_7d');

Note: You should NOT prefix your account ID with values like “act_“. The stored procedure pre-fills that prefix.

Your results should load into <<YOURDB>>.FACEBOOKADS.AD_INSIGHTS as JSON.
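
If you want a head start on querying the results, here is a minimal sketch that unpacks the staged JSON with LATERAL FLATTEN, assuming the Graph API response keeps its rows under the top-level “data” array:

-- Unpack one row per ad from the raw JSON payload
SELECT
  f.value:campaign_name::STRING AS campaign_name,
  f.value:ad_name::STRING       AS ad_name,
  f.value:impressions::INT      AS impressions,
  f.value:clicks::INT           AS clicks,
  f.value:spend::FLOAT          AS spend,
  t.LAST_UPDATED
FROM <<YOURDB>>.FACEBOOKADS.AD_INSIGHTS t,
  LATERAL FLATTEN(input => t.METADATA:data) f;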

In the next tutorial on this topic, I will share the SQL I use to parse the ads data, analyze it, and weave performance into my CRM data.

Replace Alteryx Self Service Chaos

At DataTools Pro, we are longtime users of Alteryx and have termed it the ultimate data Swiss Army knife. Replacing Alteryx is not an easy decision for enterprises because skilled Alteryx builders are wildly productive at turning data into information.

The power of Alteryx is the ability to rapidly transform and validate disparate data without writing code. This pattern remains ideal for analysts who struggle to automate complex data workflows in Excel. Data engineers who would typically write code to transform data sometimes lack the business context and experience to judge acceptable validation rules. The symbiotic rise of Tableau allowed Alteryx to thrive as a high-quality “ETL for Analysts” solution.

Alteryx Rocketship: The State of Data in 2010-2020

  1. Data management and business intelligence were centralized but moving toward self service
  2. Analytics turn times were measured in quarters and years
  3. Data and BI teams were severely backlogged and unable to meet demand
  4. Data was spread across Windows file shares and on-premises databases
  5. Large enterprise data warehouses were extremely slow to develop
  6. The rise of self service visualization with Tableau created the perfect symbiotic relationship

No-Code can get in the way of efficiency

A few years ago, while using Alteryx with Snowflake, I found myself leaning on the Alteryx Python tool to handle extreme edge cases where 10-15 nodes could be expressed in a few lines of code. For example, a rolling 60 business day (minus bank holidays) window function is something we created as a UDF in Snowflake, along the lines of the sketch below.
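
A rough, hedged sketch of the idea as a plain query (our production UDF differs), assuming a BANK_HOLIDAYS reference table with a HOLIDAY_DATE column:

-- Find the start of a rolling 60 business day window, skipping weekends
-- and bank holidays (BANK_HOLIDAYS is an assumed reference table)
WITH calendar AS (
  SELECT DATEADD(day, -SEQ4(), CURRENT_DATE) AS d
  FROM TABLE(GENERATOR(ROWCOUNT => 120))        -- look back ~120 calendar days
),
business_days AS (
  SELECT d
  FROM calendar
  WHERE DAYOFWEEKISO(d) <= 5                    -- Monday through Friday
    AND d NOT IN (SELECT HOLIDAY_DATE FROM BANK_HOLIDAYS)
)
SELECT MIN(d) AS window_start
FROM (SELECT d FROM business_days ORDER BY d DESC LIMIT 60);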

Shift from ETL to ELT

As a head of data and analytics, and now as a consultant, using Snowflake has been a game changer. As an enabling technology, it has democratized the data warehouse the same way Alteryx did for no-code ETL 15+ years ago. Now I can pump millions or hundreds of millions of rows into Snowflake at low storage cost, process them, and deliver to any analytics tool securely.

There are many new drag-and-drop, flow-based solutions that have learned from and improved on the ETL tools that came before. When it comes to analytics-focused data flows for Snowflake, Datameer has long been my choice, after discovering them while looking for a low-code solution to handle the transformation layer.

Demystifying Alteryx Flows like Messy Code

A SQL engineer can solve problems with unoptimized, difficult-to-follow code. Similarly, an Alteryx builder can create overly complex flows, or worse, stitch many flows together in ways that can take days to decouple. In 2025, I can take tens of thousands of lines of SQL, pump it into ChatGPT, and immediately demystify, document, and understand what to do next. An Alteryx installation that has grown over time naturally accumulates technical debt.

To demystify Alteryx, we use numerous tools to inventory and understand Alteryx flows.

Why Replace Alteryx with Snowflake Powered Data, Analytics and AI

Cost, complexity and operational risk are the three consistent themes we see for clients looking for alternatives to Alteryx. There are numerous tools in the market for no-code flows that have advanced beyond Alteryx. We are happy to introduce you to them.

If your enterprise’s data strategy calls for using Snowflake as the core data platform for analytics and AI, we highly recommend Datameer!

We are here to help you inventory and plan your migration


DataTools Spotlight: Datameer is our Snowflake Tool of Choice

For this month’s DataTools Spotlight, I wanted to share my long-time favorite Snowflake tool, Datameer. Years ago, I found that Datameer solved my slow Snowflake adoption problem. My team was loaded with requirements but had only one data engineer on staff. Historically, we used Alteryx and Tableau Prep to get by, but extracting data from Snowflake to transform it and insert it back into Snowflake was a pattern that didn’t make sense. Years ago, I shared the story of how slow adoption turned around with Datameer, where we delivered 20 models to production in a couple of months. That resulted in us turning off Tableau Prep and Alteryx.

As that story progressed, times got tough and we had to do more with fewer human resources. Datameer was the only way we could keep up with change management and new requests. Those are not the stories you lead with as a case study, but I can openly share that experience now with my own clients at DataTools Pro.

In 2024, I continued writing about my Snowflake experiences with Datameer. In the last year, Datameer has continued to level up its enterprise-grade features while adding time-saving features for individual contributors.

When I use Datameer for Snowflake

There are three primary scenarios where Datameer makes sense for Snowflake customers.

  1. Accelerating Snowflake adoption and deployment of data models
  2. Empowering analysts on Snowflake while keeping data in Snowflake
  3. Eliminating self-service data prep tech debt

To elaborate on the third point, I have recently run into massive, complex Tableau + Alteryx self-service installations. I love Alteryx and Tableau for data exploration and dashboards as an individual contributor. However, watching enterprises lose control over the volume, velocity, and cost of self-service is painful to see.

In recent years, the pendulum has swung back, as companies have invested in modern data platforms like Databricks, Snowflake, Fabric, and Google Cloud Platform. When it comes to Snowflake, as a practitioner, I haven’t found anything that makes it easier than Datameer to wrangle and prepare data in Snowflake for business consumption. I have started chronicling how to convert Alteryx workflows to Snowflake SQL.

Enterprise Snowflake tool features that matter

From scheduled data delivery via email to publishing fact tables for business intelligence tools like Tableau and Power BI, Datameer provides speed, control, governance, lifecycle management, and cost management in one package.

  • Native Snowflake means data remains in Snowflake
  • A user experience built for speed
  • No-code where you want it, low-code where you need it!
  • Seamless SDLC, code rollback and environment promotion management
  • Generate views or materialize tables
  • Built in data validation and exploration
  • Removes proprietary tool lock-in and points of failure
  • Predictable enterprise pricing

I was happy to be an early adopter of Datameer while working for a small enterprise. Now, I get to use Datameer once again with medium and large enterprises. If you are on Snowflake or making the move to Snowflake and need to accelerate adoption, feel free to reach out, and I’d be happy to give you a walkthrough.


New Azure DataFactory template makes Salesforce to Snowflake Pipelines fast and cost-effective

We built our free Azure DataFactory template to help you build your data pipelines from Salesforce to Snowflake in 5-30 minutes. The value of data is not realized by collecting and moving it; it is realized when you transform it into information. Analytics insights and attributes for automation are the objective, and the reason you invest in a data warehouse like Snowflake. That is why we have built our free data pipeline templates to reduce the level of effort to get your data ready for analysis by up to 90%! View our documentation to learn how.

In 2023, we launched the first version of our template tagging it as a “5 min data lake with Azure DataFactory”. Adoption and feedback led us to close 2024 with an upgraded version of our template alongside our new DataTools doctor service and our revised Snowflake rapid adoption service to help our customers extract value from data faster.

Download our Azure DataFactory Template Now


Why we build Salesforce to Snowflake pipelines with Azure DataFactory

Fast, cheap, and easy rarely happen together in technology, but the Azure team, without a massive marketing blitz or fanfare, created a very solid product for moving data between enterprise data sources, the Microsoft data platform, and Snowflake.

We use Azure DataFactory pipelines to move massive volumes of data daily for customers who have invested in Azure. They save thousands of dollars per month while getting enterprise grade data extraction and migration.

When do we turn to other solutions than DataFactory?

We stick to technology tools that are flexible, practical, and well adopted. We particularly like deploying DataFactory with customers who are running MS SQL or Snowflake in Azure. When it comes to data transformation and ETL patterns, DataFactory does offer a no/low-code Spark-based flow builder. However, depending on customer needs, scale, and team makeup, we do recommend alternative solutions tailored to the existing processes, investments, team makeup, and roadmap. We are always on the lookout for new, streamlined data pipeline solutions.

What’s Next for our Azure DataFactory template?

We are actively working on client projects for MySQL and MS SQL versions of our template. Contact us for more details.

What’s New in our Salesforce to Snowflake Pipelines

This week, we rolled out a long overdue update to our free Azure DataFactory template for extracting Salesforce data into Snowflake. Since 2023, there have been a lot of changes to Azure DataFactory, so this release is both an update and an upgrade.

From our version 1.0.1 release notes:

  • New metadata staging process and table called “SFDC_METADATA_STAGE_TEMP” that feeds SFDC_METADATA_STAGE
  • Support for new field detection and addition (enable append fields)
  • New parameter AppendFields will insert new fields when detected
  • New parameter SnowErrorHandling allows configurable error handling: skip error rows, skip the file, or throw an error
  • New metadata field called “Status” that supports a “Disable” attribute to exclude fields from synchronization
  • Updated Salesforce metadata request that supports compound fields by default, like FirstName, LastName, Street, City, etc.
  • Pre-install check – the end-to-end flow checks for the existence of the metadata object
  • Added status variables that capture the results of each pipeline for easier debugging
  • Schema inserts and updates managed via a merge keyed on the object.field ID (see the sketch after this list)
  • Changed the Salesforce field that populates the Snowflake SFDC_METADATA_STAGE “ID” column from Id to DurableId
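
For illustration, the merge described above follows the standard Snowflake MERGE pattern. A hedged sketch, with hypothetical staging column names (OBJECT_NAME and FIELD_NAME are illustrative, not the template’s actual schema):

-- Upsert staged metadata into the target, keyed on the object.field durable ID
MERGE INTO SFDC_METADATA_STAGE tgt
USING SFDC_METADATA_STAGE_TEMP src
  ON tgt.ID = src.ID
WHEN MATCHED THEN UPDATE SET
  tgt.STATUS = src.STATUS                        -- e.g. carry the Disable flag forward
WHEN NOT MATCHED THEN
  INSERT (ID, OBJECT_NAME, FIELD_NAME, STATUS)   -- new fields detected in Salesforce
  VALUES (src.ID, src.OBJECT_NAME, src.FIELD_NAME, src.STATUS);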

Why taking a SQL Course still matters in a world with AI

This week, Salesforce Ben released a new SQL course for Salesforce that introduces a SQL learning path for professionals who work in Salesforce. My goal for the course was to provide technical training from the perspective that data literacy and translating business questions are the drivers for writing SQL. In the course, I lean into LLMs, specifically ChatGPT, and even introduce how I use ChatGPT to assist with debugging. At the end, anyone who takes the course will know their way around Snowflake and have a lab built for funnel analytics.

Large Language Models (LLMs) make it easier than ever to write SQL and Python. Some have made bold claims that learning how to code won’t be necessary in the future. Despite advances in LLMs, SQL remains a vital skill in the data-driven world.

Writing SQL without LLM

  • Lots of time consumed troubleshooting and debugging SQL
  • Reverse engineering others’ SQL
  • Manually typing documentation
  • Understanding how functions work
  • Formatting data to use in expressions
  • Understanding data structure and meta data

Using LLMs while Writing SQL

  • Paste your code and the error and let LLMs point out syntax issues or how to correct errors
  • Break down and explain SQL structure and purpose
  • Auto-document SQL
  • Relate functions to your existing knowledge
  • Auto-prepare expressions with correct syntax
  • Explain meta data structure

What you get out of the SQL course for Salesforce

1. General understanding – Fundamental SQL skills

AI tools like LLMs can write SQL queries, but without a solid grasp of SQL fundamentals, it’s challenging to evaluate or optimize those queries effectively. SQL is more than just a query language; it’s about understanding how data is structured, how relationships are built, and how to extract meaningful insights from databases. SQL gives you the foundation to translate questions into queries and ensures that you’re not just a passive consumer of AI-generated code.
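
As a tiny illustration of translating a question into a query (“what does our funnel look like this quarter?”), here is a hedged sketch assuming a Salesforce OPPORTUNITY table synced to Snowflake with STAGENAME, AMOUNT, and CREATEDDATE columns:

-- Count opportunities and total pipeline by stage for the current quarter
SELECT
  STAGENAME,
  COUNT(*)    AS opportunity_count,
  SUM(AMOUNT) AS pipeline_amount
FROM OPPORTUNITY
WHERE CREATEDDATE >= DATE_TRUNC('quarter', CURRENT_DATE)
GROUP BY STAGENAME
ORDER BY pipeline_amount DESC;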

2. Contextual Awareness

While LLMs are powerful, they might not fully grasp the nuances of your specific database environment or the business rules that govern your data. Learning SQL allows you to tailor queries to your unique context, ensuring the results are accurate and aligned with your business needs. This contextual understanding is something that AI, despite its advancements, can’t fully replicate.

3. Collaboration with Data Teams

SQL acts as a common language in the data world, bridging the gap between business professionals and technical teams. When you understand SQL, you can communicate more effectively with data engineers, analysts, and other stakeholders. Understanding the data structures needed for analytics also increases your awareness as you alter the Salesforce data model. At the end of the day, having SQL in your toolkit makes you a more valuable contributor.

4. Troubleshooting and Optimization

Even the best AI tools can generate inefficient queries that may impact system performance. By learning SQL, you gain the ability to troubleshoot, optimize, and refine these queries, ensuring they run efficiently and deliver the desired results.

5. Future-Proofing Your Career

SQL skills continue to be in high demand, with job opportunities in this field projected to grow significantly over the next decade. As Data Cloud takes off, employers will value SQL proficiency, as it’s a core skill for Data Cloud related roles where you need to “bring your own data warehouse.”

More about SQL course for Salesforce

What You’ll Learn:

  • Data Query Language (DQL): Focus on querying and analyzing data.
  • Salesforce Integration: Learn how SQL concepts align with Salesforce SOQL.
  • Practical Skills: Hands-on exercises to build familiarity and proficiency.

Exciting New Native Salesforce Snowflake Integration

When it comes to optimizing your business processes and data analytics, Salesforce and Snowflake stand as two potent platforms, each with its own ecosystem of developers, stakeholders, and users. The Salesforce Snowflake Integration is an essential conduit that amplifies the bond between these two cloud platforms.

Native Salesforce Snowflake Integration: A Milestone in Native Data Sharing

Earlier this week, Salesforce and Snowflake made a groundbreaking announcement: the general availability of native Salesforce data sharing for Snowflake, via what is colloquially referred to as “BYOL” (Bring Your Own Lake). This is a significant advancement, especially for Snowflake users familiar with the benefits of zero-copy sharing, a core Snowflake feature. With this integration, gone are the days when you needed layers of additional software, services, and complex processes to bridge the two platforms. This is where the Salesforce Snowflake Connector comes into play, simplifying data access and queries between Salesforce and Snowflake.

Skill Enhancement through Certification Paths

Salesforce Data Cloud serves as a data hub orchestrating a wide range of business activities, be it CRM, marketing, or any web/mobile activity. To encourage this, Salesforce recently launched its Certified Data Cloud Consultant learning path, which will help Salesforce organizations readily find skilled professionals adept in Salesforce Snowflake integration.

Salesforce Runs on Snowflake: Following the Leader

In a revelation that should add credibility and assurance to the Salesforce Snowflake Integration, Salesforce’s internal data and analytics have migrated to run on Snowflake. This shows Salesforce is not just advocating for the technology but using it themselves, setting the stage for rapid advancements in Salesforce and Snowflake connectivity.

Transforming AI/ML Workloads

The Salesforce and Snowflake partnership holds tremendous promise for accelerating the time-to-value from your Salesforce data assets. From curating data to deploying ML models, the integration, facilitated by the Salesforce Snowflake Connector, will enable enterprises to leverage their data in novel ways, including the utilization of advanced AI features. There are many first- and third-party solutions to weave into your model deployment efforts.

Need Help Navigating these Waters?

We have been in front of Salesforce and Snowflake, integrating analytics apps, for years. We recently wrote the Salesforce data synchronization to Snowflake guide and can’t wait to extend it into Data Cloud. We have an incredible partner network that can help you implement any Salesforce or Snowflake cloud components (CDP, Marketing Cloud, Tableau).

Schedule a meeting to learn more

Unlocking the Full Potential of MLOps with Snowflake and Predactica

The Evolution of Machine Learning Platforms

Snowflake has rapidly emerged as the go-to platform for data-centric enterprises. Its ability to centralize and harmonize diverse data types makes it an exceptional foundation for any data strategy. Machine Learning Operations (MLOps) have matured significantly over the years, thanks to Platform as a Service providers like Amazon, Google, and Microsoft, who have developed comprehensive solutions that span the entire model lifecycle. However, a plethora of platforms and tools exist in a very crowded and confusing marketplace. So, what should a customer with Snowflake, a data and analytics team, and a desire to get models to production quickly and practically do?

Predactica: A Glimpse of the Future of MLOps in a Snowflake-Centric World

One name that should be on your list is Predactica. This innovative solution is engineered to fit seamlessly with Snowflake. Predactica elevates MLOps by offering a natively integrated, end-to-end machine learning solution within Snowflake. Unlike other platforms that require disparate workflows and additional data pipelines, Predactica unifies these operations, making it the ideal companion for Snowflake-centric enterprises.

The result is a unified, agile, and compliant system that dramatically reduces the time-to-market for new models while ensuring their long-term reliability. Risk modelers and data scientists can now focus on the nuances of data, feature engineering, explanation, and fine-tuning.

Snowflake as Center of Gravity for Enterprise Data

With the introduction of Snowpark, Snowflake has also paved the way for native model deployment, allowing organizations to manage the entire model lifecycle within the data platform. This is done using the same tools, Python libraries, and workflows that data scientists, data engineers, and DevOps professionals already use. However, the rapid evolution of MLOps calls for a more streamlined, low-code solution that can natively integrate with Snowflake. This is where Predactica comes in, to complement or potentially replace external ML platforms and extend aspects of your MLOps to more contributors.

The Competitive Edge: Agility, Compliance, and Real-Time Monitoring

Another often-overlooked aspect of the machine learning lifecycle is monitoring model performance over time. Models, especially in credit risk, are not “set and forget.” They require ongoing attention to ensure they do not degrade and continue to make accurate predictions as market conditions and customer behaviors change. Predactica addresses this crucial need by offering built-in performance monitoring features. These tools enable teams to catch performance drift early, allowing for timely model adjustments and ensuring that your decision-making remains both agile and accurate.

Conclusion

The collaboration between Snowflake and Predactica represents a leap forward for organizations looking to democratize model development and accelerate speed to value. Don’t take our word for it: set up a meeting or sign up for a trial and let us know what you think! Sign up for a Predactica Trial

Ultimate Salesforce and Snowflake Guide on Salesforce Ben

This week, Ryan released a guide for Salesforce and Snowflake on Salesforce Ben, the leading independent Salesforce.com community and authority on all things Salesforce.com.

Snowflake and Salesforce are a perfect marriage of cloud business applications and cloud data platform, turning data into information. Salesforce has built a powerful first-class integration within Salesforce Data Cloud that is the most advanced of any third-party connectivity.

If you are currently using Salesforce Data Cloud or Salesforce Tableau CRM, this article is for you. Additionally, while at Snowflake Summit 2023, we saw some incredible Salesforce Data Cloud enhancements for Snowflake that will be game changing for enterprise customers.

We can’t wait to write about the upcoming zero-copy feature from Salesforce to Snowflake. Included in our article are step-by-step tutorials on how to integrate Salesforce with Snowflake today. Should you have any questions about how these capabilities apply to your enterprise or how Snowflake can advance your Salesforce analytics, we are here to help!

Snowflake and Microsoft Expand their Data and AI Partnership

Snowflake and Microsoft announced at Snowflake Summit 2023 that they are expanding their partnership, promising substantial advancements for data scientists and developers. This enhanced collaboration is set to seamlessly merge Snowflake’s Data Cloud with Microsoft’s Azure ML, extending its capabilities through the potent combination of Azure OpenAI and Microsoft Cognitive Services.

This strategic alliance means that Snowflake and Microsoft Azure shared customers will gain access to the cutting-edge frameworks of Azure ML, a streamlined process for machine learning development right through to production, along with integrated continuous integration and continuous deployment (CI/CD) processes.

But this partnership doesn’t stop there. Snowflake is setting its sights on creating even more meaningful integrations with a host of Microsoft offerings, aiming to elevate the user experience even further. These plans include closer ties with Purview for advanced data governance, Power Apps & Automate for simplified, low code/no code application development, Azure Data Factory for efficient ELT processes, and Power BI for intuitive data visualization, among others.

The end goal? To foster a seamless ecosystem that capitalizes on the synergies between Snowflake and Microsoft’s product suites, unlocking new possibilities and delivering unparalleled value to users.

At DataTools Pro, we couldn’t be more excited to see our favorite data platform, Snowflake, gain new enhancements that make data management easier. Azure balances powerful data management with scalable cost that makes sense for our clients. Additionally, Power BI continues to advance its dominance in business intelligence. We have been working with Snowflake and Microsoft together for years and have built a toolkit that can help you jumpstart Snowflake and Azure integration.

Learn how to use Azure Data Factory and Snowflake Together

We have created free interactive step by step tutorials to help you get started!

Create a Snowflake Data Source in Azure Data Factory

Create a Data Pipeline to Connect Salesforce to Snowflake

Publish your ADF Pipeline, Data Sets, and Triggers

Create an ADF Scheduled Trigger

VIEW ALL TUTORIALS
