Snowflake Cortex Search vs. Custom RAG

Choosing the Right Approach for Enterprise AI

Data & AI Strategy
AI & Machine Learning
Forecasting & Prediction
AI Agents & Chatbots
Snowflake

Enterprise adoption of AI is moving quickly, but leaders face a critical question: how do we ground large language models (LLMs) in enterprise data while keeping solutions scalable, accurate, and cost-effective?

Snowflake’s Cortex Search service and custom Retrieval-Augmented Generation (RAG) pipelines represent two different approaches to solving that challenge. Understanding how they work—and when each is the right fit—is essential for any organization investing in enterprise-ready AI.

The debate begins with a shared starting point: Retrieval-Augmented Generation (RAG) enables LLMs to use enterprise data effectively.

What is RAG?

Retrieval-Augmented Generation (RAG) is the process of giving an LLM access to relevant, external information so it can answer queries more accurately.

The typical RAG workflow looks like this:

The value of RAG is that it allows models of any size to deliver high-quality, context-aware answers—whether it’s the latest company policy, current product details, or niche industry knowledge.
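
To make the workflow concrete, here is a minimal sketch of the retrieve-then-generate pattern in Python. The embed, vector_search, and generate helpers are hypothetical placeholders for whichever embedding model, vector index, and LLM you use; this illustrates the pattern, not a specific product's API.

```python
from typing import List

def embed(text: str) -> List[float]:
    """Hypothetical helper: return an embedding vector for `text`
    using whichever embedding model you have chosen."""
    raise NotImplementedError

def vector_search(query_vector: List[float], top_k: int = 5) -> List[str]:
    """Hypothetical helper: return the top_k most similar document
    chunks from a vector index built over your enterprise data."""
    raise NotImplementedError

def generate(prompt: str) -> str:
    """Hypothetical helper: call an LLM with the assembled prompt."""
    raise NotImplementedError

def answer_with_rag(question: str) -> str:
    # 1. Embed the user's question.
    query_vector = embed(question)
    # 2. Retrieve the most relevant chunks of enterprise data.
    chunks = vector_search(query_vector, top_k=5)
    # 3. Augment the prompt with the retrieved context.
    context = "\n\n".join(chunks)
    prompt = (
        "Answer the question using only the context below.\n\n"
        f"Context:\n{context}\n\nQuestion: {question}"
    )
    # 4. Generate a grounded answer.
    return generate(prompt)
```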

Why RAG Matters Now

Enterprise AI adoption is accelerating, but models alone are not enough. RAG has become essential because it:

In other words, RAG is the bridge between broad LLM capability and business-specific intelligence.

The Rise of Enterprise-Ready AI 

Snowflake Cortex arrives at a moment when enterprise AI adoption is shifting from experimentation to scale. According to Gartner’s Emerging Tech Impact Radar: Generative AI (2025), one trend stands out that will fundamentally change how enterprises adopt and operationalize AI:

AI Marketplaces Will Reshape How Enterprises Buy AI.

By 2028, 40% of enterprise purchases of AI assets—models, training data, and tools—will be made through AI marketplaces, up from less than 5% in 2024. This shift will make AI assets more accessible, but it also introduces new questions: 

As enterprises weigh these decisions, the opportunity is clear: managed services like Cortex Search make it faster than ever to get started, but selecting the right approach—and understanding the tradeoffs—remains critical to long-term success.

The Big Question

With both Cortex Search and custom RAG pipelines available, leaders face a critical decision:
When should you use Cortex Search, and when does a custom RAG pipeline make more sense?

The Case for Cortex Search

For many enterprises already running on Snowflake, Cortex Search offers the fastest path to Retrieval-Augmented Generation. It delivers a “batteries included” experience with embedding, chunking, document parsing, and auto-updates handled natively inside the Snowflake Data Cloud.

Strengths

Best Fit For

Cortex Search is best suited for teams that:

Key Considerations

When evaluating Cortex Search, organizations should consider:

These considerations highlight the importance of aligning the right approach to the right use case, helping organizations get the most out of Cortex Search today and in the future.

The Case for Custom RAG

While Cortex Search is designed to cover a wide range of enterprise use cases, some organizations encounter requirements that go beyond its current scope. In those cases, a custom RAG architecture may be the right fit.

Strengths

Best Fit For

Custom RAG is best suited for enterprises that:

Tradeoffs

The tradeoff for flexibility is complexity. Custom RAG requires:

Choosing the Right Approach

Enterprises evaluating Retrieval-Augmented Generation often face a decision between Cortex Search, Snowflake’s managed turnkey option, and a Custom RAG architecture built for flexibility and control. 

The right choice depends on the end goal: aligning the approach to the use case ensures organizations get the most value from their investment.

This comparison can be viewed from two angles: feature differences such as setup, scaling, and control, and evaluation criteria that guide leaders in choosing the best fit for their priorities.

Cortex Search vs. Custom RAG

Setup, scale, and control at a glance.

A Hybrid Approach

For many enterprises, the best path is not either/or, but both. Cortex Search provides a fast, Snowflake-native way to launch retrieval-augmented applications with minimal setup. As needs grow — more data types, domain-specific performance, or advanced retrieval strategies — a custom RAG architecture can extend those foundations without starting over.

Aligning Approach to Use Case

The key is alignment: matching the approach to the use case. Whether the priority is speed, scale, or specialization, organizations can maximize value by choosing the right starting point and planning for future flexibility.

The Path Forward

Every enterprise’s journey with AI looks different. Whether you start with Cortex Search, scale with Custom RAG, or combine both, the key is choosing an approach that aligns to your business goals. That’s where OneSix comes in.

Expert Guidance, Real-World Implementation

We help enterprises choose the right AI approach and make it real. Our team of senior engineers and PhD-trained scientists designs, deploys, and scales solutions that deliver impact fast.

Contact Us
Written by

Osman Shawkat, Senior ML Scientist

Published

September 22, 2025

Beyond the Prompt: Why Your RAG System May Be Underperforming

Data & AI Strategy
AI & Machine Learning
Forecasting & Prediction
AI Agents & Chatbots

Faced with the question “What is the capital of the Netherlands?” you have a few possible responses:

1. Answer confidently, if you know it.

2. Look it up, if uncertain.

3. Take a guess, which might be wrong.


Large Language Models (LLMs) face the same challenge. They excel when a question falls inside their training data, but when it doesn’t, they may “hallucinate,” producing an answer that sounds plausible but is wrong. 

The key difference is that LLMs don’t have direct access to your enterprise data or knowledge bases without additional retrieval methods. That’s where Retrieval-Augmented Generation (RAG) comes in.

RAG in a Nutshell

RAG is the process of giving an LLM access to relevant, external information so it can answer queries more accurately. The typical RAG workflow looks like this:

The value of RAG is that it allows models of any size to deliver high-quality, context-aware answers, whether it’s the latest company policy, current product details, or niche industry knowledge. But RAG doesn’t operate in isolation. For RAG to deliver consistently, it needs to be part of a well-designed information environment, also known as context engineering.

 

The Shift from Prompt to Context Engineering

In the early days, “prompt engineering” was the art of crafting the right wording to get the right answer. But as AI systems have grown more complex, the industry has realized that the quality of context matters more than the cleverness of the prompt.

Context engineering builds the full information environment around the LLM, not just the immediate instruction, but also system settings, past conversation history, retrieved documents, tools, and output formats.


Prompt Engineering: shaping single-turn prompts for answers.

Context Engineering: shaping the full context for multi-step tasks.


RAG is a critical part of context engineering, ensuring that the model’s “world” includes the exact information needed for the task.
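
In code, context engineering often reduces to deliberately assembling those layers into a single model input. The sketch below is a simplified, hypothetical assembly step; the layer names and trimming rules are illustrative choices, not a particular framework's API.

```python
def build_context(system_instructions: str,
                  conversation_history: list[str],
                  retrieved_docs: list[str],
                  tools_description: str,
                  user_message: str) -> str:
    """Assemble the model's full "world" for one turn: system settings,
    past conversation, retrieved documents, available tools, and the
    immediate instruction."""
    history = "\n".join(conversation_history[-10:])    # keep only recent turns
    documents = "\n\n".join(retrieved_docs)            # output of the RAG step
    return (
        f"[System]\n{system_instructions}\n\n"
        f"[Available tools]\n{tools_description}\n\n"
        f"[Conversation so far]\n{history}\n\n"
        f"[Retrieved context]\n{documents}\n\n"
        f"[User]\n{user_message}"
    )
```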

It’s Not Your RAG, It’s Your Context

In real-world deployments, many RAG systems disappoint, and the issue is almost never the model. It’s bad context engineering. Common pitfalls include:

Imagine an AI system reviewing legal contracts that confidently reports a key clause is missing. In reality, the clause exists, but the retrieval process never pulled it into the model’s context. This kind of gap shows why careful retrieval design is essential.

Engineering Retrieval for Success

Preventing these failures starts with designing retrieval around the business use case:

Done well, RAG produces grounded, fresh, scalable, and personalized AI outputs. But in many real-world environments, not all the information you need is text. From images and videos to audio clips and charts, handling different content formats introduces new retrieval challenges — and that’s where multi-modal context comes in.

Handling Multi-Modal Context

Most embedding models are optimized for a single type of data, and text models usually outperform others. Multi-modal embeddings (for example, image plus text models) often underdeliver in production.

A surprisingly effective solution is to convert all content to text before retrieval.

For example:

By indexing text representations, retrieval accuracy for non-text content improves dramatically.
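
As a rough sketch of this convert-to-text strategy, the helpers below are hypothetical stand-ins for a captioning model and a speech-to-text service; the point is simply that every asset is reduced to text before it reaches the embedding and retrieval pipeline.

```python
def caption_image(path: str) -> str:
    """Hypothetical helper: produce a detailed text description of an image
    (for example, via a vision-language model)."""
    raise NotImplementedError

def transcribe_audio(path: str) -> str:
    """Hypothetical helper: produce a transcript of an audio clip
    (for example, via a speech-to-text service)."""
    raise NotImplementedError

def to_text(path: str) -> str:
    """Convert any supported asset to a text representation before indexing,
    so a single text embedding model can handle retrieval for all of it."""
    if path.endswith((".png", ".jpg", ".jpeg")):
        return caption_image(path)
    if path.endswith((".mp3", ".wav")):
        return transcribe_audio(path)
    with open(path, encoding="utf-8") as f:    # plain text, markdown, etc.
        return f.read()

# Index the text representations with the usual text embedding pipeline:
# documents = [to_text(p) for p in asset_paths]
```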

RAG in the Real World

OneSix built an AI-powered chatbot for a higher education client to help students get answers faster.


By applying RAG, the chatbot summarized thousands of unstructured documents, giving students accurate answers instantly and helping the university better serve its community.

Real-world RAG success comes from context engineering, feeding models the right information to deliver accurate, reliable, business-ready answers.

Ready to unlock the full potential of RAG?

At OneSix, we design and deploy Retrieval-Augmented Generation systems built for the real world. We engineer context, optimize retrieval, and integrate AI into your workflows—so your models deliver accurate, reliable, measurable results.


Let’s talk about how we can turn your AI ideas into measurable results.

Contact Us
Co-written by

Matt Altberg, Lead ML Engineer
Francisco Gonzalez, Sr. Architect

Published

August 19, 2025

Behind the Booth: How We Delivered a Live AI Experience at Snowflake Summit

Written by

Ryan Lewis, Sr. Lead Consultant

Published

June 13, 2025

AI & Machine Learning
Sigma
Snowflake

We wanted to do more than just hand out stickers at this year’s Snowflake Summit. We wanted to create an experience—something fun, immersive, and hands-on that brought the power of AI, Snowflake, and real-time analytics to life.

So we built a two-part AI solution that combined facial recognition, mood detection, and instant dashboarding. The result? A booth where attendees could get a custom AI-generated caricature and learn how advanced tools like Snowflake, Landing AI, and Sigma come together to power intelligent experiences.

The Big Idea: Fun Meets Function

We set out to show what’s possible when modern data and AI tools are thoughtfully combined. Our booth experience worked like this:

1. Snap a Photo: Attendees had their photo taken at the booth.

2. Generate a Caricature: The image was sent to an AI service to create a playful caricature.

3. Analyze Your Mood: At the same time, the photo was sent to a custom mood detection model powered by LandingLens (via Snowflake) to classify your expression.

4. See the Results in Real Time: Your mood data was logged in Snowflake and instantly visualized on a Sigma dashboard alongside aggregated visitor insights.

5. Take Home the Fun: Every participant got a printed caricature to keep—and a great story to share on LinkedIn.

Behind the Scenes: How It All Worked

Data Collection & Processing

We started by gathering internal data—photos of the OneSix team acting out specific moods. These images were passed through AWS Rekognition to crop the headshots and stored in an S3 bucket.

But to train a meaningful model, we needed more. We augmented our dataset with hand-picked stock photos and even generated synthetic headshots using GPT-image-1 to ensure variety and balance across moods.
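
For illustration, the cropping step described above might look roughly like the sketch below, using boto3's Rekognition client and Pillow. The bucket name and keys are placeholders, and a production version would add error handling and batching.

```python
import io
import boto3
from PIL import Image

rekognition = boto3.client("rekognition")
s3 = boto3.client("s3")

def crop_and_store_headshot(image_bytes: bytes, output_key: str,
                            bucket: str = "example-mood-photos") -> None:
    """Detect the largest face in a photo, crop to its bounding box,
    and store the headshot in S3 for later labeling and training."""
    faces = rekognition.detect_faces(Image={"Bytes": image_bytes})["FaceDetails"]
    if not faces:
        return  # no face found; skip this photo

    # Rekognition returns bounding boxes as fractions of the image dimensions.
    box = max(faces, key=lambda f: f["BoundingBox"]["Width"])["BoundingBox"]
    img = Image.open(io.BytesIO(image_bytes))
    w, h = img.size
    left, top = int(box["Left"] * w), int(box["Top"] * h)
    right, bottom = int(left + box["Width"] * w), int(top + box["Height"] * h)
    headshot = img.crop((left, top, right, bottom)).convert("RGB")

    buf = io.BytesIO()
    headshot.save(buf, format="JPEG")
    s3.put_object(Bucket=bucket, Key=output_key, Body=buf.getvalue())
```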

Model Training in LandingLens

Training our mood detection model involved:

LandingLens made it easy to go from raw images to a trained model with its low-code interface and tight Snowflake integration.

Real-Time Inference & Dashboarding

When an attendee took a photo:

The entire process happened in seconds. One moment you’re smiling at the camera, the next you’re a dot on a live mood chart.

Why We Chose These Tools

Snowflake

Landing AI's LandingLens

Sigma

The Takeaway

This wasn’t just a gimmick. It was a live demonstration of what’s possible when you combine the right technologies with a little creativity.

At OneSix, we believe data and AI should feel approachable and practical. This booth experience proved that smart, scalable, and governed AI applications don’t have to be intimidating—or boring. Thanks to everyone who stopped by to participate. We hope you had as much fun as we did. See you next year!

AI-Ready. Future-Focused.
Powered by Snowflake + OneSix.

Whether you’re just starting or scaling, OneSix helps you build intelligent data solutions with Snowflake at its core. We bring the strategy, architecture, and AI expertise to turn your data into real business outcomes.

Click Here

AI-Driven and Privacy-First: How Snowflake Powers Modern Marketing

Written by

Jacob Zweig, Managing Director

Published

May 7, 2025

AI & Machine Learning
AI-Driven Marketing
Snowflake

The rules of marketing are changing fast. AI is raising the bar for personalization. Privacy regulations are reshaping how you work with customer data. And siloed data stacks? They’re quickly becoming a thing of the past. That’s why OneSix partners with Snowflake — to help marketing teams not only keep pace, but lead.

We bring together AI, machine learning, and Snowflake’s AI Data Cloud to help companies build smarter, faster, and more personalized marketing strategies rooted in trusted, connected data. Together, we give you the tools to act on insights, optimize spend, and elevate performance — all while protecting customer privacy and scaling with confidence.

5 Trends Reshaping Marketing

Snowflake’s Modern Marketing Data Stack 2025 report highlights five major forces reshaping the landscape. But to succeed, companies need more than ideas — they need a foundation built for agility, intelligence, and security. That’s where Snowflake shines — and where OneSix delivers real business value.

Trend 1

The Rise of the Data-Empowered Marketer

In 2025, marketers are no longer waiting on technical teams to access insights. Thanks to advancements in AI and natural language querying, they can now explore data directly, make decisions faster, and create more personalized experiences.

At OneSix, we help marketing teams take full advantage of this shift with personalization at scale, deploying models like Lifetime Value (LTV), churn prediction, and Next Best Action to deliver immediate, actionable insights that drive real-time personalization.

Trend 2

Sophisticated, Data-Connected Applications

Marketing applications are evolving to connect directly to unified data platforms rather than relying on fragmented, siloed subsets. This new model boosts security, strengthens governance, and unlocks deeper insights across every customer interaction.

OneSix designs marketing ecosystems where applications work seamlessly with unified customer data, creating the foundation for truly cohesive, omnichannel engagement.

Trend 3

Old and New Measurement Strategies

As third-party cookies become less reliable, marketers are turning to a blend of classic and modern measurement tools, including Marketing Mix Modeling (MMM) and secure Data Clean Rooms. These approaches offer a privacy-first way to measure performance and optimize budget allocation.

Through our Marketing ROI Measurement services, OneSix helps companies implement next-generation MMM models and privacy-centric attribution frameworks to deliver actionable clarity on campaign effectiveness.

Trend 4

The Increased Value of First-Party Data

In a privacy-first world, first-party data has become marketing’s most valuable asset. Brands that successfully capture, enrich, and activate their own customer data are gaining undeniable competitive advantages.

At OneSix, we empower organizations to build rich, scalable first-party data ecosystems—leveraging clustering, segmentation, and predictive modeling to unlock smarter acquisition, retention, and personalization strategies.

Trend 5

The Rise of Commerce Media

More brands are transforming into media platforms themselves, monetizing their first-party data by creating targeted advertising ecosystems. Whether in retail, travel, telecom, or beyond, this shift to commerce media opens new revenue streams and deepens customer engagement.

OneSix supports brands in navigating this opportunity with our high-value audience acquisition solutions, helping companies not only identify and convert high-value customers but also build monetizable audience strategies through predictive insights and lookalike modeling.

Snowflake: The Core of Modern Marketing

To fully take advantage of these emerging marketing trends, companies need more than ambition — they need the right foundation. As Snowflake highlights, building a future-ready marketing stack means embracing platforms that are connected, composable, and AI-powered.

At the heart of this transformation are key capabilities and technologies that define the modern marketing data stack:

Unified, AI-Ready Data Platform

Snowflake's AI Data Cloud eliminates data silos by centralizing customer, campaign, and sales data in a single, governed environment. This "single source of truth" unlocks faster personalization, smarter segmentation, and more efficient optimizations.

Advanced AI and ML Services

With Snowflake Cortex, marketers can easily tap into pre-built machine learning models and generative AI capabilities for tasks like customer segmentation, predictive scoring, and automated personalization — without needing deep technical expertise.

Privacy-First Collaboration

Snowflake's Data Clean Rooms enable secure data collaboration with partners and media platforms, allowing companies to measure campaign performance and enrich audience insights while fully preserving user privacy.

Third-Party Enrichment

Through the Snowflake Marketplace, marketers can access hundreds of third-party data sources — from demographic and intent data to purchase behavior — enriching their own first-party data without complex integrations.

Identity Resolution and Enrichment

Marketers can tie together fragmented user profiles and anonymous interactions using Snowflake-native identity resolution tools, making it easier to create a truly holistic view of the customer journey.

Governance and Security Built In

Snowflake ensures data security, governance, and compliance at every layer, helping marketing teams maintain customer trust while deploying increasingly sophisticated personalization strategies.

At OneSix, we help companies not just implement these Snowflake capabilities — but also design strategies around them. We build AI-driven marketing ecosystems that are fueled by unified data, automated by intelligent models, and powered by real-time insights — so you can deliver the right message, to the right customer, at exactly the right time.

Whether you’re ready to deploy Snowflake Cortex for predictive engagement, leverage Data Clean Rooms for collaborative attribution, or enrich your segmentation strategies through Snowflake Marketplace, OneSix can accelerate your path to marketing success.

Future-Proof Your Marketing Strategy

At OneSix, we don’t just implement Snowflake — we create custom, AI-powered marketing strategies built on it. From real-time personalization to privacy-first measurement, we’re here to help you lead with data, act with intelligence, and scale with confidence. Let’s reimagine your marketing strategy together.

Contact Us

Marketing Spend Optimization: Why AI Is the Key to Higher ROI

Written by

Jacob Zweig, Managing Director

Published

April 16, 2025

AI & Machine Learning
AI-Driven Marketing

With marketing efforts spread across countless channels, each dollar spent—and each customer touchpoint—has greater impact and complexity.

Unfortunately, many brands still rely on outdated marketing models: last-click attribution, rigid budget plans, and disconnected reporting systems. These traditional approaches can’t capture the full story, leading to missed opportunities and wasted spend.

It’s time to move beyond guesswork. With the rise of AI-powered tools like Multi-Touch Attribution (MTA) and Media Mix Modeling (MMM), brands can now track the complete customer journey, attribute value across every channel, and continuously optimize their marketing strategy in real time.

In this post, we’ll explore how AI is reshaping marketing strategy—from smarter budget allocation to advanced attribution models—and how OneSix can help you turn insights into impact.

200% increase in return on ad spend (ROAS)

15% increase in sales

Why Traditional Marketing Strategies Fall Short

Many marketing teams still rely on legacy models—last-click attribution, manual reporting, and siloed channel analysis. These outdated methods make it nearly impossible to understand the full customer journey or justify budget allocation decisions.

In a world where customers interact with brands across multiple devices, platforms, and stages of decision-making, traditional marketing approaches simply can’t keep up.

AI-Driven Budget Allocation Optimization

AI models can analyze historical performance, campaign goals, and channel effectiveness to recommend how to allocate your marketing budget across platforms like Google Ads, social media, email, and display. Instead of relying on static budgets set months in advance, AI enables dynamic, responsive decision-making—so you’re always investing where it counts.

Multi-Touch Attribution (MTA)

See the full picture of the customer journey.

Understanding the effectiveness of your marketing efforts is no small feat—especially when customer journeys span a wide array of online and offline channels. That’s where Multi-Touch Attribution (MTA) comes in.

MTA is a powerful framework that helps marketers understand how different touchpoints—like social media ads, search campaigns, email marketing, and website visits—contribute to a customer’s decision to buy or engage. Unlike basic models that assign all the credit to the first or last interaction, MTA assigns value to multiple touchpoints across the journey, providing a more accurate, data-informed view of marketing performance.

Traditional Attribution Models: A Limited View

Before diving into advanced techniques, it’s helpful to understand where many marketers start:

While easy to implement, these models often produce incomplete or misleading insights, especially when trying to optimize spend across diverse marketing channels.

Modern MTA Models: Deep Learning for Deeper Insight

As marketing channels become more complex and customer journeys more fragmented, modern AI-driven models are filling the gap. Advanced MTA approaches—like LSTM networks, Transformers, and Temporal Convolutional Networks (TCNs)—can model sequential customer behavior, learn from historical data, and accurately assign value to each touchpoint.

LSTM-Based Attribution

Long Short-Term Memory (LSTM) networks are a type of recurrent neural network (RNN) ideal for analyzing sequences. They can process long customer journeys, understand the timing and order of interactions, and identify which touchpoints had the greatest influence on a conversion. By calculating gradients (i.e., how much a small change in one touchpoint affects the outcome), LSTM models can attribute precise credit to each step along the way.
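
A minimal PyTorch sketch of this idea is shown below, assuming each customer journey is encoded as a fixed-length sequence of touchpoint feature vectors. The shapes, the untrained model, and the gradient-based attribution step are all illustrative, not a production attribution pipeline.

```python
import torch
import torch.nn as nn

class LSTMAttribution(nn.Module):
    """Predict conversion probability from a sequence of touchpoints."""
    def __init__(self, n_features: int, hidden: int = 32):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, journeys):                      # (batch, steps, n_features)
        out, _ = self.lstm(journeys)
        return torch.sigmoid(self.head(out[:, -1]))   # P(conversion) per journey

model = LSTMAttribution(n_features=8)

# After training on historical journeys, attribute credit via input gradients:
journeys = torch.randn(16, 10, 8, requires_grad=True)   # illustrative batch
conversion_prob = model(journeys).sum()
conversion_prob.backward()

# Larger gradient magnitude at a step means that touchpoint moved the prediction more.
credit_per_touchpoint = journeys.grad.abs().sum(dim=-1)  # (batch, steps)
```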

Transformer-Based Attribution

Transformers—famous for powering models like ChatGPT—excel at understanding relationships between touchpoints, regardless of distance in the sequence. Their self-attention mechanism lets the model weigh how every touchpoint relates to every other, enabling highly nuanced attribution. This approach is ideal for complex customer journeys with many simultaneous interactions across channels.

Temporal Convolutional Networks (TCNs)

TCNs are another powerful option for modeling time-ordered data. Unlike RNNs, they use dilated convolutions to analyze sequences in parallel, which leads to faster processing and high accuracy. TCNs work especially well when journey lengths vary from customer to customer.

Applications of MTA: From Insight to Action

So how do these models translate into better business outcomes?

Smarter Budget Allocation

MTA helps marketers identify true ROI across channels and adjust budgets accordingly. For instance, if social media drives early engagement but email converts, you can confidently invest in both.

Customer Journey Optimization

MTA reveals the actual sequence of touchpoints that lead to bookings or purchases. This insight helps refine not just messaging and creative, but also the order, timing, and targeting of campaigns.

Hyper-Personalization

With granular attribution data, you can tailor marketing strategies to specific segments—delivering more relevant offers across the right channels.

From Attribution to Action: Budget Optimization in Practice

Once an MTA model is trained, it produces attribution weights that quantify each touchpoint’s influence on conversions. These weights can be used to solve a mathematical optimization problem: how to distribute your marketing budget across channels to maximize conversions or revenue.

For example, if your MTA model outputs these weights:

You can use optimization techniques (e.g., linear programming or gradient descent) to allocate your budget in a way that maximizes return, while also considering constraints like minimum spend thresholds or strategic goals.
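
As a simplified sketch, the linear program below allocates a budget across channels using hypothetical attribution weights and spend constraints; all numbers are illustrative.

```python
from scipy.optimize import linprog

# Hypothetical attribution weights from an MTA model (credit per dollar, illustrative).
channels = ["search", "social", "email", "display"]
weights  = [0.40, 0.30, 0.20, 0.10]

total_budget = 100_000
min_spend    = [10_000, 5_000, 2_000, 2_000]     # strategic minimums per channel
max_spend    = [60_000, 50_000, 20_000, 20_000]

# linprog minimizes, so negate the weights to maximize attributed conversions.
result = linprog(
    c=[-w for w in weights],
    A_ub=[[1, 1, 1, 1]],          # total spend across channels...
    b_ub=[total_budget],          # ...must stay within the overall budget
    bounds=list(zip(min_spend, max_spend)),
)

for channel, spend in zip(channels, result.x):
    print(f"{channel}: ${spend:,.0f}")
```

With constant weights, the solution simply fills the highest-weight channels up to their caps; real allocations typically replace the linear objective with diminishing-returns (saturation) curves, which turns this into a non-linear problem.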

OneSix helps brands take these results and apply them in the real world—building automated budget optimization systems that adjust spend in real time based on performance data and predictive insights.

Media Mix Modeling (MMM)

Optimize every marketing dollar you spend.

In today’s privacy-conscious environment, Media Mix Modeling (MMM) is gaining traction as a powerful, cookie-free approach to understanding marketing impact.

MMM uses aggregated historical data to quantify how different marketing activities—like TV, paid search, influencer campaigns, or email—affect outcomes like revenue, conversions, or customer lifetime value. It’s especially valuable when dealing with long buying cycles, offline conversions, or regional campaign variations.

Why Brands Are Turning to MMM

As marketing strategies grow more complex, so does the challenge of proving ROI. MMM addresses this by offering:

Improved ROI Visibility

MMM pinpoints which marketing efforts actually drive results, helping you spend smarter across channels.

Increased Accountability

With clear metrics on effectiveness, you can confidently justify your budget decisions to leadership.

Real-Time Optimization

With modern tooling and infrastructure, MMM isn’t just a once-a-year exercise—it can be run regularly to adapt to market changes.

Multi-Touch Influence

MMM can capture the cumulative impact of various touchpoints—even those traditionally difficult to measure, like print media or influencer impressions.

Privacy Resilience

Unlike methods that rely on user-level tracking or cookies, MMM uses aggregate data, making it a future-proof strategy in a privacy-first world.

Reduced Bias in Decision-Making

Advanced MMM models automate decisions around ad fatigue, seasonality, and spend thresholds, removing guesswork and gut-feeling from critical marketing calls.

How MMM Works: The Mechanics Behind the Model

MMM builds a statistical model that connects marketing activities and external factors to your key business outcomes. Here are two of the most important concepts:

Adstocking

Not all marketing effects are instant. Adstocking accounts for the delayed impact of a campaign—for example, the lingering effect of a billboard or a TV commercial. This allows the model to recognize how impressions continue to influence behavior days or weeks after the initial exposure.

Saturation

Every channel has a point of diminishing returns. MMM models use saturation curves (often modeled with a Hill function) to understand when added spend in a channel stops yielding proportional returns. This is crucial when planning budgets across multiple media types with vastly different spend efficiency curves.

MMM also adjusts for external factors like pricing, market conditions, and seasonality—ensuring you isolate marketing’s true impact.
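
A compact NumPy sketch of these two transformations, with illustrative decay and saturation parameters, might look like this:

```python
import numpy as np

def adstock(spend: np.ndarray, decay: float = 0.6) -> np.ndarray:
    """Geometric adstock: carry a fraction of each period's effect into the
    next period, modeling the lingering impact of a campaign."""
    carried = np.zeros_like(spend, dtype=float)
    for t, x in enumerate(spend):
        carried[t] = x + (decay * carried[t - 1] if t > 0 else 0.0)
    return carried

def hill_saturation(exposure: np.ndarray, half_saturation: float, slope: float = 1.0) -> np.ndarray:
    """Hill curve: response grows with exposure but flattens past the
    half-saturation point, capturing diminishing returns."""
    return exposure ** slope / (exposure ** slope + half_saturation ** slope)

weekly_tv_spend = np.array([0, 50, 50, 0, 0, 80, 0], dtype=float)   # illustrative
effective_response = hill_saturation(adstock(weekly_tv_spend, decay=0.6),
                                     half_saturation=60)
```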

Off-the-Shelf vs. Bespoke MMM: Choosing the Right Fit

There are a number of tools available to implement MMM—each with its strengths and trade-offs.

Off-the-Shelf Tools
Custom/Bespoke MMM Solutions

For brands with unique needs—such as regional campaign structures, legacy data systems, or complex business rules—a custom MMM model may be the best route. These models offer:

OneSix partners with clients to design and implement bespoke MMM solutions, from initial data exploration through production-ready deployment—ensuring that the model aligns tightly with your business goals and marketing operations.

From Modeling to Optimization: Turning Insights into Action

Once an MMM model is built, it generates a set of channel-level performance metrics—like marginal ROI and efficiency curves. These metrics feed directly into budget optimization models, helping you decide how much to spend on each channel to maximize ROI, given your total budget and business constraints.

Example:

Using these inputs, OneSix can help you solve for the optimal budget allocation using methods like linear programming or Bayesian optimization—automating the process of getting the most out of your spend.

Move Beyond Guesswork. Start Optimizing with AI.

Modern marketing requires more than clever creative—it demands clarity, precision, and adaptability. AI-powered solutions like MTA and MMM help brands cut through complexity and optimize every dollar. At OneSix, we build advanced marketing analytics frameworks that drive visibility, efficiency, and smarter decisions. Ready to make your marketing work smarter? Let’s talk about how to elevate your strategy and optimize your spend.

Contact Us

Right Message, Right Time: How AI is Transforming Modern Marketing

Written by

Jacob Zweig, Managing Director

Published

April 1, 2025

AI & Machine Learning
AI-Driven Marketing

Today’s customers don’t just want personalized experiences—they expect them. Whether shopping online, engaging with content, or exploring new services, people are looking for brands that understand their needs and speak to them on an individual level.

The problem? Traditional batch-and-blast marketing simply doesn’t cut it anymore. Generic messages sent to broad audiences risk being ignored—or worse, driving customers away.

To stay competitive, brands must move beyond one-size-fits-all campaigns and embrace AI-driven personalization. By harnessing the power of first-party and third-party data, businesses can gain deeper insight into customer behavior and deliver targeted, real-time messaging that increases engagement, drives loyalty, and boosts long-term value.

At OneSix, we help companies put data to work—building smarter, adaptive marketing strategies that deliver the right message to the right customer at exactly the right time. By integrating AI into their marketing strategies, our clients are unlocking measurable, data-backed results:

10% increase in customer visits

6% increase in profitability

15% increase in sales

Fueling Personalization with First- & Third-Party Data

Data is more than just a business asset—it’s the foundation for delivering relevant, high-impact customer experiences. By combining first-party and third-party data, brands can unlock deeper insights, close data gaps, and build smarter, more personalized marketing strategies.

Higher-Quality Insights for Better Customer Profiles

First-party data—collected directly from customer interactions across websites, apps, and transactions—offers high-quality, trustworthy insights into individual behaviors, preferences, and purchase history. This rich data allows brands to build detailed customer profiles and target specific segments with precision.

When paired with third-party data, which provides broader market context and behavioral trends, these profiles become even more robust. The result is a more complete view of each customer and better-informed marketing decisions.

Enhanced Experiences and Differentiated Value

First-party data helps identify customer needs, pain points, and preferences in real time—allowing brands to deliver timely, relevant offers and personalized recommendations. This not only improves the customer experience but also builds long-term loyalty.

Third-party insights enhance this by offering visibility into external factors—like competitive activity, seasonal trends, or consumer behaviors across other platforms—enabling brands to refine their value propositions and stand out in a crowded market.

Smarter Targeting and Hyper-Personalization

A combined data approach allows brands to fine-tune their targeting strategies. First-party data provides individual-level detail, while third-party data offers a broader lens into market behavior.

Together, they enable hyper-personalized campaigns—whether it’s tailoring product recommendations, suggesting relevant content in real time, or customizing messages for specific audience segments across digital channels.

Predictive Analytics That Drive Growth

While first-party data offers a historical lens into customer behavior, third-party data adds predictive power when fed into AI models. This combination supports:

By leveraging both datasets through AI, brands can make smarter, faster decisions that anticipate customer needs and drive revenue growth.

Smarter Engagement Through AI

AI is fundamentally changing how brands understand, target, and engage with their audiences. From acquiring new customers to deepening relationships with loyal ones, AI-driven models enable personalized, data-informed strategies that deliver measurable results across the customer journey.

Customer Segmentation Modeling

Segmentation powered by AI goes far beyond traditional demographic-based grouping. For unknown or prospective users, techniques such as clustering and lookalike modeling allow brands to generalize insights from known customer behaviors to broader audiences across digital platforms. These models help define high-value segments and guide user acquisition strategies.

For known users, AI enables dynamic segmentation based on up-to-the-moment behavioral data, allowing for hyper-targeted messaging that evolves as the customer does.

Real-World Example

A retail brand may use lookalike modeling to identify new prospects who mirror the behavior and preferences of their most valuable customers, tailoring digital advertising to attract high-intent buyers.
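
As a minimal sketch of the clustering side of segmentation, the example below groups customers by synthetic RFM-style features with scikit-learn's KMeans; the feature values and cluster count are illustrative.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Illustrative first-party features per known customer:
# [recency_days, purchase_frequency, monetary_value, email_engagement_rate]
customers = np.array([
    [5,  12,  950.0, 0.60],
    [40,  2,   80.0, 0.05],
    [12,  8,  400.0, 0.35],
    [90,  1,   25.0, 0.01],
    [3,  20, 1500.0, 0.70],
    [60,  3,  120.0, 0.10],
])

features = StandardScaler().fit_transform(customers)
segments = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(features)
print(segments)   # one cluster label per customer, e.g. high-value vs. lapsed segments
```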

Customer Propensity Modeling

Propensity models leverage a wide range of data—including behavioral, contextual, and third-party inputs—to predict the likelihood of specific customer actions. These models help marketers identify which customers are most likely to purchase, upgrade, convert, or churn, allowing for more effective targeting and optimized marketing spend.

With AI, marketers can prioritize offers, customize messaging, and allocate resources based on real-time intent rather than static assumptions.

Real-World Example

A SaaS company could use propensity scoring to identify which website visitors are most likely to sign up, and immediately serve personalized trial offers through digital ads or email campaigns.
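
A simplified propensity-scoring sketch along those lines, using scikit-learn with synthetic data in place of real behavioral features, might look like this:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

# Stand-in for behavioral, contextual, and third-party features per visitor,
# with y = 1 when the visitor signed up (synthetic data for illustration).
X, y = make_classification(n_samples=2000, n_features=12, weights=[0.85], random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

model = GradientBoostingClassifier(random_state=0).fit(X_train, y_train)

# Propensity score: predicted probability that each visitor will convert.
propensity = model.predict_proba(X_test)[:, 1]

# Prioritize the highest-intent visitors for a personalized trial offer.
high_intent_visitors = propensity.argsort()[::-1][:100]
```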

Real-Time Personalization

When engaging with known customers, AI plays a critical role in determining what to do next. By combining models such as Lifetime Value (LTV), churn prediction, and next-best-action optimization, brands can understand likely customer behavior and tailor marketing strategies accordingly.

Next Best Action (NBA) models go beyond traditional rule-based decision systems by dynamically adapting to real-time data and customer context. Rather than relying on static flows or pre-defined triggers, AI-driven NBA strategies evaluate a wide range of inputs—behavioral signals, preferences, environmental context—to surface the most relevant message, offer, or action at any given moment.

These models continuously learn from customer interactions across digital and physical touchpoints, enabling real-time personalization at scale. Whether it’s identifying the best time to send a message, recommending the right offer, or selecting the most effective channel, AI helps ensure each interaction is relevant, timely, and impactful.

Real-World Example

A leading casino implemented a real-time marketing engine, built on Next Best Action modeling, to personalize offers based on both in-casino activity and online behavior. The result was increased engagement, a 10% increase in player visits, and a 6% boost in player profitability. Explore the full case study →

The Future of Marketing Is Personalized

AI is no longer a nice-to-have—it’s a competitive necessity. In a marketplace where timing, relevance, and experience are everything, AI-driven personalization empowers brands to meet customers where they are with messaging that resonates.

From smarter segmentation and predictive targeting to real-time personalization and next-best-action optimization, AI enables marketing strategies that are more adaptive, impactful, and customer-centric.

Get Started

Ready to move beyond generic campaigns? OneSix helps companies turn data into meaningful customer experiences that drive loyalty and long-term value. Get in touch with us for a consultation.

Contact Us

AI’s Next Big Shift: What Business Leaders Need to Know

Written by

James Townend & Nina Singer, Lead ML Scientists

Published

March 19, 2025

AI & Machine Learning
AI Agents & Chatbots

Artificial Intelligence continues to transform the tech landscape at breakneck speed. AI is driving innovation in every sector from how we process queries to the tools we use for automation. Below are five key trends shaping AI’s evolution in 2025—and why they matter.

1. Inference-Time Compute

"AI designers have a new control lever – spend more compute per query for higher accuracy and better reliability."
James Townend
Lead ML Scientist

Traditionally, AI performance scaled primarily with training-time compute: We spent more resources to train bigger models on more data. Now, inference-time compute—the compute spent when a trained model answers a query—has become a major new control lever.

Why It Matters

The Bigger Picture

As models shift more reasoning to real-time computation, the hardware and infrastructure for user-facing AI will need to scale to support these heavier inference workloads. This also opens opportunities for edge inference, which involves moving some computation onto devices like phones, robots, and IoT systems.

2. Enterprise Search Is Good Now

"LLMs have dramatically improved search through RAG, unlocking value from previously challenging document stores."
James Townend
Lead ML Scientist

Enterprise search was an afterthought for years, plagued by siloed data sources, poorly structured documents, and lack of meaningful relevance signals. Modern vector embeddings have changed everything, making Retrieval-Augmented Generation (RAG) the new standard.

Why It Matters

The Bigger Picture

With vector search and RAG, enterprise search resembles a true domain-expert assistant. Organizations finally have the tools to leverage vast document stores efficiently. It’s akin to what Google did for the early public internet—now applied to private, internal data.

3. AI Agents

"AI agents transform software interaction by automating multi-step workflows."
James Townend
Lead ML Scientist

The next revolution in AI-driven automation is the rise of AI Agents: task-oriented, often autonomous systems that can robustly interact with software and data.

Why It Matters

Important Considerations

Agents remain unpredictable at times, owing to LLMs’ black-box nature. For critical systems:

The Bigger Picture

We’ll see agents increasingly embedded in customer support, “low-code” software platforms, and legacy system integrations. However, organizations must weigh the potential for cost overruns (since agents call models often) against the productivity gains they deliver.

4. The Future of Openness

"As competition intensifies, we see an uptick of LLMs embracing open weights. Distilled models emerge to close the gap."
Nina Singer
Sr. Lead ML Scientist

Competition among large language models is intensifying, and with it comes a surge in open-weight models. Alongside these publicly accessible models, distilled versions—trained to mimic larger “teacher” models—are emerging as credible, cost-effective alternatives.

Why It Matters

The Bigger Picture

Open-source foundational models empower companies and researchers worldwide to build specialized solutions without huge licensing fees. This explosion in open models not only accelerates AI adoption but also raises questions about responsible use, governance, and the sustainability of massive training runs.

5. Capability Overhang

"As AI advances, new questions emerge: How else can we harness its potential? Who else can contribute to its development? How do we control its impact?"
Nina Singer
Sr. Lead ML Scientist

“Capability overhang” describes a scenario in which technology’s potential outstrips its immediate adoption and integration. We’re already seeing this with LLMs, where industrial and societal constraints—such as regulatory hurdles, skills shortages, and legacy system inertia—lag behind the AI’s actual abilities.

Why It Matters

The Bigger Picture

As AI’s capacity grows, the conversation shifts from “can we do it?” to “how should we do it responsibly?” The real power of LLMs will come from well-regulated, well-structured integrations that extend beyond flashy demos into meaningful, society-wide improvements.

Shaping the AI-Driven Future

From inference-time compute revolutionizing AI economics to enterprise search finally delivering on its promise, these five trends highlight a pivotal moment in AI’s evolution. Agents will streamline workflows, open-source models will democratize access, and the looming capability overhang challenges everyone—from entrepreneurs to regulators—to adapt responsibly.

As the AI frontier broadens, it’s up to us—innovators, policymakers, and everyday users—to steer its tremendous potential toward positive, inclusive progress. The question is no longer if AI can do something, but rather how we’ll harness its power to create lasting impact.

Get Started

Integrate these insights into your business strategy and make the most of AI’s potential. OneSix can help you apply emerging AI trends, drawing on first-hand experience of the impact they can have on your business.

Contact Us

A Practical Guide to Data Science Modeling: Lessons from the Book ‘Models Demystified’

Written by

Brock Ferguson, Managing Director

Published

February 10, 2025

AI & Machine Learning
Forecasting & Prediction

In the book Models Demystified, OneSix Sr. ML Scientist Michael Clark delves into the fundamentals of modeling in data science. Designed for practical application, the book provides a clear understanding of modeling basics, an actionable toolkit of models and techniques, and a balanced perspective on statistical and machine learning approaches.

In this blog post, we highlight the key insights from his work, diving into various modeling techniques and emphasizing the importance of feature engineering and uncertainty estimation in building reliable, interpretable models.

By mastering these fundamentals, you’ll not only unlock the full potential of predictive analytics but also equip yourself to make smarter, data-driven decisions. So, let’s demystify the science behind the models and turn complexity into clarity!

What is Data Science Modeling?

At its core, a model is a mathematical, statistical, or computational construct designed to understand and predict patterns in data. It simplifies real-world systems or processes into manageable abstractions that data scientists can utilize to derive meaningful insights and actionable recommendations. Models facilitate:

What are the Main Types of Data Science Models?

Data science encompasses various modeling techniques, each serving distinct purposes. Here’s an overview of the primary categories.

Linear Models and More

Linear Regression and Extensions

This category encompasses linear regression as a starting point, and extends to generalized linear models (GLMs), including logistic regression for binary targets, and Poisson regression for counts. Further extensions include generalized additive models (GAMs) for non-linear relationships, and generalized linear mixed models (GLMMs) for hierarchical data.
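
As a small illustration of one of these extensions, the sketch below fits a Poisson regression (a GLM with a log link) to synthetic count data using statsmodels:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
X = sm.add_constant(rng.normal(size=(500, 2)))              # intercept + two predictors
counts = rng.poisson(lam=np.exp(X @ [0.5, 0.8, -0.3]))      # synthetic count outcome

# Poisson regression: a generalized linear model with a log link, suited to counts.
poisson_model = sm.GLM(counts, X, family=sm.families.Poisson()).fit()
print(poisson_model.summary())
```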

Special Considerations

A variety of modeling approaches are necessary when working with specific data types, such as time-series, spatial, censored, or ordinal data. These data types often exhibit unique characteristics that may necessitate specialized modeling techniques or adaptations to existing models.

Ease and Interpretability

Linear models are prized for their ease of implementation and their ability to provide relatively clear and interpretable results. They also serve as useful baselines for more complex models and are often difficult to outperform on simple tasks.

Machine Learning Models

Modeling Framework

Machine Learning (ML) provides a framework for systematically evaluating and improving models. It involves training models on historical data with a primary goal of making predictions or decisions on new, unseen data. The choice of model depends on the problem type, data characteristics, and desired performance metrics.

Penalized Regression

Least Absolute Shrinkage and Selection Operator (LASSO) and Ridge Regression are penalized versions of linear models commonly used for both regression and classification in a machine learning context.
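
A brief scikit-learn sketch on synthetic data shows the practical difference: Ridge shrinks coefficients toward zero, while Lasso can set some coefficients exactly to zero, performing implicit feature selection.

```python
from sklearn.datasets import make_regression
from sklearn.linear_model import Lasso, Ridge

X, y = make_regression(n_samples=300, n_features=20, n_informative=5, noise=10, random_state=0)

ridge = Ridge(alpha=1.0).fit(X, y)   # shrinks all coefficients toward zero
lasso = Lasso(alpha=1.0).fit(X, y)   # can set some coefficients exactly to zero

print("Non-zero ridge coefficients:", (ridge.coef_ != 0).sum())
print("Non-zero lasso coefficients:", (lasso.coef_ != 0).sum())
```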

Tree-Based

Tree-based methods use a tree-like model of decisions and their possible consequences, including chance event outcomes and resource costs, to represent complex decision-making processes. The following are some common tree-based machine learning algorithms:

Neural Networks/Deep Learning

Neural networks are loosely inspired by the architecture of the human brain. They identify complex, non-linear relationships in data, which makes them the foundation of most deep learning applications.

Causal Models

Identifying Effects

Causal models shift most of the modeling focus to identifying the effects of a treatment or intervention on an outcome, as opposed to predictive performance on new data. In determining causal effects, even small effects can be significant if they are consistent and replicable. For example, a small effect size in a clinical trial could still be meaningful if it leads to a reduction in patient mortality.

Random Assignment and A/B Testing

Random assignment of treatments to subjects is the gold standard for estimating causal effects. A/B testing is a common technique used in online experiments to determine the effectiveness of a treatment.

Directed Acyclic Graphs (DAGs)

Directed acyclic graphs are graphical representations that depict assumptions about the causal structure among variables, aiding in understanding and identifying causal relationships. They pave the way for different modeling approaches that help discern causal effects.

Meta Learners

Meta-learners provide a framework to estimate treatment effects and determine causal relationships between a treatment and an outcome. The following types of meta-learners can be used to assess causal effects:

Why is Data Preparation and Feature Engineering an Important Part of the Modeling Process?

Effective modeling hinges on thorough data preparation and feature engineering. The following steps ensure data quality and compatibility with algorithms, directly influencing model performance.

What is the Role of Statistical Rigor in Uncertainty Estimation?

Addressing uncertainty is integral to robust data science modeling. Statistical rigor ensures reliable predictions and enhances trust in model outputs. It involves:

1. Quantification Through Confidence Intervals

Confidence intervals offer a clear, quantifiable range within which model parameters are likely to fall. This approach ensures that we account for variability in estimates, highlighting the degree of precision in model predictions.

2. Prediction Intervals for Future Observations

Unlike confidence intervals, prediction intervals extend uncertainty quantification to individual predictions. These intervals provide a realistic range for where future data points are expected, accounting for the inherent variability in outcomes.

3. Bootstrapping for Distribution Estimation

Bootstrapping is a statistically rigorous, non-parametric technique that involves resampling the data to estimate the uncertainty of parameters and predictions. It is particularly useful when traditional analytical solutions are infeasible, providing robust insights into variability.
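
For example, a percentile bootstrap for the mean of a sample can be written in a few lines of NumPy (synthetic data shown here):

```python
import numpy as np

rng = np.random.default_rng(0)
sample = rng.normal(loc=50, scale=12, size=200)     # observed data (synthetic here)

# Resample with replacement many times and recompute the statistic of interest.
boot_means = np.array([
    rng.choice(sample, size=sample.size, replace=True).mean()
    for _ in range(5000)
])

# Percentile bootstrap: a 95% interval for the mean with no parametric assumptions.
lower, upper = np.percentile(boot_means, [2.5, 97.5])
print(f"95% bootstrap CI for the mean: [{lower:.2f}, {upper:.2f}]")
```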

4. Bayesian Methods for Comprehensive Uncertainty Estimation

Bayesian approaches allow for a more comprehensive treatment of uncertainty by incorporating prior information and deriving posterior distributions. This method propagates uncertainty through the entire modeling process, offering a more nuanced understanding of variability in predictions.

5. Model Validation and Testing

Employing techniques such as cross-validation ensures that model predictions generalize well to unseen data. Rigorous testing methods reveal the extent of overfitting and provide an honest assessment of model reliability.
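
A minimal scikit-learn example of k-fold cross-validation on synthetic data:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=1000, n_features=15, random_state=0)

# 5-fold cross-validation: each fold is held out once, so every score reflects
# performance on data the model did not see during training.
scores = cross_val_score(RandomForestClassifier(random_state=0), X, y, cv=5)
print(f"Mean accuracy: {scores.mean():.3f} (+/- {scores.std():.3f})")
```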

6. Assumption Checking and Diagnostics

Statistical rigor requires a careful evaluation of the assumptions underlying a model. When these assumptions are violated, it can lead to substantial uncertainty in the results, making thorough diagnostics and model refinement critical to minimizing risks and ensuring reliable outcomes.

Key Takeaway: Building a Strong Foundation in Data Science Modeling

Data science modeling is solving real business challenges today, enabling demand forecasting, inventory management, and logistical optimization that lead to cost savings and improved efficiency in supply chains.

Incorporating processes like feature engineering, uncertainty estimation, and robust validation ensures that your models are not only reliable but also interpretable and adaptable to real-world complexities.

As artificial intelligence and machine learning continue to advance, we can expect models to become increasingly automated, adaptive, and precise. The future will likely emphasize real-time predictive analytics, empowering industries to anticipate trends, streamline operations, and make informed decisions with enhanced accuracy.

The journey doesn’t end with building models—it’s about using them to transform challenges into opportunities. Ready to dive deeper? You can access expert insights and further your understanding of data science modeling in the book Models Demystified, written by OneSix ML Scientist Michael Clark. The print version of the book will be out this year from CRC Press as part of its Data Science Series.

Navigate Future Developments With Data Science Modeling

We can help you get started with expert insights and practical guidance to build and optimize data-driven models for your needs.

Contact Us

Using AI to Extract Insights from Data: A Conversation with Snowflake

Published

February 6, 2025

During Snowflake’s World Tour stop in Chicago, Data Cloud Now anchor Ryan Green sat down with leaders from OneSix. During the conversation, Co-founder and Managing Director Mike Galvin and Senior Manager Ryan Lewis note how Snowflake’s technology has changed the game, allowing them and its customers to focus less on how to build data infrastructure and more on how to extract insights from data, whether through AI, reporting, or dashboarding.

Get More from Your Data with Snowflake

As a Premier Snowflake Services Partner, we drive practical business outcomes by harnessing the power of Snowflake AI Data Cloud. Whether you’re starting with Snowflake, migrating from a legacy platform, or looking to leverage AI and ML capabilities, we’re ready to support your journey.

Contact Us