Plausible Blog
If you are trying to measure and analyze activity on your Shopify store, you can use the built-in Shopify Analytics and pair it with (or replace it with) Google Analytics, or with a simpler, powerful, and privacy-friendly alternative like Plausible Analytics.
A good combination of these tools can help you effectively monitor store activity, understand visitor behavior, analyze web performance and marketing campaigns, attribute sales, and much more. You can use these insights to boost revenue and optimize your store and marketing initiatives.
So let's understand what each option offers so you can make an informed decision.
- Shopify Analytics: the built-in reporting system
- Pros and cons of Shopify Analytics and why consider alternatives?
- Using Google Analytics for Shopify
- Plausible Analytics for Shopify
- Conclusion
Shopify Analytics: the built-in reporting system
The first and foremost is the native reporting system offered by Shopify itself: āShopify Analyticsā. Itās available right in your Shopify account and accessible from the admin side panel.
It gives merchants a quick overview of their storeās performance while the reports help you track your storeās activities, understand your customers, analyze sales, finances, etc.
Since itās native to Shopify, you donāt need to do a manual dashboard or report setup, unless you want a custom dashboard (available only with higher plans).
Standard reporting is available with all the pricing plans. According to Shopifyās pricing page, you can āaccess 60+ reports to track your store performance or build custom reporting with flexible, real-time analytics.ā
Shopifyās analytics includes three main components: an Overview dashboard, detailed reports, and a live view. These are available under āAnalyticsā from the admin sidebar.
Types of reporting in Shopify
Overview dashboard
This is the first thing youāll see. An overview of your most important ecommerce metrics like sales, orders, conversion rate, etc. (metrics explained below), at a glance.
Itās a collection of data cards (metrics) with summary numbers and trend graphs. You can customize which metrics appear here but only on the desktop view.
You can select a date range like today, yesterday, last 30 days, etc., while also comparing performance to a previous period or year, showing percentage changes so you can gauge growth or decline. You can click any metric card to dive into a corresponding detailed report.
If you look at the sidebar menu, you can open your āReportsā.
Reports
You will find a library of predefined reports, divided by categories like Finances, Acquisition, Behavior, etc.
Each report typically includes a graph and a detailed data table. You can filter and segment these reports to answer specific questions such as viewing sales by a specific product or channel.
The default Shopify reporting categories consist of the following:
Acquisition reports: For understanding how many sessions and visitors were acquired during a time period, from which locations, and via which referring sites.
Behavior reports: For understanding your customersā shopping behavior.
Customers reports: These help you understand things like customers by location, returning customers, one-time customers, and customer cohort analysis.
Finance reports: These cover everything from a finance summary to store credit transactions, liabilities, gift cards, taxes, sales, etc.
Fraud reports: Monitor and analyze fraudulent activities, including chargeback rates and high-risk orders, to enhance store security.
Inventory reports: Track stock levels, monitor inventory movements, and assess product availability to optimize inventory management.
Marketing reports: Evaluate the effectiveness of marketing campaigns by analyzing metrics like sessions attributed to marketing and sales conversions.
Order reports: Gain insights into order trends, fulfillment statuses, and return rates to streamline order processing and customer satisfaction.
Profit reports: Assess profitability by examining gross profit margins, cost of goods sold, and net profit across products and sales channels.
Retail sales reports: Analyze in-person sales performance, staff contributions, and product sales within retail locations to inform business decisions.
Sales reports: Review comprehensive sales data, including total sales, sales by product or channel, and average order values, to understand revenue streams.
Each category has multiple reports of its own.
Custom reporting and advanced filtering options are available only on higher Shopify plans, but all Shopify stores have access to the core reports.
Live View
This offers a real-time visualization of what's happening on your site right now. It shows the number of current visitors, their locations, the actions they are taking, and any live sales or checkouts happening at that moment. It's particularly useful during peak traffic events like flash sales or product launches.
Key metrics
You will find a variety of metrics across the overview dashboard and reports. These are the key performance indicators (KPIs) that Shopify tracks for your store.
Itās important to understand what each metric means in Shopify. Hereās a summary of the most important metrics youāll see:
Total Sales: The total revenue your store earned in the selected period, after all adjustments. In Shopify, Total sales is essentially your net sales plus any additional charges like taxes, shipping, and duties.Ā
It accounts for product sales minus discounts and returns, and then adds things like shipping charges or taxes that customers paid. (On some dashboards, you might see Gross Sales and Net Sales separately ā see below ā but Total Sales is the bottom-line number.)
Gross Sales: This is the value of all items sold at their full price, before any discounts, returns, or other deductions. It's essentially your sales if nothing was discounted and no orders were refunded.
This can be useful to see your storeās potential revenue or the pre-discount demand. For example, if you sold 10 items priced at $50 each, gross sales would be $500 (even if some customers used a coupon or returned items ā those adjustments come in below).
Net Sales: Net sales are your actual sales revenue after discounts and returns. Shopify calculates Net Sales as Gross Sales - discounts - returns.
Importantly, net sales exclude taxes and shipping fees (since those are usually pass-through or additional charges).Ā
Using the previous example, if gross sales were $500 but one $50 item was returned and another $50 order had a $10 discount, then net sales would be $500 - $50 - $10 = $440.
Total Orders: The count of orders placed in the period, across all your sales channels. This counts each order (transaction) once, regardless of how many items were in it. Itās a basic measure of how many purchases were made.Ā
Online Store Sessions: The number of visits to your online store (traffic volume). In Shopify, a āsessionā is a period of continuous activity by a visitor. If the same person comes back later, that counts as a new session.Ā
Note that sessions are usually higher than unique visitors, because one person can visit multiple times.
Conversion Rate: The percentage of visits that lead to a purchase. Shopifyās conversion rate is typically defined as (# of orders / # of sessions) x 100%. So, a 20% conversion rate means 20 out of every 100 sessions resulted in an order.Ā
In the Overview dashboard, the conversion rate card usually also breaks down the conversion funnel, showing the percentage of sessions that added something to cart, the percentage that proceeded to checkout, and the percentage that actually completed a purchase.
This helps you see where customers might be dropping off.
Average Order Value (AOV): A key sales metric that tells you the average amount each order is worth. Itās calculated as Total sales revenue / Total number of orders.
For instance, if you had $1,000 in sales from 20 orders, your AOV is $50. Shopify displays this to help you understand how much, on average, customers spend per transaction.
Returning Customer Rate: The percentage of your customers who are repeat buyers. This is a measure of customer loyalty and retention. The higher, the better.
In Shopify, it's defined as the number of customers who have placed more than one order divided by your total number of unique customers over the time period, expressed as a percentage. So, a returning customer rate of 20% means one in five customers has bought from you before.
Return Rate: This deals with product returns. It shows the percentage of items sold that were later returned. For example, a 5% return rate means that out of all items sold in the period, 5% were returned by customers. Youād ideally want your return rate to be as low as possible.
There are many other metrics available ā for instance, Shopify can show you things like top products, sales by channel, sessions by device, etc. ā but the ones above are some of the most commonly referenced on the overview screen.
Pros and cons of Shopify Analytics and why consider alternatives?
Moving onā¦
Pros of Shopify Analytics
- Included with all pricing plans
- No setup needed, minimal technical resources needed
- Automatic tracking
- Tracking script is less blocked by ad blockers, since itās served as a first-party from your domain
Cons of Shopify Analytics
Can be complicated for most users
The biggest con is that since Shopify recently updated its Analytics UI, there have been multiple complaints and discussions, like this one. The current consensus is that it's overly complicated for store owners, with a lot of jargon and a confusing UI, and that it's overkill for most Shopify subscribers.
This is because there are too many reports spread across too many categories, which can feel overwhelming to figure out and a bit too complicated for most use cases, especially if you're not an enterprise.
Requirement to use a cookie consent banner
Since Shopify's tracking uses cookies for analytics, you need consent from your site visitors to use them. This not only degrades the user experience, but when a visitor rejects cookies, the tracking script can't track them, resulting in partial data capture and inaccurate, unreliable reports.
Shopify itself gives the option to add cookie banners to your site.
Becomes expensive as you scale
As your business scales, you would need to upgrade your Shopify plan to keep your store, and analytics, running. Every plan upgrade multiplies your subscription cost by 3-4X. If youāre unable to justify the costs, you would lose access to your stats.
More focus on ecommerce metrics than web analytics
Shopifyās dashboard is focused on ecommerce metrics like Gross Sales, returning customer rate, orders fulfilled, sales breakdown, etc.
You also need to understand your website's performance, but metrics like bounce rate, time on page, scroll depth, etc., are missing from Shopify Analytics. You'd also want to track sessions, analyze the marketing channels that bring you traffic, and, in some cases, do cross-domain tracking as well.
Using Google Analytics for Shopify
Many Shopify store owners use both Shopify Analytics and Google Analytics 4 (GA4): Shopify Analytics as a quick overview of store performance and Google Analytics for deeper insights on overall website and marketing performance.
P.S. GA4 also requires a cookie consent banner and is very complicated to use, and it comes with privacy concerns and inaccurate data according to multiple independent studies (example), but I will get to that in a minute. Skip to the "Plausible" section if you are struggling with GA4, but I recommend reading through this section anyway so you understand all your options.
Deep dive into user behavior
A separate web analytics tool makes sense when you want to get granular. You can ask complex questions like āHow many visitors viewed Product A, added it to cart but didnāt purchase, and what sources did they come from?ā ā GA4 can answer that with segments or explorations, while Shopify cannot without Enterprise plans.
GA4 can track events beyond the purchase journey (video plays, link clicks, form submissions), giving a more complete picture of user engagement on your site.Ā
Customization and flexibility
GA4 allows custom events, custom dimensions, and custom reports. If you have a unique aspect to your store like a custom upsell interaction you want to track, you can record it in GA4. This flexibility means GA4 can be tailored to your business KPIs more than Shopifyās one-size-fits-all reports.
Cross-platform and cross-domain tracking
If your business extends beyond just the Shopify storefront (e.g., you have a separate blog, or a web app, or multiple domains), GA4 can unify tracking across those. It can also track mobile app data if you have an app, integrating web and app analytics in one property.
Shopify Analytics is only for your Shopify online store. Also, if you ever do cross-domain selling (maybe Shopify plus a separate landing page domain), GA4 can handle that.
Attribution and marketing analyticsĀ
GA4ās integration into the Google Marketing ecosystem (Ads, Search Console, etc.) means you get a more complete marketing picture. Shopifyās marketing attribution is improving, but GA4 is still more flexible and detailed on this front.
If you want to know not just the last click, but the full path (first touch vs last touch contributions), GA4ās reports or BigQuery data can help.
Advanced analysis and raw data export
You can export all raw event data to BigQuery (Googleās data warehouse) for even deeper analysis or joining with other data (like CRM or ad spend data), where you can run SQL queries to answer custom questions about user behavior or build machine learning models (for churn prediction, etc.).
Shopify does not offer raw data export from its analytics ā youād have to use the Shopify API to pull data, which is more limited.
This is beyond what most merchants need day-to-day, but itās a major advantage for data-driven e-commerce stores with specialized teams.
How to Set up Google Analytics 4 on Shopify?
There are 3 methods you can use to start using Google Analytics 4 for your Shopify store.
With the Google & YouTube Channel app
The most straightforward way to install Google Analytics 4 is by using the Google & YouTube Channel App. You wonāt be able to track checkout steps with this method if youāre not on Shopify Plus.
You can install it directly through your Shopify account by going to Online Store > Preferences. If you already have a GA4 account, you can connect it by following the on-screen instructions or create a new one first.
Once done, the events set up through Enhanced Measurement (like page_view, scroll, click, view_search_results, video_start, video_progress, video_complete, etc.) and e-commerce-related events like view_item, add_to_cart, begin_checkout, add_payment_info, purchase, will become visible.
Through Google Tag Manager or GTag
These are manual methods that typically require a lot of time and effort to ensure the correct metrics are tracked, the setup is accurate, and the code is placed in the right places in Shopify, but they allow more flexibility and customization.
Google Tag Manager (GTM):
To implement GA4 using GTM, you'll first need to create a GTM account and container. Once set up, add the GTM container code to your Shopify theme. This involves inserting the GTM script code into the head section and the noscript code immediately after the opening body tag of your theme.liquid file.
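As a rough sketch, this is where those two snippets sit in theme.liquid; GTM-XXXXXXX is a placeholder container ID, so use the exact snippets that GTM generates for you:

```html
<!-- theme.liquid: illustrative placement only; GTM-XXXXXXX is a placeholder container ID -->
<head>
  ...
  <!-- Google Tag Manager: as high in the <head> as possible -->
  <script>(function(w,d,s,l,i){w[l]=w[l]||[];w[l].push({'gtm.start':
  new Date().getTime(),event:'gtm.js'});var f=d.getElementsByTagName(s)[0],
  j=d.createElement(s),dl=l!='dataLayer'?'&l='+l:'';j.async=true;j.src=
  'https://www.googletagmanager.com/gtm.js?id='+i+dl;f.parentNode.insertBefore(j,f);
  })(window,document,'script','dataLayer','GTM-XXXXXXX');</script>
</head>
<body>
  <!-- Google Tag Manager (noscript): immediately after the opening <body> tag -->
  <noscript><iframe src="https://www.googletagmanager.com/ns.html?id=GTM-XXXXXXX"
  height="0" width="0" style="display:none;visibility:hidden"></iframe></noscript>
  ...
</body>
```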
Then, you can add a GA4 configuration tag within GTM using your Measurement ID. This setup allows for more granular control over event tracking, enabling you to configure events like page views, add-to-cart actions, and purchases.Ā
Donāt forget to also test your setup using GTMās Preview mode and GA4ās DebugView to ensure data is being collected correctly.
Global Site Tag (gtag.js):
Alternatively, you can implement GA4 directly using the Global Site Tag (gtag.js). This involves adding the GA4 gtag.js snippet to your Shopify themeās head section.Ā
For tracking purchases, add the gtag.js code to the āAdditional Scriptsā section in your Shopify checkout settings, which allows you to capture transaction data on the order confirmation page.
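For illustration, the base gtag.js snippet that goes into your theme's head section looks roughly like this; G-XXXXXXXXXX is a placeholder Measurement ID, so copy the exact snippet from your GA4 property:

```html
<!-- Global Site Tag (gtag.js): illustrative only; G-XXXXXXXXXX is a placeholder Measurement ID -->
<script async src="https://www.googletagmanager.com/gtag/js?id=G-XXXXXXXXXX"></script>
<script>
  window.dataLayer = window.dataLayer || [];
  function gtag(){dataLayer.push(arguments);}
  gtag('js', new Date());
  gtag('config', 'G-XXXXXXXXXX');
</script>
```

And a purchase event sent from the order confirmation page could look something like the sketch below. The purchase event and its parameter names come from the GA4 ecommerce schema, but the order values here are made-up examples; in practice you'd fill them in from Shopify's order data:

```html
<script>
  // Hypothetical order values for illustration only
  gtag('event', 'purchase', {
    transaction_id: 'ORDER-1001',
    value: 59.98,
    currency: 'USD',
    items: [{ item_id: 'HOODIE-BLK-M', item_name: 'Black hoodie', price: 29.99, quantity: 2 }]
  });
</script>
```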
While gtag.js provides a straightforward setup, it offers less flexibility compared to GTM, especially when it comes to managing and customizing event tracking without modifying the siteās code directly.
For a detailed walkthrough and additional insights, you can refer to this full guide.
Third-party Shopify apps
You can also explore third-party apps such as Analyzify, AD Google Analytics 4, Magic Google Analytics 4, etc. You can find such apps on the Shopify App store to compare the features and benefits youād get from each app.
Google Analytics vs Shopify Analytics
You can compare both the tools considering the following factors:
Setup complexity
Shopify Analytics is plug-and-play, whereas GA4 requires planning and verification. With Shopify, thereās no need to worry about tagging pages or mapping e-commerce events ā itās automatically tied into your storeās functionality.
With GA4, you must ensure the tracking code runs on all pages (including checkout and thank-you page) and that all relevant events are sent with the correct parameters (like value, item IDs, etc.).Ā
Missteps in setup can lead to missing or duplicate data in GA4. For example, installing GA4 via both the Shopify integration and GTM simultaneously can cause double-counting if not handled carefully.
If youāre non-technical or want quick insights with minimal effort, Shopify Analytics is essentially turnkey. If you need the advanced tracking GA4 offers, be prepared for an initial setup phase.
Data accuracy
Data accuracy can be affected by:
- Ad blockers blocking the script: GA4 relies on a JavaScript snippet running in the user's browser. Users with ad blockers or privacy-focused browsers (Brave, DuckDuckGo, etc.) may completely block Google Analytics scripts, and those user visits and even purchases will never be recorded in GA4. Shopify Analytics, however, is part of your site: it can still count the order because the order is processed on Shopify's servers. This leads to GA4 under-reporting vs Shopify.
- Timing and processing delays: Data freshness can make numbers temporarily inconsistent. Shopifyās reports are near real-time for orders and fairly quick for traffic. GA4 data often needs up to 48 hours to stabilize.
Privacy concerns
Google Analytics is infamous for its poor privacy record and for tracking users across devices and apps, which has raised multiple legal and ethical concerns over the years. This pushes a lot of users to block the GA script and site owners to run cookie consent banners, etc.
Shopify Analytics can be trusted more in this regard.
Plausible Analytics for ShopifyĀ
Plausible is a privacy-friendly, much easier to use alternative to Google Analytics. Hereās a quick overview:
Unlike traditional tools like Google Analytics, where you need multiple reports or custom explorations, we keep it simple with a single-page dashboard. In fact, it's easier to track visits, exit pages, conversions, and a lot of other things in Plausible than in GA4.
Very simple setup
Itās childās play to get started with Plausible. Just add the script to your site and start interacting with your dashboard. You donāt need to work with code even for setting up pageview goals or tracking 404 pages, external clicks, etc. Although you can if you want advanced custom event tracking.
Single dashboard for everything
You get all your reports and metrics in literally one dashboard, something missing from both Shopify Analytics and Google Analytics.
Weāve also specifically designed Plausible keeping simplicity in mind, while not compromising on features or privacy.
Cookieless tracking
Plausible's tracking is cookieless, unlike Google Analytics and Shopify Analytics. You don't need to put a cookie consent banner on your site, which means no banner declines and no missing data.
For example, Safariās ITP (Intelligent Tracking Prevention) that limits cookie lifespan or blocks third-party cookies doesnāt hinder Plausible at all, whereas Shopifyās reliance on cookies for session tracking could be affected if, say, a user blocks all cookies or frequently clears them.
Out-of-the-box compliance
We are inherently GDPR-compliant due to our privacy-first nature, so you don't need to worry about legalities, the mishandling of personal information, or putting up a consent banner and dealing with the subsequent declines.
Plausible can actually count more of the "real" visitors, while Shopify and GA4 stop tracking those who opt out of cookies, which can lead to both under-reporting traffic.
Special care with accuracy
Our stats are very accurate because we take special measures to ensure it. For example, we detect and automatically exclude bots and spam traffic, whereas GA4 users constantly battle skewed data for this very reason.
Bypasses ad blockers
The Plausible script is blocked significantly less often by ad blockers and privacy-conscious users. For the small chunk of blockers that blanket-block all trackers, privacy-friendly or not, you can easily serve the Plausible script as a first-party connection.
Real-time analytics
Our data is always fresh and constantly updated: we provide real-time analytics. This is largely missing from GA4 and Shopify Analytics. Google Analytics can take up to 48 hours to fully process data and update your reports.
Shopify's data is mostly not real-time either, except for the Live View; standard reports update anywhere from every few minutes to an hour.
Open-source
We are open-source which is a huge plus for tech-oriented businesses or those with strict data control requirements. If you need a custom integration or want to verify exactly what the analytics code is doing, you can inspect and adapt Plausibleās code (itās on GitHub).Ā
Shopifyās analytics is a black box ā you get what they provide. While this advantage of Plausible might not be relevant to a typical small store, itās very relevant for enterprises or developers who want complete control and transparency.Ā
In a nutshell, you get built-in privacy, very high accuracy, freedom from consent banners (and their declines), and additional insights when using Plausible alongside, or instead of, Shopify Analytics.
How to set up Plausible on Shopify?
You can easily add the Plausible snippet to Shopify by navigating to Sales Channels > Online Store > Themes > Edit code > ātheme.liquidā file where you can paste your Plausible snippet.
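For reference, the default snippet is a single line. Assuming your site is registered in Plausible as yourstore.com (a placeholder here), it would look roughly like this; copy the exact snippet from your Plausible account:

```html
<!-- Plausible Analytics: paste inside the <head> of theme.liquid.
     data-domain must match the site you added in Plausible; yourstore.com is a placeholder. -->
<script defer data-domain="yourstore.com" src="https://plausible.io/js/script.js"></script>
```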
If you also want to track e-commerce metrics in Plausible like checkouts and revenue, you can easily follow the instructions here.
How to use Plausible in tandem with Shopify?
You can do comprehensive tracking with Plausible plus Shopify Analytics, or with Plausible alone, since Plausible can track e-commerce events independently.
This also gives you the flexibility of switching your ecommerce provider at any time while keeping your analytics the same.
Here are some things you can do with Plausible when using it with Shopify:
The web tracking stuff (absent from Shopify Analytics)
In a regular Plausible dashboard, you get an extremely easy overview of your visitors, visits, engagement metrics, top channels, top pages, countries, devices, etc.
You can even do stuff like:
Connect Google Search Console to track the keywords bringing in SEO traffic. Or UTM-tag your links to track your paid ads within Plausible itself; this also works with gclid.
Track other domains, subdomains or cross-domain. For instance, if your store isnāt exclusively using Shopify (maybe you run a WordPress blog to drive traffic to the store) ā Shopifyās analytics wonāt cover that blog, but Plausible could track both blog and store in one place.
Share stats with stakeholders. It is possible to share your Plausible insights with others. For example, you can generate public or password-protected links to your Plausible dashboard for a marketing team or a client. You can even embed it anywhere you'd like.
The ecommerce tracking stuff (also available in Shopify Analytics)
In Plausible, apart from seeing unique visitors, conversions, bounce rate, etc., you can set up goals, funnels, and track revenue too. See our complete guide on using Plausible for ecommerce revenue attribution.
You can simply filter your traffic by a revenue-marked goal, and see an overview of the sources (channels + campaigns) that brought in those sessions, the pages that received those sessions, and the devices, browsers, operating systems on which the sessions were conducted.
Assume a situation where you have an ecommerce store on Shopify that sells hoodies and beanies. For marketing, you run Google ads, post daily on Instagram, and have some referral links from other domains.Ā
This is how you can understand which channels, campaigns, and other factors work best for your sales:
Start by setting up some event-goals, including revenue-marked purchase goals.
In this case, your event-goals can be ācomplete purchaseā (a revenue-marked goal), āstart checkoutā, āremove from cartā, āadd to cartā, āadd to wishlistā, etc. Along with such events, you can send some custom properties like product category, product name, product color, product size, etc.
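To make that concrete, here's a rough sketch of sending such an event from your storefront with Plausible's JavaScript events API. The event name, property names, and amounts below are made-up examples and need to match the goals you configure in your Plausible dashboard:

```html
<script>
  // Queue stub so calls work even before the Plausible script has loaded
  window.plausible = window.plausible || function () { (window.plausible.q = window.plausible.q || []).push(arguments); };

  // Hypothetical example: a revenue-marked purchase goal with custom properties
  plausible('complete purchase', {
    revenue: { currency: 'USD', amount: 79.98 }, // marks this goal with revenue
    props: {                                     // custom properties you can filter by later
      product_category: 'hoodies',
      product_name: 'Black hoodie',
      product_size: 'M'
    }
  });
</script>
```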
Once everything is set up and you have started receiving data, simply filter your traffic by such goals and/or properties in the dashboard. Here is an example:
In this example, the user has their dashboard filtered by āAll timeā data, the goal of ācomplete purchaseā and the property of āproduct category is hoodiesā. So every metric they see on this single-page report is directly related to the session in which the conversions (hoodie sales) occurred.
They can see the revenue earned, the campaigns that contributed to the sale (possible due to UTM tracking), the all-time top pages that received the most traffic in the sessions receiving conversions, and some other data like locations and devices.
By toggling amongst UTM sources, mediums, campaigns, content, and terms, the user can understand the effectiveness of such traffic sources too. Similarly, they can toggle between top, entry and exit pages to understand their best performing webpages.
Conclusion
I hope you got a full view of what you can do to analyze your Shopify store. Write in to us for any queries and all the best!
Imagine logging into Google Analytics. You want to see how a page you published last month is performing: how much time does a visitor spend on it, on average? It says 10 seconds. So you spend marketing resources and make strategies to improve your site's engagement time.
But what if I told you the metric wasn't accurate to begin with, and could be underreporting by as much as 80%?
We recently did an experiment to compare the average time spent per page per visitor in Google Analytics vs Plausible Analytics and found some sizeable differences.
- What is time on page or average engagement time?
- The experiment
- Why the differences?
- Test the difference
What is time on page or average engagement time?
Time on Page, or average engagement time, is one of the most important engagement metrics for site owners. It helps you understand the average time your visitors spend on each page over a specific time period.
Time on page = total active time spent on a page / total unique visitors
This further helps with understanding which pages are most popular in terms of engagement, and the least.
If you're using Plausible, you would see "time on page". If you're using Google Analytics 4, you would see "average engagement time per active user". Both use the same formula as above; they just have different labels.
You could even look at āaverage engagement time per sessionā in GA4 if you want to judge average time on page per visit, instead of visitors.
The experiment
We installed both the Google Analytics and Plausible Analytics tracking scripts on a food recipes website. We then compared the data for Mar 12 - Apr 8, 2025 (a period of 28 days).
We refer to the same metric as Time on Page in Plausible, whereas itās referred to as āAverage engagement time per active userā in Google Analytics 4.
Hereās the screenshot from GA4ās āPages and Screensā report:
And hereās Plausibleās Top Pages report:
Letās compare some pages. TOP=Time on Page.
Page path | TOP (GA4) | TOP (Plausible) | GA4 captured (% of Plausible) | GA4 underreported by |
---|---|---|---|---|
/make-corn-tortillas-polenta/ | 36s | 1m 11s = 71s | 50.7% | 49.3% |
/zucchini-ricotta-fritters/ | 34s | 2m 33s = 153s | 22.2% | 77.8% |
/vegan-gnocchi-made-tofu/ | 16s | 1m 21s = 81s | 19.8% | 80.2% |
/liquid-nitrogen-ice-cream-how-to/ | 102s | 1m 45s = 105s | 97.1% | 2.9% |
/tomato-soup-blender-not-acid/ | 61s | 1m 30s = 90s | 67.8% | 32.2% |
/spaghetti-burger-pasta/ | 21s | 1m 7s = 67s | 31.3% | 68.7% |
/coconut-tapioca-pudding-with-mango-sauce/ | 51s | 2m 6s = 126s | 40.5% | 59.5% |
/spanish-tortilla-potatoes/ | 42s | 2m 8s = 128s | 32.8% | 67.2% |
As you can see, GA4 underreports by as much as 80% as compared to Plausible, while the average underreporting turns out to be 54.7%.
In other words, more than half of the actual user engagement time is missing from Google Analytics 4's reports, at least compared to Plausible, which can seriously distort content performance insights and mislead decisions based on time-on-page metrics.
What could be going on here? Why are we betting more on Plausibleās engagement time tracking method and benchmarking that against Google Analytics? Where do these differences stem from?
Why the differences?
We found two major reasons, related to how engaged time is logged in Google Analytics and how the final calculation of the average time on page works.
How is engaged time logged in Google Analytics?
Google Analytics 4 considers a session (official src) as "engaged" only if the visitor:
- spends at least 10 seconds actively on the site (by default), or
- triggers a conversion event (a designated key event), or
- views 2 or more pages during the session.
This means that if a visitor does not view a second page during their session, does not convert, and leaves before 10 seconds are up, that visit is essentially absent from all the related metrics in GA4: the total/aggregated time on page ("user engagement"), the number of engaged sessions, and eventually the average engagement time.
Not counting sub-10-second visits seems fair at first, especially if you're only interested in highly engaged visits, but the effect compounds when final average engagement times are calculated across hundreds of visitors, distorting the result by a sizable margin, as we saw in our experiment above.
But are non-engaged sessions (sub-10-second visits) really that common? Well, yes. Look at the screenshot below.
Itās the same particulars (page paths) for the same time period, but I custom-added four metrics: user engagement, engaged sessions, sessions and average engagement time per session.
Letās focus on the second row: /make-corn-tortillas-polenta/
Total sessions: 209
Engaged sessions: 103
Non-engaged sessions: 209-103 = 106
i.e., 106 sessions were considered to have an engagement time of zero as per GA4's rules and were excluded from the calculation of user engagement, engaged sessions, average engagement time, etc.
106 out of 209 sessions were considered non-engaged. That is about half of all sessions! That's one major cause of the difference.
How is the average time on page calculated?
Here, it gets more interesting, thanks to my colleague for noticing it. Look at "Average engagement time per session" for a moment. If you look closely, it's 29 seconds for the same second-row example.
If the average engagement time were calculated against engaged sessions only, we would get the following:
103 minutes of user engagement / 103 engaged sessions = 1 minute
But it's 29 seconds. Is GA4 dividing total user engagement time by total sessions (and not engaged sessions) instead? That would be:
103 minutes / 209 sessions = 0.49 minutes ≈ 29 seconds
Bingo!
It turns out GA4 divides total engagement time by all sessions, including the non-engaged ones, when calculating "Average engagement time per session" and, most likely, "Average engagement time per active user".
To sum up the two reasons above: excluding sub-10-second sessions from the logged engagement time, while still dividing by the total number of sessions, distorts the final average time-on-page value considerably.
Huh! It took me two days and unlimited patience to sit through various engagement metrics in GA4 and understand the differences and calculation method. And thatās precisely the problem: A typical user wonāt be doing that.
You would be seeing the average engagement metrics in isolation, thinking to yourself, āhah! That is my average time on pageā and move on.
It could lead to inaccurate insights. Imagine analyzing social media traffic, which typically has higher bounce rates and shorter visits. It would be very difficult to measure the performance of your content or design, or to identify pages that need improvement.
Time on page tracking in Plausible
If youāre interested in understanding actual time spent on page, itās necessary to stick to real world scenarios as much as possible. Hereās how we calculate time on page on the sites using Plausible:
- We track active engagement only when the page is in āfocusā. So we donāt count the time a visitor spent away from your website, say by switching tabs or switching apps.
- We include time on page for bounced visits (i.e. visits where only a single page was viewed).
- We count all time spent, even very short visitsādown to a single second. This is the major difference in calculating time on page in Plausible from GA4.
And so we are confident about the accuracy. Speaking of which, since unique visitors is a part of calculating average time on a page, itās important to address the differences in calculating that as well.
Calculation of unique visitors
If you compare the two screenshots of GA4 and Plausible, you would see more unique visitors in Plausible as compared to the active users (in our example websiteās case, active users are equal to total users) in GA4. We see this all the time due to different calculation methods.
This is another reason that explains the differences in average time on page between the two tools. So the only question remains, which one is more accurate?
Due to our privacy-respecting nature, Plausible counts unique visitors differently than GA4. We delete old salts every 24 hours to avoid the possibility of linking visitor information from one day to the next. This makes us inherently GDPR compliant too.
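Here's an illustrative sketch of that idea (not our production code, which is open source and written in Elixir): instead of a cookie, a visitor is identified by a hash that changes every day.

```javascript
const crypto = require('crypto');

// Illustrative only: the salt rotates every 24 hours and old salts are deleted,
// so the same visitor cannot be linked from one day to the next.
function dailyVisitorId(dailySalt, domain, ipAddress, userAgent) {
  return crypto
    .createHash('sha256')
    .update(`${dailySalt}${domain}${ipAddress}${userAgent}`)
    .digest('hex');
}
```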
On the other hand, GA4 counts unique visitors by using a combination of browser cookies (client ID), user IDs (if available), and device IDs (or app instance ID) to identify and track individual users across sessions and devices through multiple days.
But, when it comes to accuracy in counting unique visitors in the first place, many reasons come into play.
- Many visitors never get tracked in GA4 in the first place due to consistent cookie banner declines.Ā Hereās an interesting independent study that showed how every number in GA4 can be misleading due to this very reason.
- GA4 is infamous for nosy tracking and that has led many ad-blockers to prevent the GA tracking scripts from loading. Today, 31.5% of internet users worldwide report using an ad blocker. Same goes for privacy-friendly browsers.
- 58% of Hacker News, Reddit and tech-savvy audiences block Google Analytics. Read study.
So itās fair to conclude that while GA4 can technically remember a unique visitor for multiple days, itās actually blocked by multiple privacy-conscious folks for that very reason.
Whereas, Plausible is able to account for all the above cases and bring good accuracy in counting unique visitors without invading their privacy. In a nutshell:
- You donāt need to put up a cookie consent banner when using Plausible, at least from our side. We are GDPR, CCPA, etc., compliant out-of-the-box.
- We are significantly less blocked by privacy-friendly ad blockers and browsers.
- Tech-savvy audiences love Plausible and we havenāt had any reports of such users blocking our script.
Also,Ā
- We use cookieless tracking, whereas GA4 stores cookies on your browser/device which many users delete, or return through a private browser, preventing it from recognizing the same unique visitor.
- We block spam and bot data by default. GA4 users constantly battle with skewed data for this very reason. For example, when a site uses a consent management platform (CMP), the CMP might generate data center traffic that GA4 is unable to detect and exclude from its stats.
- Our data is always fresh and constantly updated. We provide real-time analytics whereas GA4 can take up to 48 hours to fully process data and update your reports.
ā¦etc. Thereās a full post on our accuracy practices that ensure we can align your stats with real-world data as much as possible.
Lastly, for the purposes of understanding average time spent on a page, it doesnāt matter which visitor spent how much time, what matters is that there was a visitor and they spent some time.
Test the difference
We invite you to conduct your own tests. Install the Plausible script side by side with your GA4 tracking script and see the differences for your site on your own.
Plausible is absolutely free to use for the first 30 days. If you find any differences, donāt forget to share them with us. :)
That's a screenshot from the publicly accessible Plausible Analytics dashboard of the European Alternatives website, which features independent, privacy-friendly digital tools.
With almost 2M visitors, 2.3M total visits, and 8.5M pageviews, this library is no longer a niche project: it's the go-to destination for users looking for independent, privacy-friendly tools. Over 1 million of the ~2 million all-time visitors came in 2025 alone, and we're only in March.
Let's use the analytics dashboard to see what trends are emerging and what the demand looks like for EU-built, privacy-friendly tech.
- 1100% surge in 2025 traffic
- Reddit surpasses Google, while privacy-friendly search engines gain traction
- Top 5 categories: What are people looking for?
- Not just a European trend: demand goes beyond the EU borders
- European countries leading (and lagging) in the shift to privacy-friendly tools
- Which European tools are winning?Ā
- Need a hand in deciding which tools to go with?
- Privacy-friendly web analytics is on the rise
1100% surge in 2025 traffic
If we filter the dashboard by the "Year to date" period, i.e. 2025 so far, we see there have been 1.3 million visitors to the website, a surge of 1,100%.
Another interesting number is the "time on page": 2m 47s spent on average. This goes to show that people are interested and engaging deeply with the library.
Reddit surpasses Google, while privacy-friendly search engines gain traction
Look at the Top Sources report for 2025 so far:
Google is typically the dominant search driver for most websites, but not here. Reddit ranks as the second-largest traffic source, bringing in 311K visitors, significantly more than Google (193K). This isn't something we usually see with sites.
This suggests that people arenāt just searching for alternatives but are actively discussing and recommending them in forums, subreddits, and privacy-conscious communities.
But hereās another interesting insight: Privacy-friendly search engines are also key traffic sources:
- DuckDuckGo: 33.8K visitors
- Ecosia: 13.5K visitors
- Qwant: 10.8K visitors
This says something about the increasing preference for privacy-friendly tools, i.e., more people are actively using privacy-friendly search engines to discover other privacy-friendly alternatives.
Top 5 categories: What are people looking for?
A quick look at the āTop Pagesā report tells us about the most popular top 5 categories:
- Email providers
- Search Engines
- Cloud Computing Platforms
- Navigation Apps
- Web Analytics Services
Upon expanding the report, we can take a deeper look at the most popular categories in terms of unique visitors, while comparing the engagement metrics.
B2B categories win?
While categories like email providers, search engines, and navigation apps attract a large B2C audience, B2B categories are more diverse and consistently rank high.Ā
Businesses are actively seeking privacy-friendly analytics, hosting, and infrastructure alternatives, driving growth in EU-built tools.
Not just a European trend: demand goes beyond the EU borders
Looking at the all-time Countries report: European countries make up the bulk of visitors (as expected), but the United States ranks fourth, sending over 100K visitors to the site!Ā
Even Canada and India make an appearance in the top 25, contributing decent amounts of visitors to the site.
If you scroll all the way down, or simply look at the āMap view,ā thereās hardly any part of the globe thatās not participating in the shift towards privacy-focused digital tools.
Privacy-conscious users are everywhere.
European countries leading (and lagging) in the shift to privacy-friendly tools
Germany leads in visitor numbers with 379K, contributing 20.5% of the all-time traffic! It is followed by the Netherlands (141K) and France (135K). These countries show significant engagement with European digital alternatives.
On the lower end, Guernsey recorded 130 visitors, while regions like Vatican City State and Saint Martin (French part) had 3 each. The variation in engagement levels may reflect differences in population size, internet adoption rates, etc.
Which European tools are winning?Ā
The dashboard has an Outbound link click goal configured, which is useful in understanding which listed tools eventually attract real visitors. Once we filter the dashboard by this goal (see it here), we can see which URLs were clicked.
From these results, these are the top 10 tools receiving traffic from the EU alternatives website:Ā
- Mailbox.org (email provider)
- ProtonMail (email provider)
- Soverin (email provider)
- Startpage (search engine)
- Qwant (search engine)
- Posteo (email provider)
- Ecosia (search engine)
- Scaleway (cloud provider)
- Startmail (email provider)
- Good search (search engine)
And if we look at the all-time data, there are a whopping 1.2M total outbound clicks from the website to such tools; that's more than half of the total all-time visits to the library.
And conversions aren't just from Europe: the U.S. audience alone has a 16.8% conversion rate (check here)!
Need a hand in deciding which tools to go with?
We recently vetted many EU-built B2B tools and picked out a few of our favourites. Theyāre all:
- Built in the EU: Companies headquartered in a European country.
- Hosted in the EU: Ensuring your data doesn't leave European borders and stays compliant with European privacy laws.
- GDPR-compliant: Tools that align with European data protection laws.
- High quality: Competitive with mainstream solutions.
- Privacy-focused: Respecting user data and following GDPR regulations.
Check out our handpicked list of 16 privacy-focused European tools.
Privacy-friendly web analytics is on the rise
We started Plausible Analytics six years ago as a privacy-first, GDPR-compliant, cookieless and muuuch easier-to-use alternative to Google Analytics.
So, we took a look at how our category is doing:
- 5th most popular category in the library
- 9th most visited page overall
- More than a 2,700% increase in unique visitors in 2025
The demand for privacy-friendly web analytics is booming. If youāre looking for an alternative, check out our privacy-first approach.
Scroll depth tracking has always been a crucial metric for businesses, marketers, and site owners. It helps you see how far users scroll down a page, and understand the content performance, engagement levels, and areas for improvement.
Yet, setting up scroll tracking has always been a tedious process ā no matter which tool you switched to. For instance, Google Analytics 4 (GA4) only offers basic tracking at 90% depth as part of their enhanced measurement, which is only useful in knowing whether visitors are scrolling all the way down or not.
To track other percentages, like 25, 50 or 75 percent, you need to set up custom events in Google Tag Manager, which involves a learning curve and requires time and patience.
Whereas other alternatives either donāt allow tracking this metric or require complicated and technical setups and/or custom reporting to track scroll depth.
At Plausible, weāve simplified this completely. Scroll depth tracking is now built into our analytics by default, requiring no setup or additional configurations.
We automatically track and display the percentages to which your visitors scroll. Available at no extra cost, this feature is live in your dashboard and ready to provide actionable insights.
- What is scroll depth?
- Introducing automatic scroll depth tracking in Plausible Analytics
- Whatās wrong with other web analytics toolsā scroll tracking?
- What is a good scroll depth?
- Ready to track your page scrolls?
What is scroll depth?
As the name suggests, scroll depth is the vertical length to which a visitor scrolls a particular page on your website. This metric is often reported as a percentage, such as 25, 50, 75, or 90 percent.
If they scroll all the way down the page (generally including the footer), it's a 100% scroll event. If they scroll halfway, it's 50%. And so on.
This insight helps you:
- Understand how visitors interact with your content, identifying which sections capture attention and which are overlooked.
- Optimize content placement by determining the ideal locations for important elements like calls-to-action, ensuring they are positioned where users are most likely to see them.
- Improve page layout by analyzing scroll patterns.
For example, if youāve got a long-form article with a call-to-action (CTA) near the bottom, scroll depth can tell you how many of your readers actually made it to that point.
Some tools also measure this in pixels (usually rounded values). It's better to track it in percentages, though, because that makes it easier to judge the engagement level of a page and to compare it against other pages.
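To make the percentage idea concrete, here's a simplified sketch of how a browser script can work out a page's scroll depth; it illustrates the concept and is not our exact tracking code:

```javascript
// Simplified illustration: how far down the page has the visitor scrolled, as a percentage?
function currentScrollDepthPercent() {
  const viewportBottom = window.scrollY + window.innerHeight; // bottom edge of what's visible
  const pageHeight = document.documentElement.scrollHeight;   // full height of the page
  return Math.min(100, Math.round((viewportBottom / pageHeight) * 100));
}

// A tracker typically remembers the maximum depth reached during the page view
let maxScrollDepth = 0;
window.addEventListener('scroll', () => {
  maxScrollDepth = Math.max(maxScrollDepth, currentScrollDepthPercent());
});
```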
Introducing automatic scroll depth tracking in Plausible Analytics
The scroll depth metric is now live in your dashboards!Ā
If you donāt have an account with Plausible, you can check out the Top Pages report in our live demo to see what this looks like.
Hereās everything you need to know about it:
Tracked by default
Scroll depth is measured automatically for every page on your site that is tracked with the Plausible script.
No sending scroll depth events, no plugins, no tag managers, no extra cost, no extra time and effort.
This is a rare feature that you wonāt easily find in any other web analytics tool.
Tracked at all page heights
We track page scrolling at every scroll depth percentage: 1, 2, 3, 4, ..., 33, 34, ..., 67, 68, ..., 98, 99, 100%.
It's not limited to predetermined points like 25, 50, 75, and 90 percent, as in some other tools.
Found with the Top Pages report
You can find the scroll depth metric in two main areas of your dashboard:
- The top row of metrics in the dashboard, whenever a page filter is applied. You can click on the metric to display its performance over time on the line graph as well.
- The expanded Top Pages report (i.e. Details section), where you can even sort pages by scroll depth for deeper insights.
If thereās no data available yet, scroll depth will simply display as ā-ā until sufficient traffic is captured.
If you are new to Plausible, you can refer to the gif above to visualize this, or play around with our live demo.
Available with the official Plausible WordPress plugin
This metric is available with our official WordPress plugin as well.
Set your Scroll Depth goals
Scroll Depth goal tracking in Plausible goes beyond measuring how much of a page was scrolled.
You can set up a Scroll Depth goalĀ with a specific scroll depth percentage threshold to see how many visitors scroll to and beyond your desired scroll depth mark.
To get started with Scroll Depth Goals, go to your websiteās settings in Plausible Analytics and visit the Goals section.
See how here.
Group pages to see the average page scroll depth
Another thing you can do in the Plausible dashboard is group similar pages and see the average scroll depth for them.
For example, if you want to see the average scroll depth across your blog posts, and you know that all your blog posts contain the word "blog" within the path of their respective URLs (as is the case with our website), you can use the Filter option to filter for such pages. Here is how:
Click the āFilterā option located beside the time period selector. This will open a dropdown, where you can select the first option that says āPage.ā This will open a modal as shown in the following screenshot.
Open the dropdown list located below the āPageā section and select the ācontainsā operator. Type a word like āblogā (like the example above). Click the āApply Filterā button.
This way, your dashboard will show the average scroll depth for only the blog posts on your website, which happens to be 52% for the last 30 days in the case of our website:
By the way, if you pay closer attention to this example, youāll see that only the pages containing the word āblogā in their URLs are being displayed in the āTop Pagesā report.
Whatās wrong with other web analytics toolsā scroll tracking?
Other web analytics tools:
- Either donāt allow tracking the scroll depth metric, such as Cloudflare Analytics amongst others,
- Or require complicated and technical setups and/or custom reporting to manage, track and view their scroll depth data.
Letās take a look at Matomo and Google Analytics, which are popular choices for web analytics.
How to track scroll depth in Matomo?
Matomo offers three ways to track scroll depth:
- Install the Matomo Heatmaps and Session Recording plugin. Then, by viewing the heatmaps, youād be able to automatically see how far down visitors scroll on your pages.
- Use the Matomo Tag Manager. Matomo has its own tag manager, like the Google Tag Manager. To see scroll depth data in your Matomo dashboard, youād need to create a new tag, configure the event with specific scroll depth percentages, and set up the appropriate trigger.
- Manually track scroll percentage using a JavaScript code on your website.
P.S. We also have a list of other similarities and differences between Plausible and Matomo as web analytics tools for your easy analysis.
How to track scroll depth in Google Analytics?
Google Analytics has been the default choice for web analytics for years. Yet, the built-in scroll tracking (available as an enhanced measurement, which also needs to be turned on manually) in GA4 continues to be limited as it only allows tracking at 90% scroll depth.
This is only useful in knowing whether visitors are scrolling all the way down or not. Therefore, the solution suggested by GA experts is to disable it entirely and implement a more customized solution within Google Tag Manager.
Here is an overview of the steps to do so:
- Disable scroll tracking in the GA4 interface.Ā
- Configure a new scroll depth trigger in Google Tag Manager. Go to Triggers -> New -> Scroll Depths -> Vertical scroll depths -> Percentages -> enter values for which you want to track scrolling depth. Name the trigger and save it.
- Test your trigger using the debugger and preview mode and verify the setup by checking whether the data is correctly collected in the data layer.
- Create a new tag in GTM: select the tag type "GA4 event" and pick the relevant configuration tag from the dropdown. Enter the event name and set the event parameters. Choose the trigger you created earlier. Name and save the tag.
- Open GA4 and test your setup with DebugView.
- Create a custom dimension in GA4 settings. Decide the dimension name and scope. Select the relevant parameter (set up earlier within GTM). Save it. Wait up to 24 hours and it will be available in your reports. By the way, you can have up to 50 custom dimensions only.
As you can see, this setup is time-consuming and tricky for those unfamiliar with Google Analytics and Tag Manager. Youāll need to test rigorously and need to maintain your setup over time as both your website and GA evolve.
Many engage freelancers and agencies to help with such setups, which is expensive and time-consuming. There are many other things that are much easier to do in Plausible than Google Analytics 4.
What is a good scroll depth?
There isnāt a universal āgoodā scroll depth because it depends on the purpose of the page in question. For a blog post, youād ideally want most users to reach at least 75% of the page, as thatās where the conclusion or CTA often resides.
On landing pages, especially ones designed for conversions, a scroll depth of 70-90% or more can be the goal, as it signals that users understand the offering and reach the final CTA or contact form.
Similarly, if most users drop off before 25%, it might indicate issues with your header or introduction. But if you have a CTA around that mark and your bounce rates are low, it could also mean that visitors converted around that point.
Or if engagement drops significantly at 50%, perhaps your content isn't holding attention, or the layout is causing friction. And if many users aren't even scrolling past the 10% mark, you have probably targeted the wrong audience.
So, it all depends! This is why you need a dashboard to compare all metrics and make meaningful conclusions.
In Plausible, you can combine this metric with other metrics such as time on page and bounce rate to understand engagement patterns properly. It can be measured against visitors and pageviews, among other things.
Ready to track your page scrolls?
Plausible prioritizes simplicity, privacy and accuracy ā all at the same time. Our goal is to remove unnecessary complexity while providing the insights you need to improve your website and business.
Europe has been building world-class digital tools for years. A major advantage is that many of these tools prioritize privacy and open-source development by default.
Millions have been exploring European alternatives this year (2025). If youāre looking for alternatives to mainstream big tech services, hereās a handpicked list of high-quality European alternatives for your business.
- Criteria for choosing these tools
- Privacy-friendly European B2B tools (A-Z)
- AppSignal (Datadog alternative)
- Brevo (Mailchimp alternative)
- BunnyCDN (Cloudflare alternative)
- Crisp (Intercom alternative)
- DeepL Translate (Google Translate alternative)
- Element (Slack & Microsoft Teams alternative)
- LanguageTool (Grammarly alternative)
- Mistral AI (ChatGPT alternative)
- Morning Score (Ahrefs & SEMrush alternative)
- Mullvad (ExpressVPN alternative)
- Odoo (Salesforce & SAP alternative)
- Passbolt (1Password & LastPass alternative)
- Plausible Analytics (Google Analytics alternative)
- ProtonMail (Gmail and Outlook alternative)
- Tally Forms (Google Forms and Typeform alternative)
- Whereby (Zoom and Google Meet alternative)
- Final thoughts
Criteria for choosing these tools
We selected these tools based on:
- Built in the EU: Companies headquartered in a European country.
- Hosted in the EU: Ensuring your data doesn't leave European borders and stays compliant with European privacy laws.
- GDPR-compliance: Tools that align with European data protection laws.
- High quality: Competitive with mainstream solutions.
- Privacy-focused: Respecting user data and following GDPR regulations.
Note: This list is based on information available in March 2025. If a tool is listed as GDPR-compliant, it is based on the vendorās own claims. Always verify compliance for your specific needs.
Privacy-friendly European B2B tools (A-Z)
Letās go alphabetically as we have no order of preference:
AppSignal (Datadog alternative)
AppSignal is an intuitive APM (application performance monitoring) tool for developers that helps you track performance, spot errors, and monitor the servers and uptime of your apps. It's easy to use and powerful at the same time.
Based in: The Netherlands
Hosted in: EU
GDPR compliant? Yes
Cost: Starts at €18 per month (30-day free trial)
Brevo (Mailchimp alternative)
Brevo is a comprehensive email marketing platform. It also helps you manage customer relationships across email, SMS, chat, and more, bringing communication and support into one place.
Based in: France
Hosted in: EU (source)
GDPR compliant? Yes
Cost: Free to start
P.S. If you just need a transactional email service, try Scaleway TEM (hosted in the EU).
Self-hosted alternatives
Quick note: Self-hosting may require some developer hours, but if you have the expertise available, it gives you full control over deployment and infrastructure, eliminating concerns about where a third party might be hosting your data in the cloud.
If you are looking for a comprehensive list manager, check out Listmonk ā a fully open-source, simple newsletter and mailing list manager.
BunnyCDN (Cloudflare alternative)
BunnyCDN is a Content Delivery Network (CDN) designed to enhance website performance by caching and delivering content through a global network of servers.Ā
Unlike many big-tech CDNs that track user data, BunnyCDN focuses on speed and efficiency without invasive data collection. They also include features like image optimization, video delivery, and edge storage.Ā
P.S. We use BunnyCDN at Plausible and have been happy users for a long time now.
Based in: Slovenia
Hosted in: Global (a CDN, by its nature, cannot be hosted from a single location).
GDPR compliant? This needs to be checked for your specific case because Bunny is global, but according to their website, āno user-identifiable data is collected or processed whenever possible.ā
Cost: Pay as you go (14-day free trial)
Crisp (Intercom alternative)
Crisp is a business messaging platform that unifies live chat, email, and chatbot automation in one place.
It offers features like a collaborative inbox, AI-powered chatbots, CRM integration, help desk management, etc. They also have a mobile app.
Based in: France
Hosted in: EU. Messaging data is stored in the Netherlands and plugin data is stored in Germany. However, their relay data is stored in the USA, UK, and Singapore (which they plan to change).
GDPR compliant? Yes
Cost: Free plan available
Self-hosted alternatives
Chatwoot ā an open source customer engagement platform. It provides omnichannel support, allowing businesses to manage customer conversations across email, live chat, social media, and messaging apps.
DeepL Translate (Google Translate alternative)
DeepL Translate is an AI-powered translation tool known for its accuracy and privacy focus, making it a strong alternative to Google Translate.
- Based in: Germany
- Hosted in: Iceland and Sweden (source)
- GDPR compliant? Yes
- Cost: Free for basic use
Element (Slack & Microsoft Teams alternative)
Element is an open-source app for team communication, powered by an open protocol called Matrix and built by the creators of Matrix.
It keeps messages private with end-to-end encryption. Because Matrix is decentralized, Element users can chat with people on other Matrix apps and servers without being tied to one provider. You can even self-host your own Matrix server.
Based in: UK (not EU)
Hosted in: EU (source)
GDPR compliant? Yes
Cost: Starts at ā¬5 per user/month, paid annually (Free app available for personal use)
LanguageTool (Grammarly alternative)
LanguageTool is an AI-based, open-source, multilingual grammar and spell checker supporting over 30 languages. They have a Chrome extension, Google Docs add-on, and a desktop app as well.
It also comes with features to help track your productivity, see an overview of languages used, errors made, etc., so you can track your improvements over time. You can also self-host.
Based in: Germany
Hosted in: Dublin, Ireland (source)
GDPR compliant? Yes
Cost: Free
Mistral AI (ChatGPT alternative)
Mistral is a French AI startup with its own chat app called Le Chat, similar to ChatGPT, DeepSeek, etc. They also published an open-source model a while back, which you can run on your own.
Based in: France
Hosted in: Sweden (source)
GDPR compliant? Yes
Cost: Free
Morning Score (Ahrefs & SEMrush alternative)
Morning Score is a user-friendly SEO tool designed to simplify keyword tracking, competitor analysis, and website optimization while respecting privacy.
- Based in: Denmark
- Hosted in: Germany
- GDPR compliant? Info not available
- Cost: Starts at ā¬49/month (free trial available)
Mullvad (ExpressVPN alternative)
Mullvad is a privacy-focused VPN service with over 700 servers in 38 countries. It provides apps for Windows, macOS, Linux, iOS, Android, and a Firefox add-onāall of which are open-source and available on GitHub.
True to European values, Mullvad VPN has a very strong stance on privacy which is clear upon visiting their homepage.
Based in: Sweden
Hosted in: A VPN cannot be restricted to a single hosting location; however, they claim that all their VPN servers run from RAM and don't use any shared compute resources. Given these claims, it seems worth a look.
GDPR compliant? Yes
Cost: ā¬5 per month flat
Odoo (Salesforce & SAP alternative)
Odoo is an open-source enterprise resource planning (ERP) software that integrates multiple business applications into a single platform. They have a wide range of modules, including CRM, sales management, e-commerce, warehouse management, accounting, manufacturing, and human resources.Ā
This modular approach helps businesses customize the system to their specific needs.
Based in: Belgium
Hosted in: Data stored closest to your region, and you can request to change it (source)
GDPR compliant? Yes
Cost: Free
Self-hosted alternatives
While the community edition of Odoo is open source, they do have a proprietary offering with additional features. In case youāre looking for a fully open source offering, try ERPNext, which many consider easier to self-host and manage.
Passbolt (1Password & LastPass alternative)
Passbolt is an open-source password manager for secure team collaboration. It offers end-to-end encryption using OpenPGP standards, ensuring that only authorized users can access stored data. They have been around for over a decade.
You can also self-host it.
Based in: Luxembourg
Hosted in: EU
GDPR compliant? Yes
Cost: Free
Plausible Analytics (Google Analytics alternative)
We're Plausible Analytics, and after using Google Analytics for many years, we believe we have created an alternative that's privacy-first, simple to use, lightweight, and much better at certain things.
We don't use cookies, so there's no need for cookie banners. We don't collect personal data, so there's no need for GDPR and CCPA consent prompts either.
Weāre open source and can be self-hosted too.
Based in: Estonia
Hosted in: EU
GDPR compliant? Yes
Cost: Starts at $9 per month, cheaper for an annual subscription (30-day free trial)
ProtonMail (Gmail and Outlook alternative)
ProtonMail is an email service that emphasizes security and privacy through end-to-end encryption. The service is accessible via webmail, as well as Android and iOS applications. They have a strict no-logs policy, ensuring that even ProtonMail cannot access user emails.Ā
With features like Hide-my-email aliases, calendar and drive, they offer a compelling alternative to Google and Microsoft.
Based in: Switzerland
Hosted in: Switzerland, Germany, and Norway (source)
GDPR compliant? Yes
Cost: Free
Tally Forms (Google Forms and Typeform alternative)
Tally Forms is a free and intuitive form builder. You build your form by working in a text-document-like format.
It also offers advanced features like conditional logic, signatures, calculations, file uploads, etc. In other words, a better alternative to Google Forms or Typeform.
They initially created it to replace expensive big-tech options and have been at it for about five years now, completely funded by customers.
Based in: Belgium
Hosted in: EU
GDPR compliant? Yes
Cost: Free
Whereby (Zoom and Google Meet alternative)
Whereby is a user-friendly, browser-based video conferencing tool, requiring no downloads or logins for guests.
It offers features such as screen sharing, customizable meeting rooms, and integrations with tools like Trello, Google Docs, and Miro Whiteboard. Whereby also provides an API for embedding video conferencing capabilities into websites and applications.
P.S. We use Whereby at Plausible for internal video calls.
Based in: Norway
Hosted in: User data is stored in Ireland. Being fully EU-hosted isn't entirely feasible since they serve a global audience and need to maintain video routers worldwide; however, users in a European country will connect to a data center physically located within the EEA.
GDPR compliant? Yes
Cost: Free
Self-hosted alternatives
If you wish to self-host, check out Jitsi.
Final thoughts
By choosing European-built alternatives, you support businesses that respect privacy, security, and local data regulations. Whether self-hosted or cloud-based, these tools provide viable, high-quality replacements for big tech solutions.
Do you have any suggestions? You can write to us at [email protected].
If you notice that your website traffic is down, the first thing to do is investigate the reasons. Don't panic: you were probably hit by a Google core update, something may be off on the technical side, the site may be responding to a new trend, or you may simply need to write more/better content.
Interestingly, at the time of writing this post, we are also noticing a drop in our website traffic over the last 30 days. This compelled me to investigate on our end too and created the perfect opportunity to cover this topic.
To do this, you need a web analytics tool and, if the situation demands, access to your SEO tool (or similar free alternatives) and Google Search Console.
By walking through what you can do in such tools, and using our own investigation as an example, we will demonstrate how to diagnose the issues causing your website's traffic to decline.
- Analyzing the traffic drop
- Analyzing an organic search drop
- Concluding investigation
- How much traffic drop is too much drop?
- Youāre in good hands
Analyzing the traffic drop
There are mainly five things you need to check to analyze your traffic drop, as defined in the five sections below. This can be applied to any web analytics tool you are using.
We used our Plausible dashboard for this analysis. However, if youāre using a different tool like Google Analytics, we have mentioned the equivalent reports in GA4 to access the same information.
Letās begin.
Determine if itās a drop or a trend
The first thing to do is try to understand whether your traffic has been declining for a while or whether you are experiencing a sudden drop for a day or two.
If you are experiencing a sudden drop, do check out the Google Search status page to see if there are any crawling, indexing, or similar issues from Googleās side. And then, align the dates in the reported incidents with the dates in your web analytics tool.
Taking our own traffic-drop example, I was noticing a drop for both the past 7 days and the past 30 days. I switched on our comparative view to compare the last 30 days with the 30 days prior to that:
As you can see, while our unique visitors and total visits are both down by 3%, our engagement metrics like views per visit, bounce rate and visit duration have declined a bit too.
Thatās the first clue: since people have been visiting fewer pages in a visit, spending less time on the site and bouncing off more, something might be off with the content.
I repeated the same exercise for the past 7 days and the drops are still there. When I check for the āyear to dateā time period (i.e. effectively, past 2 months), the traffic arrows are green again.
So I also checked the past 12 months altogether to see if there were bigger things to worry about, and that showed improvements upwards of 50%, so we should be good. There's nothing big to worry about.
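If you'd rather pull these top-line comparisons programmatically (for a recurring report, say), a rough sketch against Plausible's Stats API could look like the one below. It assumes the v1 aggregate endpoint and an API key; the site_id, dates, and metric list are placeholders, so check the current API docs for exact parameter names.

```ts
// Rough sketch: compare two 30-day windows via Plausible's Stats API.
// Assumes the v1 aggregate endpoint; site_id, dates, and the API key are placeholders.
const SITE_ID = "example.com";
const API_KEY = "<your-plausible-api-key>";

async function aggregate(dateRange: string) {
  const params = new URLSearchParams({
    site_id: SITE_ID,
    period: "custom",
    date: dateRange, // "YYYY-MM-DD,YYYY-MM-DD"
    metrics: "visitors,pageviews,bounce_rate,visit_duration",
  });
  const res = await fetch(`https://plausible.io/api/v1/stats/aggregate?${params}`, {
    headers: { Authorization: `Bearer ${API_KEY}` },
  });
  // The aggregate endpoint returns { results: { visitors: { value: n }, ... } }.
  return (await res.json()).results as Record<string, { value: number }>;
}

const current = await aggregate("2025-02-01,2025-03-02"); // last 30 days (example)
const previous = await aggregate("2025-01-02,2025-01-31"); // the 30 days before that

for (const metric of Object.keys(current)) {
  const change =
    ((current[metric].value - previous[metric].value) / previous[metric].value) * 100;
  console.log(`${metric}: ${current[metric].value} (${change.toFixed(1)}% vs. previous period)`);
}
```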
A couple of tips if youāre a Plausible subscriber:
- You can send custom properties along with your pageviews or custom goals to determine, for example, whether specific actions were taken by logged-in users. The Plausible dashboard can be filtered by these properties too, so you can do a more nuanced analysis. You can even compare the trend for logged-in vs. logged-out visitors (see the sketch after this list).
- You can set traffic drop notifications by defining a 12-hour visitor threshold.
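For the first tip, here is a minimal sketch of what sending a custom goal with custom properties can look like in the browser. The goal name and property keys are purely illustrative, and the plausible() function is the one exposed by the standard Plausible snippet:

```ts
// Illustrative only: "Logged-in Action", "logged_in" and "plan" are made-up names.
// window.plausible is exposed by the standard Plausible tracking snippet.
type PlausibleFn = (event: string, options?: { props?: Record<string, string> }) => void;

const plausible = (window as Window & { plausible?: PlausibleFn }).plausible;

// Fire a custom goal with custom properties; the dashboard can then be
// filtered by these properties (e.g. logged_in = "true").
plausible?.("Logged-in Action", {
  props: { logged_in: "true", plan: "trial" },
});
```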
Where to find this info in GA4?Ā
Hereās a detailed explanation.
See which traffic channels are contributing to lower traffic levels
In Plausible, you can see a straightforward Top Channels report. You can check out what each channel means here.
Using GA4? You can open your āTraffic Acquisitionā standard report and continue investigating using the same principles we describe below.
To investigate, see which channels are showing a decline. In Plausible, itās indicated by the red, downward arrows.Ā
Upon hovering over any such arrow, you'll see the drop in percentage terms as well. How to interpret it?
- If your Direct traffic has been hit, you can investigate, for instance, if your brandās popularity recently went down due to some new competitors.
- If your organic traffic is down, it could be a content quality or technical SEO issue or probably even the AI wave eating up your organic traffic.Ā
- If your referral traffic has dropped, you can zero in on exactly which referrers these are and investigate what may have happened on their end.
ā¦and so on.Ā
You can also click on any such channel to see the exact sources within it (e.g. Twitter, Reddit, LinkedIn, etc., in organic social), along with their respective traffic contributions and changes compared to the previous period.
For us, out of the top four significant channels, organic search, referral and organic social are the three with declines.
Referral traffic is something we don't typically try to influence or buy, so it may not be worth investigating further. If anything, producing content is the way to go for us, so we continue investigating the other two declining channels: organic search and organic social.
If I segment the dashboard by clicking on "Organic Search," I can see that Google, DuckDuckGo, Brave, etc., have all contributed to the drop. Out of these, Google is the most significant traffic contributor.
As for organic social traffic, the most significant source is GitHub. Out of the channels we actively post on, only LinkedIn seems to be dropping, and its absolute numbers are only in the hundreds.
Also, a little decline in organic social traffic is natural: it's well known that social media algorithms these days try to keep people from leaving their platform.
Interested people do make it to your website through the links in your profile description or when you post something truly interesting, but it's tougher to get organic social traffic nonetheless.
Since the biggest culprit until now seems to be Google search traffic, itās possible to drill further down by seeing the exact search terms that bring Google traffic to you. This is available in the Plausible dashboard through a GSC integration, or you can simply open up your Search Console in another tab.
It could also be an algorithm update, lost backlinks or technical issues with crawling. We have a whole section on analyzing an organic search drop down below, but first letās check the other reports for more information and eliminate any other possibilities.
Which pages are affected?
This information is available in Plausible Analyticsā Entry Pages report and it can help spot the exact landing page or pages seeing the drop.
Using GA4? You can open your āLanding Pagesā standard report and continue investigating using the same principles we describe below.
If you see a pattern in such pages, or only 4-5 specific pages with significant drops, you can instantly get closer to the reason. For example:
- If you see that a blog post is seeing a traffic drop this month as compared to last month and you know that it was going viral last month, then the drop is natural and you probably donāt have much to worry about. Or if that wasnāt the case, you can check the specific blog post for any SEO issues like drop in rankings, outdated content, or any technical issues.
- If you see a pattern in the kind of pages seeing a drop, such as the pages in your SaaS app, you can compartmentalize better and know that you likely need to turn your attention to your product marketing efforts.
For us, upon checking the Entry Pages (i.e. landing pages) report, almost all pages have seen a drop. So it's a sitewide thing, and no single page or pattern can be blamed.
This suggests that only the traffic-sources side of the equation is affected, as we haven't made any recent changes to the website either.
Could this be a geographical or devices related thing?
This information is available in the Locations and Devices reports in Plausible.
Using GA4? You need to select country, region, or city as a primary and/or secondary dimension in the Demographics reports, one at a time. The same needs to be done to get the browser, OS, device info, etc., within the "Tech details" report.
To analyze,
- If you see certain regions with drops, you'd know where to look next.
- If you see specific devices or operating systems with drops, you can check for technical issues. For example, a sudden drop in traffic from mobile phone users could mean that your site is no longer accessible on phones. In this case, get your tech team involved.
In our case, our top five countries are consistent whether or not we apply the organic search and organic social channel filters. So, staying focused on the organic traffic drop observed so far is enough.
It's a similar story with our devices report: the browser, OS, and screen size data simply reflect what our visitors use and don't tell us anything new about the traffic drop.
So, thatās eliminated.
Is it a ātime of the yearā thing?
If you are investigating a traffic drop in February, what was happening in February last year? This information can also be pulled from the Plausible Analytics dashboard (provided you were using it a year ago too).
Hereās what to do:
➡️ Choose a custom range (or press "C") and set February 2024 (as per this example).
➡️ Select "previous period" in the vs dropdown.
For us, no significant pattern emerged. And we know that we are not a seasonal business so it does not apply to us.
Using GA4? Use different time ranges, such as past months, or corresponding dates from previous years to account for seasonality. This will help identify trends and significant shifts in user behavior or traffic sources.
Analyzing an organic search drop
The conclusion so far in our personal investigation is that Google search traffic is the main culprit for causing a drop in traffic for the Plausible site.
This is also one of the most common reasons for seeing a drop in traffic so itās worth checking the following while investigating any traffic fluctuations.
Check if Google penalised the site
To check if Google has issued a manual penalty to your site, you can check for any notifications in Google Search Console. You can also check the Security and Manual actions section in the Console for any detected issues. We did this for Plausible and found no such penalties.
There are also algorithmic and site-wide penalties. An algorithmic penalty causes a drop in your rankings if you don't follow Google's guidelines.
A site-wide penalty is severe and usually causes the entire site to be marked as spam and deindexed. A quick way to rule this out is to go to Google and type "site:domain keyword", swapping the keyword with a main search term usually used for your site.
If Google returns no results, or far fewer than normal, for your site, you may be battling a partial or full site penalty.
As for us, thatās not the case. The next thing to do is check for lost keywords, backlinks, broken links, and/or rankings.
Check with an SEO tool
Many SEO tools provide an option to conduct a site audit, which is the fastest way to surface any lingering issues. Other than that, look at your keywords and backlinks reports.
Hereās what I found from our Keywords report after sorting it by lost traffic column (Look at the orange āChangeā column):
The first row shows us losing traffic (by 83) and ranking (by 13 positions) for the keyword "as goals". But that's a good thing, because it's not a relevant keyword, and this drop actually leaves us with quality traffic only!
As for the next keywords like āgoogle analytics alternatives,ā āself hosted analytics,ā etc., I did a manual Google search from our top 5 countries to find out if we really did drop in rankings and that wasnāt the case for my search at least.
So why is this tool showing a drop in rankings? It's difficult to pinpoint. It could be a one-off thing, maybe Google's algorithm knows that I prefer Plausible as a search result, or the changes have since been reset.
It at least doesnāt matter at this point because we know thereās nothing much to do on our end.
I also checked our Backlinks report (backlinks are a commonly cited SEO ranking factor) and it shows a decline too, but we never trade in backlinks and it's all organic for us, so we believe this will recover on its own.
However, if you notice a decline and have an SEO team or an SEO strategy, you can improve your backlinking efforts.
Confirm there are no technical issues
Technical SEO, or general technical health of a website, ensures that search engines can efficiently crawl, index, and render a website. Slow site speed, indexing issues, crawl errors, server errors, etc., are all factors that can affect your siteās technical health.
Thereās a good guide by Semrush if you want to get deeper into this, but a quick and comprehensive way to check for any technical issues is to run an audit with your SEO tool.
If you donāt pay for an SEO tool, you can use Lighthouse Metrics. Simply add a page URL from your site and you can get performance scores on its metrics like first contentful paint, speed index, etc.
You can also see areas for improvement on accessibility and SEO. Any score in green is good.
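If you prefer to script a quick check yourself, Google's PageSpeed Insights API (which runs Lighthouse under the hood) can be queried with a simple request. The sketch below reflects our understanding of the v5 API, with the page URL as a placeholder, so verify the endpoint and response fields against the current documentation.

```ts
// Hedged sketch: fetch a Lighthouse performance score via the PageSpeed Insights v5 API.
// The endpoint and response structure are assumptions based on the public docs.
const PAGE_URL = "https://www.example.com/"; // placeholder: a page from your site

const endpoint = new URL("https://www.googleapis.com/pagespeedonline/v5/runPagespeed");
endpoint.searchParams.set("url", PAGE_URL);

const response = await fetch(endpoint);
const data = await response.json();

// Lighthouse reports category scores on a 0-1 scale; multiply by 100 for the familiar number.
const score = data?.lighthouseResult?.categories?.performance?.score;
console.log(`Performance score for ${PAGE_URL}: ${score != null ? score * 100 : "n/a"}`);
```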
If the issues seem to be of a larger scale, itāll become important to involve your engineering team.
Concluding investigation
Luckily, we didnāt have much to worry about. We will continue doing what we do with our content marketing while maintaining content quality. If you do happen to find anything worrisome, thatās not too bad because now you are not operating in the dark and can take corrective measures.
But itās worth taking a sanity checkā¦
How much traffic drop is too much drop?
This depends on:Ā
Your normal traffic levels
If you have 100 visitors in a month and notice a 3% drop, then that's probably nothing to worry about since you're still very close to your average. But if you have 1M visitors and you see a 3% drop, that means you're down to 970K visitors, a loss of 30K visitors.
The same 3% drop can mean very different things for different traffic levels. So do look at absolute traffic levels too before getting overwhelmed with red, downward arrows. :)
A small dip might be statistically insignificant for smaller websites, while for high-traffic sites, even a seemingly small percentage loss can translate to substantial revenue and engagement losses.
Industry and industry trends
A 10% drop for a SaaS business can be considered bad. Many SaaS companies rely on steady, predictable growth, and a sharp drop could signal problems such as customer churn, shifting market trends, or increased competition.
Meanwhile, a 10% drop for a news site that thrives on viral content may not be as concerning, especially if the dip coincides with fewer trending stories or seasonal variations.
Some industries naturally experience fluctuations due to external factors. For example, travel websites might see a dip in off-peak seasons, while e-commerce stores often see drops after big sales events like Black Friday.
Personal goals and observations
Itās also essential to align expectations with personal goals. If your aim is aggressive growth and you see even a slight dip, it might warrant an investigation. But if youāre maintaining steady traffic levels without significant business impact, a small drop may not be alarming.
For instance, if your traffic has dipped but conversion rates on important goals (like sign-ups or purchases) haven't, it's alright.
Whether the drop is obvious
No matter what your traffic numbers are, a 50% drop can't be good (unless there's a strong reason, such as being in a seasonal business).
A sudden, significant drop often signals a major issue, like a Google penalty, site indexing problems, technical errors, or an algorithm update. If you see such a drastic decline, immediate action is necessary to diagnose and fix the problem.
In cases where seasonality is a factor, a steep drop might be expected. For instance, a tax-related website might see traffic plunge after tax season ends.Ā
However, if the decline is unexpected and sustained, itās a red flag that warrants deeper investigation into possible causes like ranking drops, content performance, or shifts in audience interest.
Youāre in good hands
Fret not, traffic fluctuations are a natural part of site ownership. With the right tools and consistent monitoring, you can quickly identify dips and take corrective action as needed.
If you want to try out an extremely simple way of tracking traffic and ditch Google Analytics, do sign up for our free trial and join our 14k+ happy subscribers.
We are privacy-friendly and GDPR-compliant by default, our tracking code is less blocked by ad blockers, we automatically keep bot traffic at bay, and are proud to be a more accurate alternative to Google Analytics.
Good luck!
It's an interesting time to be an online business, catching up with rapidly shifting "trends" around AI, increasing concerns about consumer privacy, and the one you can't miss: shifting SEO, one of digital marketers' favourite channels for more than two decades.
Some people on LinkedIn and in other niche communities claim that "SEO is dead" (trust me, it's a commonly thrown-around phrase), while others beg to differ and encourage adapting.
Meanwhile, AI adoption keeps growing around the world, privacy-conscious users are increasing, and it can all feel overwhelming. We try to take an objective view of everything and answer what SEO looks like in 2025, how to adapt, what to take care of, and how to measure it all in Plausible.
- Is SEO dead in 2025?
- So, what to do?
- Analyzing organic search traffic
- Why use Plausible for SEO analysis?
Is SEO dead in 2025?
This is a good time to revisit what SEO even means. āSearch Engine Optimizationā is the method of optimizing whatever content you push on the web in a way that search engines like to present it (preferably on the first page of search results) to the people who search for related queries.
There are essentially two parts to this "optimization." One is hunting keywords to understand what people are asking on search engines and, if relevant to you, writing content that answers those questions. The second is making the content presentable, easily readable, accessible, etc. (the on-page, off-page, and technical SEO side of it).
The second part is something that content makers should do anyway, i.e. provide a good reading experience. But what lies at the core of it all is the first part. So let's break that down by asking three questions:
Have people stopped asking questions on the web?Ā
Not at all (have you?).
Is the place and the way everyone searches for answers on the web evolving?Ā
Certainly. Think AI chatbots, privacy-first browsers, etc.
Has your overall organic traffic dropped?
Hmm, moment of truth. So, I decided to check ours at Plausible. I simply compared the last 12 months' organic search traffic, and it has actually increased by 12%.
Last 3 months looked stable too. Since Google is the main organic search source that sends the most traffic and is generally equated with SEO, I just looked at traffic sent by Google searches, and the patterns are still similar for us.
If click-through rates from Google search result pages have been dropping (because SEO is supposed to be dead and AI Overviews are the new normal), it doesn't seem to have affected us, at least.
Not a big sample size, but since we seem to be on the "safer" side, it serves our purpose of studying what is working if SEO supposedly isn't.
So, traffic has been stable and growing from organic searches. Next, we look at the breakdown of organic sources. Apart from Google, it shows some sources like DuckDuckGo, Brave, ChatGPT, Perplexity, etc.
The pattern here is that privacy-focused search engines and AI chatbots are showing up noticeably more than in previous years.
The conclusion is simple. Search Engine Optimization is not just about gaming the top search engine in the world anymore, but the ability to be discovered by all the other āreplacements.ā If I could, I would rename it to organic search optimization.
Yes, there are a few exceptions. Say you are an ed-tech company that compiles math exercises as a content strategy; AI could be taking away some of your traffic by engaging with students in real time, sharing math exercises according to the set difficulty level, helping with formulae, exam questions, etc. In such cases, you will most likely need to adapt the whole content or business strategy itself.
Or, if the nature of questions in an AI chat doesn't call for a referral link to be shared or a product to be recommended, you could lose out on some traditional traffic. But I do think the balance will be maintained, because people need solutions at the end of the day and they will find your high-quality product/service one way or another.
So, is SEO dead? Yes and no.
Yes: traditional SEO that only tries to game the system could be dead, or at least struggling. The difference is that search engines have gotten better (and continue to improve) in weeding out content thatās optimized for ranking and not for providing value.
No: If youāve been providing valuable content. All the newer AI bots, search engines, etc. will find, recommend and cite your content on their own.
So if youāve been making high-quality content answering real questions of your target audience and the nature of your business allows you to stay afloat, you should be good to go.
So, what to do?
Take care of technical SEO
Since the new age channels are good at discovering your content on their own, we need to make sure that our content is indeed ādiscoverable.ā This would mean the following:
- Optimize crawling and indexing: Use robots.txt, XML sitemaps, and canonical tags to ensure search engines index the right pages (a minimal robots.txt sketch follows this list).
- Improve site speed and performance: Minify CSS/JS, enable caching, use a CDN, and optimize images for faster loading.
- Ensure mobile-friendliness and keep the Core Web Vitals in check.
- Enhance website security: Use SSL certificates, HSTS, and security updates to protect against vulnerabilities.
- Optimize internal linking and site architecture: Use a logical hierarchy, breadcrumbs, and internal links to improve navigation and SEO.
- Consider adding an llms.txt file.
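For the crawling and indexing point, a minimal robots.txt along these lines is often enough. The disallowed path and the sitemap URL are placeholders, so adapt them to your own site (the same "plain text file at the site root" idea applies to an llms.txt file):

```txt
# Illustrative robots.txt: adjust the disallowed paths and sitemap URL to your site.
User-agent: *
Disallow: /admin/
Allow: /

Sitemap: https://www.example.com/sitemap.xml
```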
Optimize for all the āsearch enginesā
As we just saw, organic discovery sources have been spread among newer types of platforms. Beyond Google, optimize your content for alternative search engines like DuckDuckGo, Brave Search, Bing, etc., if you can.Ā
These platforms are gaining more traction due to increasing privacy concerns and evolving user behavior. You can start by ensuring your website is indexed properly on these search engines, testing how your content ranks there, and finding any newer search queries that might be there to answer.
AI traffic is real
Chatbots like ChatGPT, Perplexity, Claude, Deepseek, etc., have become discovery sources. These tools scrape the web for answers and provide direct responses to users. They:
- Suggest a product, leading to āDirectā traffic,Ā
- Cite a source, leading to āOrganic Searchā traffic on the cited source, or
- Do a web search for you, show results, and answer your question by assimilating information from those sources. This would also lead to āOrganic Searchā traffic.
Interestingly, we saw a traffic surge of 2,200%+ in 2024 just from AI channels. The whole study, along with ways to "AI-optimize," is here.
Speaking of which, it's worth taking a moment to observe how AI companions are being used now: as a personal note taker, movie recommender, finance manager, strategist helping with big business decisions, and whatnot.
This has raised concerns over the vulnerability of personal and sensitive data, since AI gets access to users' routines, financial data, and other sensitive information. It's speculated that this will inspire a lot of policies and regulations around the world this year, which brings me to the next point.
Run along with privacyĀ
This is a bit of a detour from our main discussion about organic search traffic, but since privacy as a factor is changing user behavior, I feel compelled to talk about it as an essential piece of the puzzle.
Your consumers care more and more about privacy. Becoming more customer-oriented than algorithm-oriented can be a key strategy. Consider these:
- Companies like Apple have used privacy as a competitive advantage, educating and protecting their users from tracking with features like Private Relay, App Tracking Transparency, etc.
- Companies like Meta, Amazon, Google, WhatsApp, TikTok, etc. have faced GDPR fines running into millions of dollars in recent years. In fact, Meta was fined €1.2 billion in May 2023. All this for mishandling personal data.
- Third-party cookie tracking is nearing its demise, which we covered here.
- Increasing use of VPNs, ad blockers, and private browsing tools: another study we did revealed that 58% of tech-savvy audiences use such methods.
- There is a shift towards self-hosted and decentralized services as users move away from centralized Big Tech platforms.
- Younger generations are increasingly aware of data rights and privacy risks.
- If you run a quick Reddit search, you'll notice communities of privacy-anxious individuals with a combined total of 1.7M members.
If this is not enough, know that 120 countries have established privacy and security regulations, just like the GDPR, to protect their people's data privacy.
Itās much easier to build a privacy-friendly tech stack for your business rather than navigating all those regulations with costly legal aid.
Respecting and proactively protecting your customersā privacy can very realistically emerge as a brand differentiator, and hence, help your overall traffic and business.
Analyzing organic search traffic
Letās get to the measuring organic search (the ānew SEOā) part of the equation. Hereās everything you want to be able to do:
- See all organic search traffic together, while seeing a breakdown of specific sources like Google search traffic, Perplexity traffic, and so on.
- Understand which content is effective in driving such organic trafficĀ
- Understand which search terms drive traffic the most
- Track engagement metrics like time on page, bounce rate, scroll depth, etc.
- Assess how well organic traffic contributes to conversions like sign-ups, purchases, form submissions, etc.
All of this is possible with Plausible. If you have been in SEO even for a bit, you'll know how tough it can get to analyze your traffic in Google Analytics, the traditional choice.
For the research work (keyword analysis, competitive analysis, etc.), we suggest using specialized SEO tools. But analyzing this traffic on your website needs a web analytics tool, and that's where Plausible comes in.
You can play around with our live demo, if you donāt have a Plausible account.
See overall organic search traffic
We have a pre-built āTop Channelsā report in the dashboard which automatically consolidates relevant traffic sources into their respective channel types such as Direct, organic search, organic social, etc.Ā
This makes it super convenient to see all organic traffic together rather than piecing different traffic sources together from different reports, or relying on custom reporting.Ā
When you click on the organic search entry (as shown in the case study above), your dashboard filters by it to only show data relevant to organic search, and you get a "Top Sources" report with a breakdown of the individual organic search sources.
This is an expandable report that shows the sources by visitors, bounce rate, and visit duration, which you can sort by as well. You can click on any of such sources to filter further. Take a look here.
Assess effective content
In the filtered dashboard, you can use the Entry Pages report to see which pages attract the most organic search visitors. This is also an expandable report with metrics you can use to sort your landing pages by.
You can click on any row to further filter the dashboard and study that landing page: its bounce rate, geographies it was visited from, goal conversions in the sessions acquired through this page, etc.Ā
Assess effective keywords
You can break down your Google search traffic further by integrating your Google Search Console account with Plausible. Youāll be able to see the search terms that got you traffic from Google directly in the Plausible dashboard.
Compare the queries driving traffic with the Entry Pages report to see which content aligns with search intent. Over time, you can monitor which keywords gain traction and adjust your content strategy accordingly.
Correlate conversions and other things
Now you can also see your goal conversions (learn how to set them up here) in this filtered dashboard, along with the devices used, operating systems, popular locations, etc.
You can mix and match filters to create your segments and analyze organic traffic further. For example, I can apply the following four filters to assess how many people signed up for our free trial from organic searches, landing on the homepage, in England.
You can check out this exact segmented dashboard here.
Why use Plausible for SEO analysis?
Plausible is an open-source, simpler, and privacy-friendly alternative to Google Analytics.
- Everything is in one place. Unlike traditional tools like Google Analytics, where you need multiple reports or custom explorations, we keep it simple with a single-page dashboard. In fact, it's easier to track visits, exit pages, conversions, and a lot more in Plausible than in GA4.
- Our stats are very accurate because we take special measures to ensure they are. This is another differentiator from most analytics tools.
- We donāt rely on cookies or invasive tracking, ensuring privacy-friendly analytics and out-of-the-box compliance with GDPR, and similar laws around the world.
P.S. Now you can even make your own SEO dashboard that suits your needs the best in Looker Studio using our brand new connector. Iām thinking of making one as a template for SEO professionals, should I?
Different businesses or teams have unique goals, workflows, and preferences for tracking and visualizing their data. For marketers and marketing agencies, creating custom dashboards is crucial to help clients visualize and understand their data. One powerful tool for this purpose is Looker Studio by Google.
This beginnerās guide provides a detailed exploration of Looker Studio, covering everything from basic visualizations to advanced techniques like data blending and calculated fields.Ā
Additionally, weāre excited to introduce our all-new Plausible Analytics Looker Studio Connector, now live in beta, which simplifies integrating Plausible data into Looker Studio for more flexible and powerful reporting.
Using the example of replicating a Plausible Analytics dashboard, we'll introduce you to what Looker Studio is and its capabilities, while showcasing how to create sophisticated, custom reports.
- What is Looker Studio?
- How to use Looker Studio?
- Plausible Analytics connector for Looker Studio
- Creating a simple report in Looker Studio
- Advanced uses of Looker Studio
- Try it for yourself!
What is Looker Studio?
Looker Studio is what was previously known as Google Data Studio. It is a data visualization tool designed with the intent of helping you create custom reports and simplify their interpretation through visualizations that you or your clients prefer.
It allows you to bring data from multiple sources into one place, transforming complex information into clear, easy-to-digest reports. This helps you get actionable and valuable insights from raw data.
Looker Studio is free to use as a self-service business intelligence tool, though a paid Looker Studio Pro tier is also available.
In a nutshell, you can do the following with Looker Studio:
- Use visuals like tables, pie charts, bar graphs, etc.
- Select the specific data and custom metrics you want to showcase.
- Customize fonts, colors, and overall design. Or even do something like incorporating your clientās logo for personalization.
- Share reports with others, giving them permission to either view or edit the reports based on your preferences.
Looker Studio is not limited to Google's tools like Google Analytics 4, Google Ads, Google Sheets, etc.; it also connects with various third-party tools whose data and analytics are useful to businesses, such as a CRM, Meta Ads, LinkedIn Ads, product analytics, etc.
The reports can be dynamic, meaning they update whenever the original data source changes, ensuring your reports always reflect the most current information.
How to use Looker Studio?
If you're new to Looker Studio, you can start by creating an account here. The dashboard offers a variety of templates, from simple data overviews to detailed analyses.
You can choose a template based on your needs and/or the audience of these visualizations. For example, if you are a digital marketing team, you can look at the templates showcasing key KPIs like conversion rates, impressions by channel, and audience engagement.
But the more important part is choosing a data connector. This basically means which tool you want Looker Studio to source its data from, so that you can create custom reports from it.
As a crash course on how to use Looker Studio, we will take the example of our own connector.
Plausible Analytics connector for Looker Studio
Plausible Analytics is a simpler, privacy-friendly, and more accurate alternative to Google Analytics that now comes with the official Plausible Analytics Looker Studio Connector.
This helps you add some sophisticated and powerful reporting features that help turn Plausible into an even better replacement for Google Analytics.
It allows you to link your Plausible data with Looker Studio and integrate it with all your other data sources to produce custom and flexible reports in seconds. See our documentation on how to start using the Plausible Looker Studio connector.
Creating a simple report in Looker Studio
To create our first report in Looker Studio, we will create a simple replica of the default Plausible Analytics dashboard. This will give us a feel for what fields are available in the Looker Studio connector and how we can begin to create our own custom data visualizations.
You can explore this simple report template that weāve created in Looker Studio which you can use to start building your own custom reports.
First, a brief overview of how Looker Studio works. On the right-hand side, you will see a toolbar listing the different fields available, and above that, the different visualization options.
When you insert a visualization such as a time series chart, you will have the option to add fields as āDimensionsā, āMetricsā or āFiltersā. You can also specify how you want the data sorted and apply some custom styling.
Letās see this in action by recreating the top graph in the Plausible Analytics dashboard, which looks like this.
Time Series graph
The most prominent part of the dashboard is the line chart that displays the number of āUnique Visitorsā over time. In order to recreate this, we will select a āTime Series Chartā from the right-hand toolbar or you can do the same from the top menu under āInsertā.
Then, we will simply need to select āDateā as our dimension and āVisitorsā as our metric.
Depending on what time range you want to use your chart for, you can set things up differently in Looker Studio. If you want to see daily or weekly numbers, you should use the āDateā dimension. For weekly, you can change the way that Looker Studio reads the āDateā field by clicking on the calendar icon and changing āData Typeā to āDate & Time > ISO Year Weekā.
If you want to see annual or monthly data, you could either have Looker Studio do it for you by changing āData Typeā to āDate & Time > Year or Year Monthā, or you could select āYear or Year Monthā as your dimension instead of āDateā (both ways work the same).
Finally, if you want to see hourly data, you should use the āTimeā dimension. Once weāve configured our fields, we should have something that looks like this.
Scorecards
Going back to the Plausible dashboard as our guide, next we want to add the individual metrics across the top. In Looker Studio, these are called āScorecardsā. Letās add one by going to āInsertā and choosing āScorecardā.
The configuration for a scorecard is simple: you just need to pick the metric that you want to highlight. Let's start with "Visitors"; then we can simply copy and paste the scorecard and update the metric for visits, pageviews, views per visit, bounce rate, and visit duration.
To add the comparison to the prior period, we can select "Comparison Date Range" and choose "Previous Period". This tells Looker Studio to automatically calculate the change based on the date range you have selected, so if you're looking at the last 30 days, it will take data from the 30 days before that and tell you what the difference is.
Once weāve set up all of our scorecards, we have something like this.
Date range
Speaking of date ranges, this is a good time to add one to our report. To do this, you simply go to āInsertā, choose āDate Range Controlā and click anywhere on your report.
This will give you a dropdown menu that enables you to select the date range you want to view for your entire report. When you change this date range, any comparisons that youāve enabled (like our scorecard) will automatically update as well.
Tables
Then we can move down our Plausible Analytics dashboard and recreate some of the tables you will find there. Tables in Looker Studio are one of the most versatile ways you can use your data as you can add several dimensions at once and export to CSV or Google Sheets.
For our purposes, we will look at the "Countries" and "Devices" sections of the Plausible dashboard, as these particularly show the strength of Looker Studio. In order to recreate the list of countries, we simply need to select "Insert", choose "Table", and then choose "Country Name" as our dimension and "Visitors" as our metric.
But then if we want to add āRegionā and āCityā, we can actually just add those directly to the same table by adding those as new dimensions.
So now we have a table that gives us the granular detail of each country, region, and city combination, something you would otherwise have to click through item by item to see in the Plausible dashboard.
You can see this as well with the āDevicesā table. By selecting āDeviceā, āBrowserā and āOSā as dimensions, you can see the stats for each individual combination.
Filters
Finally, letās add some filters to our data. Looker Studio offers simple and advanced filtering and for now weāll stick to the basics. To add a simple drop-down menu that will enable you to filter by different dimensions, you can go to āInsertā and choose āDimension Controlā.
Then you simply need to select what dimension you want the drop-down to use. For our example, letās add one that corresponds to the Plausible dashboard: āSourceā.
Once you add this and click it, you will see all the sources that referred traffic to your site. By selecting one or many, you filter the report accordingly.
Considerations
Creating your own customized reports gives you the power to use your Plausible Analytics data in new and interesting ways, but it also means that you will be exposed to some of the limitations of how different data fields can or cannot be combined.
One of the primary considerations to keep in mind is that some dimensions are based on events (every action that takes place on your site) while others are based on visits (sessions that take place on your site). Depending on which category a dimension falls into, different metrics will be available. In general, page, hostname and goal are event dimensions while all others are session dimensions.
Bounce rate, visits, and visit duration can only be used in combination with session dimensions, while the events metric can only be used with event dimensions. If you use an invalid combination of dimensions and metrics, you will either see null values for the invalid metric or an error in Looker Studio.
For example, if you try to use entry page as a dimension together with events, you will see null values because entry page is a session dimension (it keeps track of the first page that a user visited during their session).
You have several ways that you can work around this. First, you can use visitors or visits as a metric with session dimensions like entry page. Second, you can use page (which is an event dimension) with the events metric. And finally, you can use entry page in a filter and then pick an event dimension for your table like goal name. This way, you can see event-level details for a list of goals corresponding to the landing pages that you specify via the filter.
In general, filter dimensions do not have the same limitations as dimensions that you add directly to your charts and tables, so this can be a good alternative in many cases.
Another consideration is that goals and custom properties have an additional conversion rate metric that can only be used when one of these fields is either added as a dimension or used in a filter. In order to get the number of unique conversions, you should use the visitors metric and in order to get total conversions you should use the events metric.
So to create the table in the Plausible dashboard that shows unique conversions, total conversions and conversion rate by goal, you would have a table that looks like this in Looker Studio.
Advanced uses of Looker Studio
The Plausible Analytics Looker Studio connector proves even more useful for advanced applications that cannot be replicated in the Plausible dashboard.
Here we will cover a few examples of different advanced uses. You can also see our advanced Looker Studio template.
Combining elements
One of the effects featured in the advanced dashboard is a scorecard with a line-chart background, letting you quickly grasp the trend of the metric you are displaying.
You can accomplish this effect by layering two Looker Studio components one on top of the other. In this case, we have a āTimeseries Chartā and a āScorecardā, with both using the same metric: āVisitorsā.
In Looker Studio, you can control the order that objects are displayed by right-clicking on an element and selecting āOrderā. You will then have the option to send an element up or down relative to other elements in your report.
In our case, we have set our scorecard to be the same height and width as our chart, and we have then set the order of the chart so it is below the scorecard. Finally, we just need to set the background color of the scorecard so it is transparent enough for the chart to show through. You can do this by going to "Style", choosing "Background and Border", and setting "Background".
Custom groups
The next element on the advanced dashboard is a stacked line chart that uses a custom grouping of data to show the split of direct vs. non-direct traffic.
To achieve this breakdown we will be using the āStacked Area Chartā visualization and we will be selecting āDateā as the dimension and āVisitorsā as the metric. Then under āBreakdown Dimensionā, we will need to select āAdd Metricā and choose āAdd Groupā.
This will open up a screen where you can configure a new custom āData Groupā. In our example, we want to use the dimension āChannelā. By default, this field will indicate what channel a visitor used to visit the website including direct as well as organic search, email, organic social and others. We can configure our own groups that are direct and non-direct, by specifying that the direct group should exactly match the value direct while anything that doesnāt match will be grouped as non-direct.
Once we have done that, we can now use our direct traffic group in our chart as our breakdown dimension.
Calculated field
One element of the Plausible dashboard that we didn't fully recreate in the basic Looker Studio dashboard is the table showing the percentage breakdown of traffic by country. Previously, for simplicity, we stopped at total numbers of visitors without showing the percentage.
The reason for this omission is that displaying the percentages requires that we use a calculated field. To do this, we click on our table and go to āSetupā and under āMetricā, we can select āAdd Metricā and choose āAdd Calculated Fieldā.
This opens up a screen where you can create your own custom calculated fields based on the data that is already available in the report. In our case, we want to create a new metric called ā%ā that simply returns the visitors metric in a new format.
We will select āPercentā under āData Typeā and āPercent of totalā under āComparison calculationā. This tells Looker Studio that we want our new metric to calculate the percentage of the total for each row in our table.
Once configured, we can now see the percentages in our country / regions / cities table.
Advanced filters
Previously, we looked at simple Looker Studio filters that can be accomplished by adding drop-downs to the report. Looker Studio also allows for more advanced filtering that can be done at the level of individual elements.
To demonstrate, we will build a stacked bar chart that shows two specific goals over time: "visit /register" and "Sign up for a trial". This is a useful view for tracking the performance of a specific register page over time in terms of sign-ups. Note that these two goals relate to our own Plausible dashboard; you'll need to use goals that you've set up on your site.
To start off, we will insert a āStacked Column Chartā and we will select the āDateā as the dimension, āGoal Nameā as the breakdown dimension and āVisitorsā as the metric. Remember that when dealing with goals, the visitors metric gives the number of unique conversions.
But now we have a chart that shows all of our goals rather than the two we are interested in, so we need to add a specific filter to the chart. We can do this by going to "Chart", selecting "Setup", then "Filter", and finally "Add a Filter".
This brings up a screen that enables us to configure our advanced filter. We will select "Goal Name" as our dimension and "Include" as we are selecting the conditions to include data ("Exclude" could be used if we wanted to filter out these two goals instead). Then we will select "In", which enables us to list the goal names that we want to filter for.
For other situations, Looker Studio offers the ability to check for equals, contains, and starts with, as well as RegEx matching.
Data blending
Now that we have a chart that shows the performance of our goals over time we might want to calculate the % relationship of one goal to the other to see what percentage of visitors completes this stage of our conversion funnel. In our example case, we might want to know the conversion rate by day of our registration page, in other words, what is the number of sign-ups divided by the number of visits to the register page.
You might think that we could simply create a calculated metric like before where we take a percentage of the total but unfortunately in this case Looker Studio will give you the percentage out of all of the goal conversions rather than just the two goals that we want to see.
As a result, we need to blend our data in Looker Studio. This enables you to create custom data views by joining data together based on fields, filters and join conditions that you specify.
Letās look at how it works in more detail. First, you go to āResourceā, select āManage blendsā and click on āAdd a blendā. Then we need to configure the blend based on the fields that we are interested in and specify how we want to join the data together which in our case will look like this.
In the left table, we want "Date" as the dimension and "Visitors" as the metric. Then we add a filter that matches "Goal Name" equal to "Sign up for a trial". This will be the numerator for our calculation, as it gives us the visitors that signed up for a trial, by date.
Now we need to configure the right table to give us our denominator. Here we configure everything the same, except that the filter now matches "Goal Name" equal to "visit /register". This gives us the visitors that visited the register page.
It is helpful to rename the metrics on each side so you donāt mix them up. We will call the left-hand metric āSign Upsā and the right-hand metric āVisit Registerā.
Finally, we need to configure the join condition. We will select "Right Outer" and join on date, which means that dates with "visit /register" visits but no sign-up conversions are still included.
Now that we have our data blend ready, we just need to select it as the data source for a table. Then we can select date as our dimension and "Sign Ups" and "Visit Register" as our metrics. Finally, we can create a calculated metric that divides "Sign Ups" by "Visit Register" (see the sketch below).
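As a rough sketch (assuming the blended metrics were renamed to "Sign Ups" and "Visit Register" as above), the calculated field can use the "Percent" data type with a formula along these lines:

```
SUM(Sign Ups) / SUM(Visit Register)
```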
With that calculated field, we now have our table that shows the conversions by date as well as the conversion rate for that specific step in our customer journey.
Advanced formatting
Finally letās see some more advanced ways you can format elements in Looker Studio by looking at the bottom section of the advanced template. First, weāve created a horizontal bar chart that shows the channel breakdown of visitors to different entry pages on our blog.
To create this, we've simply set "Entry Page" as our dimension, "Channel" as our breakdown dimension, and "Visitors" as our metric. Then we've added a filter for "Entry Page" contains "blog".
This gives us an interesting but unwieldy chart with too many rows and colors to comprehend properly. To make the chart more useful, we can do two things. First, we limit the number of rows to 25 by going to "Chart", selecting "Style", and setting "Bars" to 25.
Then we go to "Chart", choose "Style", then "Series", set the number to 4 and check "Group Others". This caps the number of channels shown as stacked bars at three, with all the others grouped into a fourth "Others" category.
With these style settings, weāve now created a concise chart that gives us a quick view of our top 25 blog posts and where the main sources of traffic are coming from.
Finally, underneath this chart we have a table that conditionally formats rows based on whether they fall within the top 80% of blog traffic. This way, we can do an 80/20 analysis and see which blog posts are the most important in terms of driving new traffic.
To do this, we create a table with entry page as the dimension and visitors as the metric. Then we need a calculated metric that keeps a running total of the visitor percentage. We can do this by selecting "Percent of Total Relative to Corresponding Data" under "Comparison Calculation" and "Running Sum" under "Running Calculation".
Then we just need to use this to style our table by going to "Chart", then "Style", "Conditional Formatting" and finally "Add". This brings up a screen that allows us to configure our conditional formatting rule.
Here we select our total % calculated field and specify that anything "Less Than" 0.8 should be highlighted. With this, every blog post that falls within the top 80% of traffic will be highlighted in the table.
Try it for yourself!
I hope this post was a good introduction to Google Looker Studio. If youāre looking for an easy to use, open source, lightweight and privacy-friendly alternative to Google Analytics with an official Google Looker Studio connector, do explore Plausible Analytics. All the best!
You can track almost any activity you want on your site with modern web analytics tools. And the process has only gotten simpler during the last decade.
Itās now extremely affordable and easy to track anything and everything you want for your website. Plus, the need for making ādata-driven decisionsā has never been higher.
This has caused a "metric overload." The excitement of getting to track everything takes away from the clarity and value of the metrics that are actually useful. It's even worse if the site owner is already confused about the site's purpose.
Many sites track too many metrics. If some metrics seem to be performing badly, the next action is to add a few more.
But extra metrics don't always mean extra insights; in many cases, they mean more confusion. In other words, this gives the illusion of doing something right in the online world but actually amounts to more motion and less movement.
The question, though, is which metrics are actually useful for a website to track. This resurfaces the fundamental question: what is the purpose of analytics? And how do you decide which metrics to track for your website?
- What is the purpose of analytics?
- What is the purpose of my website?
- Which metrics do we track at Plausible and why?
- Tips for choosing the right metrics
- Mission drives metrics
What is the purpose of analytics?
Analytics exists to shed light in a dark and confusing room. It exists to show the facts, i.e. the health of a website (and, to an extent, of the business or entity behind it), and to bring objectivity to strategy-making.
It turns complex, unorganized data into useful information: it simplifies the complicated so you can eventually create the most effective and actionable strategy. A strategy to reach an end goal, which is usually earning money, though websites can (and often do) have other purposes, as we will see below.
What are metrics?
Metrics are the things you see on your dashboard. Anything that can be measured on your site is a metric: for example, pageviews, bounce rate, exit rate, conversions, conversion rate, screen size, etc.
For a fuller overview, you can see the list of metrics our subscribers track with Plausible.
What is the purpose of my website?
In a typical website, there are far too many things that can be tracked: which buttons were clicked, how much time was spent on which page, which conversions occurred, which forms were filled, if the light or dark mode was enabled, which browser was the traffic from Germany using, if the traffic from Reddit signed up for the newsletter, and endless more.
Itās not about tracking everything. Itās about tracking the right things that align with your websiteās purpose. If you end up tracking a lot, it defeats the purpose of analytics and causes more confusion than clarity.
So take time to think about what your website is meant to do and match that purpose with that of analytics.
For example, the purpose of an educational institution's website is to provide learning materials, communicate with students, and offer online courses. So its performance metrics could be:
- Enrollment rate: Percentage of site visitors who enroll in courses.
- Course completion rate: Percentage of students completing online courses. (if relevant)
- Bounce rate: Visitors who leave after visiting only one page (important for course pages).
- User engagement: Time spent on learning resources or tutorials.
- Return visitors: Students returning for more content or courses.
This contrasts sharply with an e-commerce website. Its purpose is to help consumers browse, research, and purchase products or services. So it makes sense for it to track the following:
- Conversion rate: Percentage of visitors who make a purchase.
- Cart abandonment rate: Percentage of visitors who add items to their cart but donāt complete the purchase.
- Average Order Value (AOV): The average value of each order placed.
- Traffic sources: Identifying where visitors are coming from (paid ads, organic search, etc.).
- Customer lifetime value (CLV): How much revenue a customer is expected to generate over their lifetime.
Even the most complex businesses can be reduced to a high-level simple definition. It may help to start there.
Even better is if you know what your mission as an entity is (for example, it's Plausible's mission to spread privacy-friendly and simple analytics).
Here are a few more examples to get you thinking:
- A service-based business might focus on tracking contact form submissions, appointment bookings, or phone calls to measure interest in their offerings.
- A nonprofit organization could prioritize tracking donations, volunteer sign-ups, or petition submissions to measure support for their cause.
- An educational website might track course enrollments, student progress, or the completion rate of online modules.
- A portfolio website for creatives may want to measure the number of views on specific projects, inquiries for services, or downloads of resumes or portfolios.
- A news or media website might track page views, ad revenue, and social media shares to gauge the popularity of articles and overall site traffic.
- A membership site could focus on tracking membership sign-ups, retention rates, or content engagement from members.
- A community-based website may prioritize metrics like forum activity, member interactions, or event participation to gauge user involvement and community growth.
Every other metric is simply noise, unless there's a good purpose for tracking it. For example, it's fine to track how many mobile users an e-commerce site has if you plan to make a mobile application for it.
Purposeful metrics bring clarity, while the rest can distract you from the main goal. Let's take Plausible's case.
Which metrics do we track at Plausible and why?
We are an analytics tool ourselves, so it would be easiest for us to track whatever we want, but we keep it limited to only a handful of metrics, as is visible in our live dashboard.
Our marketing philosophy and bias towards simplicity act as anchors in deciding what to track. Our marketing philosophy is to create content that:
- Educates new people about privacy-friendly analytics and that a solution like Plausible exists, and
- Keeps our existing stakeholders informed.
We donāt exactly do any lead nurturing, promos or retention programs for example. Thatās because we believe once the person has been aptly informed about Plausible and privacy-friendly analytics, our job is more or less done.
No more tactics are needed to nurture the people and āfeed the metricsā for a short-term illusion of success. Meanwhile, we try to take care of things like retention with the highest quality of product and support.
This helps us decide on which metrics to track, and more importantly what not to track.
For instance: in order to follow our mission and marketing philosophy, we have some core pages on our site: the Plausible vs Google Analytics page, simple analytics, in-built compliance with privacy laws, our privacy-first nature, high accuracy of analytics, etc. Basically itās everything in our siteās navbar and footer.
So we see if the unique views on these pages are increasing over time, and if the time on page remains adequately high. If itās not the case, we can analyze the reasons with the dashboardās help and take corrective actions as necessary.
We also need to understand how many sign-ups we get, as that too is tied to our mission of spreading privacy-first analytics and helps measure our progress as a business. So we track it as a goal in Plausible.
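For context, goals in Plausible can be pageview goals or custom events. As a minimal sketch (in TypeScript) of sending a custom event goal from a page, assuming the standard Plausible script is already loaded and a custom event goal exists in the dashboard (the goal name "Signup", the form selector, and the "plan" prop are illustrative, not our actual setup):

```ts
// Sketch only: goal name, selector, and prop are illustrative.
declare global {
  interface Window {
    plausible?: (event: string, options?: { props?: Record<string, string> }) => void;
  }
}

document.querySelector("#signup-form")?.addEventListener("submit", () => {
  // Records one conversion for the "Signup" goal for this visitor.
  window.plausible?.("Signup", { props: { plan: "trial" } });
});

export {};
```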
But we have never tracked our pricing section, for example. Similarly, we don't track which of the "Get started" buttons on the homepage brings the most registrations.
We could, and it would be nice-to-know information, but it doesn't really help us: it's trivial compared to the main metrics that keep us simple and clear. And that is exactly the kind of noise site owners should cut out.
It may be helpful to start looking at the following tips.
Tips for choosing the right metrics
Donāt track everything
If you track too many things, especially without purpose, it can lead to āanalysis paralysis.ā Itās the feeling of being overwhelmed with data (rather, being trapped in an endless maze of data) and not knowing what to do with it.
Doing so might show you a lot of activity, but what is it all telling you? Tracking too many metrics can distract you from your main goal.
Instead, focus on a few key metrics that truly reflect your website's performance and health. For example, a health and wellness website's purpose might be to share fitness plans, nutrition advice, and wellness tips. So it would only track the following:
- Page views: Number of visits to specific health or wellness articles.
- Conversion rate: Sign-ups for fitness plans, consultations, or online sessions.
- Social media referrals: Frequency of the siteās content being shared on social media and how much traffic it brings.
Align metrics with individual teams' goals
Each part of your business will have different priorities. For example:
- Marketing might want to track traffic sources and campaign performance, like how many visitors come from ads or social media.
- Sales teams might care about conversion ratesāhow many visitors turn into paying customers.
- Customer service might want to know how many people contact you for help or leave reviews.
It's okay to track it all from a single dashboard as long as there is internal clarity and the metrics all connect back to overall business goals. This also keeps everyone aligned and focused on what matters most.
A good rule of thumb: if your web analytics tool only allowed you to track the three most important metrics, which would you pick?
Track metrics over time
One of the most important things to remember is that metrics should be tracked over time, not just on a single day or week. Trends are often more valuable than isolated data points.
For example, tracking how page views on your site increase over several months tells you if your content is becoming more popular. A sudden drop in traffic might tell you thereās a problem, but understanding these patterns over time helps you spot opportunities or risks early.
Keep it simple
In the end, simplicity is key. Choose a small set of important metrics that give you the clearest picture of your website's performance. Simple metrics lead to clarity and actionable strategies.
You can always adjust or add more metrics as you grow, but start simple and build from there.
Mission drives metrics
Do you have a web analytics tool on your website? Why? What, in your view, is important to track on a daily, weekly, monthly, quarterly, and yearly basis? If your answer is just revenue, think again. Think about the website's purpose from your customers' perspective.
Do you track some metrics because everyone does, or do they serve a purpose? Is it form submission rates? Time on page? Exit rates? Why? What can be cut from this?
All the best! :)
In 2024, the Plausible website saw a ~2,200% increase in referral traffic from four AI search engines: ChatGPT, Perplexity, Claude, and Phind (refer to the screenshot above). These numbers were in the hundreds in 2023.
AI has officially changed the way we search for answers to the questions on our minds. Earlier, AI models had a knowledge cutoff date, no internet access, and limited use cases. Eventually, their information became more up to date, which transformed the way AI chats are used.
AI tools have also started citing their sources and sending traffic to websites. Many businesses and individuals are curious about how they can appear or be mentioned more often in the answers given by AI: so-called "AI optimization" or "AI search optimization."
The need seems somewhat more urgent given the lower click-through rates to websites from Google SERPs because of Google AI Overviews. Gartner says search engine use will drop by about 25% by 2026 because of AI chatbots and other virtual assistants.
Coming back to our traffic surge from AI channels, we did not exactly apply any āAI search optimizationā techniques but something seems to have worked well for us. We will analyze this traffic in the Plausible dashboard to figure out what that is.
While we are at it, we did figure out some AI optimization ātechniquesā that are worth noting. Letās go.
- Analyzing Plausibleās AI search traffic boost in 2024
- What can you really do?
- Conclusion
Analyzing Plausibleās AI search traffic boost in 2024
AI traffic can be isolated using the Top Sources report (click on any source to filter the dashboard by it) or directly via the Filter button on the dashboard. We used the AI search engines we could identify from our list of Referrer URLs in the Filter modal:
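If you prefer to pull the same numbers programmatically, here is a rough sketch (in TypeScript) using the Plausible Stats API (v1) breakdown endpoint. The site_id, the list of AI referrer sources, and the PLAUSIBLE_API_KEY environment variable are assumptions; adjust them to match the sources you actually see in your own Referrer URLs report:

```ts
// Sketch: breakdown of visitors by source, filtered to a few assumed AI referrers.
const params = new URLSearchParams({
  site_id: "example.com",
  period: "12mo",
  property: "visit:source",
  metrics: "visitors",
  filters: "visit:source==chatgpt.com|perplexity.ai|claude.ai|phind.com",
});

const res = await fetch(`https://plausible.io/api/v1/stats/breakdown?${params}`, {
  headers: { Authorization: `Bearer ${process.env.PLAUSIBLE_API_KEY}` },
});

console.log(await res.json()); // visitors per AI source
```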
Figuring out where AI chats mention Plausible and send traffic
We have a pre-made Entry Pages report in Plausible. After filtering the dashboard with only AI traffic, we looked at this report to figure out where such people entered our site from.
This would directly correlate with what subject was being discussed in those AI chats before the visitor came to Plausible. Hereās the screenshot of our top entry pages from AI channels:
The clear winner is our homepage and from our SEO data, support chats, and social media mentions āā we know that Plausible is usually mentioned in the discussions related to these topics:
- Privacy-friendly analytics,Ā
- GDPR compliance and no need for cookie consent banners
- Open-source
- Self-hosted
- Our data processing methods
- Simpler alternative to Google Analytics
- Lightweight script
So, is it possible that if we chatted with an AI bot about any of these topics, it would mention and/or cite Plausible? We can confirm that first-hand.
Before trying that, another thing that hints at the credibility of this hypothesis is the entry pages listed second, third, and so on, after the homepage.
These pages also cover the topics for which Plausible is usually referenced: GDPR compliance, self-hosted analytics, open-source, our data policy, cookie consent banners, and so on. So the hypothesis holds up so far.
Let's confirm it by asking ChatGPT and Perplexity, our two top-performing AI channels, with the laziest prompt ever (because niche prompts would do well anyway):
^ That's ChatGPT sending traffic to our second top entry page, i.e. /blog/legal-assessment-gdpr-eprivacy.
P.S. I used the āSearch the webā option along with the prompt to ensure ChatGPT returned some links from Bing (Bing is the search engine used by ChatGPT). If you use a regular prompt, you may or may not get citations.
Another thing ChatGPT did while sending this specific traffic was automatically append a UTM source to it. This is the URL I got after clicking the link suggested above: https://plausible.io/blog/legal-assessment-gdpr-eprivacy?utm_source=chatgpt.com
By the way, these UTM sources are also visible in the UTM reports in Plausible, making it even easier to track traffic back to its originating source.
Letās try the same exercise with Perplexity AI. It returned the same citation to the same blog post as above.
Isnāt there a better way to track my brandās visibility across AI search tools?
Other than manually confirming hypotheses, a more sophisticated method would be to monitor AI conversation trends. But that requires data directly from the AI search engines.
But unlike publicly available search engine results, AI chats are personal and not publicly available to assess.
There are some enterprise-focused companies offering AI brand-visibility services, where a lot of manual work still goes into determining in which conversations a brand is mentioned by AI and how that compares against its competitors.
Hence, some manual work is required at this stage (as of December 2024) when analyzing AI traffic in your web analytics tool.
Was this āqualifiedā traffic?
Qualified traffic is high quality traffic, i.e. people who would be genuinely interested in the things your business, or at least website, offers. Any other traffic is not useful for meeting business goals.
To figure out whether the traffic we were getting from AI channels was qualified, we looked at two things:
- What did they do on the site?
- Did any conversions result from it?
What did they do on the site?
For this, we can simply look at the āTop Pagesā report. This is an overview of the top pages visited in the sessions coming from AI channels. So it is a good indicator of what was happening in those sessions: an insight not completely offered by the Entry pages report.
Hereās the screenshot of our siteās top pages visited in the sessions acquired from AI channels:
This indicates that after the homepage (which, by the way, was scrolled to 58% of its length, indicating that many visitors read most of the important info we like to offer there), the second-most visited page was the free trial registration page!
That is great news, since we know that the journey taken by these folks, which started with searching for a relevant topic on AI tools, included visiting our registration page too. So far so good!
But did actual conversions occur? We donāt have to guess because we have a goal for that. We will look at this info in the next section. Before that, we can quickly look at the other top performing pages in AI-acquired sessions.Ā
These consist of our live stats page (which we also utilize as a product demo), activating an account, adding sites to Plausible, visiting our documentation, and other pages we have about the topics for which Plausible is popular (as listed in the Entry pages section above).
That is very good qualified traffic, in my opinion!
Did any conversions result from it?
These are the goals that were completed in AI-acquired sessions:
If we look at one of the funnels, we also know how many people entered the sign up flow and completed it:
P.S. All of this info is openly available in our live demo as well.
When?
If you look at the top graph, there has been a spike in AI-acquired traffic since mid-August, and those traffic levels have been maintained ever since. Hmm, what could have happened around that time?
A little digging revealed that ChatGPT had a stable release on August 8. This could be it!
So what? The only conclusion from this is that not much needed to be done on our side to get "AI-featured". Since AI models are in heavy development and will continue to be for the foreseeable future, they will keep looking for good information on the web and keep getting better at it.
So if we focus on creating valuable content, AI will hopefully pick it up proactively at some point. This arguably depends on other factors like brand authority as well, but that's for a deeper discussion later in this article.
How do we approach content and marketing at Plausible that may have helped with this AI traffic boost?
At the core of everything is a thoughtful product, built over the years that continues to improve with customer feedback. We like to focus on real problems and real people. This is why we get organically featured in social media, search engines, and communities.
Whenever we decide on a new topic to write about, we like to understand what the ideal reader might be seeking from it. This helps us understand the intent behind the topic and match it. Intent matching trumps keyword matching for us.
This aligns with Google's regular "helpful content updates" as well, which penalize websites for trying to hack the algorithm. It confirms that only useful content that understands the problem and offers real solutions will do well in the long run, not necessarily content optimized purely for searches.
Since AI tools pick up their knowledge from search engines, it follows that if a search engine likes you (high brand authority), you are more likely to be noticed by AI as well.
What can you really do?
Let's look at what can practically be done to give yourself the best chance of being featured by AI in its answers.
Identify your low-hanging fruit
Figure out what you are already being mentioned for, if anything. You can use the web analytics tool of your choice for this purpose and do some hypothesizing and reverse-engineering like we did above.
This gives you a good starting point to see what works well for you and why. From there, you can improve whatās already working and gradually explore related topics. Over time, this will help you get recognized for a wider range of topics in your industry.
Focus on the AI channels that cite the most
For instance, Perplexity AI's USP from the beginning has been to always cite its sources. And where does Perplexity get its information from?
It's Microsoft Bing! ChatGPT relies on Bing too.
So don't just focus on ranking well in Google; try to rank well in Bing as well.
And of course, there are the Google AI overviews that use citations too. While they may not always lead to clicks, thatās no reason to stop creating helpful content for your audience or striving to get cited.
So yes, SEO is still your best bet in many ways. Speaking of which:
Get better at SEO ā but the way Google looks at it
This isn't isolated advice, but if you want to optimize for Google AI Overviews, you need to:
- Create high-quality, authoritative content directly addressing user intent.
- Optimize for top organic rankings with strong click-through rates (CTR).
- Regularly update content to maintain freshness and relevance.
- Address related queries to broaden coverage for AI Overviews.
- Incorporate multimedia like videos for diverse content formats.
- Align with E-E-A-T principles: Experience, Expertise, Authoritativeness, and Trustworthiness.
Someone has analyzed the Google AI Overviews patent, and based on that it's reasonable to say that the above practices should help.
Another experiment would be to try out different content formats, since Google AI Overviews fetch their answers from various types of content: text, images, videos, etc.
We also have a study on how SEO is evolving that can help you navigate it better.
Consider adding an llms.txt file
Have you ever used a robots.txt file for your site? It's used by websites to give instructions to web crawlers/spiders/bots about which pages or sections of the site they are allowed to crawl and index.
It is a plain text file that is placed at the root of a website and contains directives that guide web crawlers on how to interact with the websiteās content.Ā
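For reference, a minimal robots.txt could look something like this (the paths and sitemap URL are placeholders; GPTBot is OpenAI's crawler user agent):

```
# Let all crawlers in, except for the admin area
User-agent: *
Disallow: /admin/

# Explicitly allow OpenAI's crawler everywhere
User-agent: GPTBot
Disallow:

Sitemap: https://www.example.com/sitemap.xml
```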
Now, āllms.txtā files are making an appearance. According to the proposal:
We propose that those interested in providing LLM-friendly content add a /llms.txt file to their site. This is a markdown file that provides brief background information and guidance, along with links to markdown files (which can also link to external sites) providing more detailed information. This can be used, for instance, in order to provide information necessary for coders to use a library, or as part of research to learn about a person or organization and so forth. You are free to use the llms.txt logo on your site to indicate your support if you wish.
Even though there's nothing official about this yet, if you look at the list of projects already using an llms.txt file in this library, you will notice that Anthropic itself uses it too. And who is Anthropic? The creator of Claude AI. So there must be some merit in this optimization.
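For illustration, a minimal llms.txt following that proposal could look something like this (the project name, summary, and links are placeholders):

```
# Example Project

> One short paragraph describing what the project is and who it is for.

## Docs

- [Quick start](https://example.com/docs/quick-start.md): installing and running the project
- [API reference](https://example.com/docs/api.md): detailed reference for developers

## Optional

- [Blog](https://example.com/blog.md): longer background reading
```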
Thank us later.
Write content for citation and reference purposes
If this is relevant for your industry, it will help to create citation-worthy content. Usually, what gets cited in ChatGPT, for example, is this:
Note that this doesn't always apply when ChatGPT users use the "Search the web" functionality while prompting, because that feature fetches real-time results from the web anyway.
Cover topics wellĀ
Donāt stop at an isolated content piece but address related questions and cover the subject in-depth because AI chat tools allow follow-up queries.
Optimize for natural language and voice search
This has been a growing SEO technique as well but makes a lot of sense for āAI optimization.ā Why? Because voice search is becoming a key way people find information.
Try to match natural, conversational language rather than traditional typing patterns. People tend to phrase their queries in a more conversational tone when using voice search.
Conclusion
AI can help multiply your brand's visibility. Just get going with relevant, high-quality content, and start tracking AI-referred traffic in an analytics tool. All the best!