Google Analytics initially evolved from Google's purchase of Urchin, an Apache log analyzer. That's why legacy Google Analytics property IDs all begin with UA-. The familiar ?utm_ parameters that can be tacked onto URLs to track referrer statistics in Google Analytics are remnants of Urchin Tracking Modules.
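For instance, a campaign-tagged URL carrying those legacy UTM parameters might look like this (the domain and parameter values here are made up for illustration):

```
https://example.com/landing-page?utm_source=newsletter&utm_medium=email&utm_campaign=spring_sale
```

Google Analytics reads the utm_source, utm_medium, and utm_campaign values to attribute the visit to a specific campaign rather than a generic referrer.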

In the years since Google purchased and overhauled Urchin, they built a from-scratch, much more robust analytics app: Google Analytics 4, also known as GA4. Much to the chagrin of Google Analytics users everywhere, GA4 is not backwards compatible, is incredibly confusing and bulky to use, and all users are being forced to upgrade by this July.

I completely get Google wanting to start over from scratch with something that's more robust and efficient. I really do get that. I also get that it's not practical to maintain both independent systems simultaneously. I get the need to migrate all users off of a legacy platform and onto the new platform. What I don't get, and what infuriates me to no end, is that Google can't figure out some way to import all of the data that its users have been collecting over the past nearly two decades into GA4.

So ends my rant.

I'm keeping an eye on Google's Paris show. It seems they're set to showcase, or even roll out, their AI offerings.

More than one article suggests it's going to be a change in direction from being a link purveyor to AI-powered responses. I have my doubts about that, as it would decimate the current system in place.

As to GA4 and your concerns, maybe that will be a subject in Paris.

Google has been doing AI-powered responses for many, many years now, but a lot of people think they’re going to massively step it up. There have already been some screenshots floating around with some experiments they’ve been running across a limited audience.

It’s causing a big commotion in the SEO industry because, when Google extracts excerpts of content from a publication’s page to give the searcher the specific answer they’re looking for right within the search results, they’re using the benefit of the publication’s research, knowledge, staff writers, etc. without the end-user seeing their ads or generating the publication any revenue.

Google will have to come up with a new model because, right now, website publishers have a symbiotic relationship with Google. We let them crawl our sites and let their bots consume our bandwidth and server resources (which ain't cheap), and in exchange, they send us nearly all our real traffic.

We are all, for the most part, Google's puppets, because we are willing to do practically anything for Google to bless us with lots of traffic sent our way. That's why when "the big G" says create AMP pages, we do, even though they were so terrible they only caught on with publishers who are at Google's mercy. And when they say switch to GA4, the most unintuitive analytics app ever, we begrudgingly do, albeit after posting a rant.

But if Google shifts too far from sending traffic our way towards using our content to serve their end users with no benefit to us, we webmasters are going to be less inclined to succumb to Google's every word. It will massively shift the balance of power. Right now Google holds allllll the power over website publishers, and they know that if they don't tread carefully, that could change. For example, if Google stops sending me the majority of my traffic, I have no incentive to allow them to crawl my site, so I can simply block them via robots.txt, use cloaking to block the googlebot user agent, etc.
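As a quick sketch of how simple that blocking would be, a site's robots.txt can single out Google's crawler with two lines:

```
# robots.txt — asks Googlebot not to crawl any page on the site
User-agent: Googlebot
Disallow: /
```

Note that robots.txt is advisory; well-behaved crawlers honor it, but it doesn't enforce anything. The cloaking approach mentioned above would be the enforced variant: configuring the web server to detect the Googlebot user agent and return a 403 or different content.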

Google has systematically pushed websites to be more user-friendly and intuitive, at the expense of the websites potentially making more money per visitor, under the guise that being favored by Google will send the websites more traffic. Websites could revert to a whole bunch of techniques they're currently refraining from, such as intrusive pop-ups everywhere a la the 1990s, that would make them a hell of a lot more money from ads, and which they avoid only because Google says they're bad.

There it is:

give the searcher the specific answer they’re looking for right within the search results,

That's one of the components that makes the AI that I won't name so attractive to users. Instead of a "link purveyor," folks were getting direct information. And yes, this really upset the apple cart. Or would it be the buggy whip cart?

This has brought up all sorts of counterarguments that ML/AI gets it wrong at times, but hey, there is plenty of bad/wrong information from humans too. For example, I have these conservatives in my circle who continue to send me "proof" that Ivermectin cures and prevents COVID. Another group says COVID is part of The Great Reset and a hoax.

Imagine if Excel didn't provide an answer but instead showed you the formula for how to solve it, or maybe a link to an article for you to figure it out for yourself.

That's one of the components that makes the AI that I won't name so attractive to users.

Yes, it makes ChatGPT attractive, but, as I keep saying, what ChatGPT lacks is Google's understanding of the quality of information found across the web, and how to determine content quality. That's why its concept is very attractive, but its execution is greatly flawed.

And, as mentioned, it's something Google has been doing for many years now, but they have been purposefully scaling it back so as to not upset the balance of power. We're now expecting Google to release something more along the lines of ChatGPT, but with attribution and links as to where the information was sourced.

As I see it:

attribution and links as to where the information was sourced.

And to repeat, as I see it, that would result in a drop in follow-through visits/clickthroughs to the information source, which was the whole point of ranking well in search results.
And let's differentiate: informational searches differ from product searches. Clickthrough will likely remain strong for product search results.

Examples:

  1. I wanted to know more about Canada's recent ban on home sales to non-Canadians. The search results had enough detail, so that was all I needed for now.
  2. I was shopping for a new laptop. The search gave me enough to know the price range, and then I went to my usual online retail site, where I found it at a good price and was surprised to see it hundreds less thanks to an ongoing sale.

This just in: Google at the Paris event has named its AI/ML "Bard."

That's why I said that the issue is with website publishers. Publishers are online content producers who provide information, articles, social media, etc., typically in exchange for advertising revenue.

Either way, I digress, as we're getting quite far off-topic. This thread is for discussion about frustrations regarding Google Analytics 4, which extends far beyond publishers, but also to e-commerce sites, etc.

Let's remember Paris. My thought is that Google might have something in the works regarding GA4, and that it has to wait for "the big show."

Google doesn't take the same approach as Apple, for example, does with their annual WWDC events. There's also nothing novel to really share about GA4. Google has already been pushing everyone to GA4 for nearly 5 years now. All that's changing is that they're finally deprecating the legacy version, which everyone prefers, while also offering no way of migrating the data off of it to the new solution.
