Dani 4,653 The Queen of DaniWeb Administrator Featured Poster Premium Member

Hi there and welcome back!! :)

stefh commented: Thaks a lot Dani! This warm welcome helps me feel at home! :) +0
Dani 4,653 The Queen of DaniWeb Administrator Featured Poster Premium Member

Welcome back! :)

Dani 4,653 The Queen of DaniWeb Administrator Featured Poster Premium Member

Usman, they’re asking about a DaniWeb blacklist. Not a phone blacklist.

Dani 4,653 The Queen of DaniWeb Administrator Featured Poster Premium Member

It completely depends on what type of website you have. Is it a static site with fewer than 50 pages, or a dynamic site with millions of pages? If there are fewer than 50 pages, I strongly urge you to handwrite your sitemap file. If it's a dynamic site, then your best bet is to code a script to generate your sitemap (database access is a must when determining which pages to include on a dynamic site).
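To make that concrete, a dynamic sitemap generator can be as simple as looping over the URLs you pull from your database and emitting sitemap XML. Here's a minimal JavaScript sketch (the function name and URL list are illustrative, not from any particular framework):

```javascript
// Minimal sketch: turn a list of canonical URLs (e.g. fetched from
// your database) into a sitemap.xml document.
function buildSitemap(urls) {
  const entries = urls
    .map(u => `  <url><loc>${u}</loc></url>`)
    .join('\n');
  return '<?xml version="1.0" encoding="UTF-8"?>\n' +
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n' +
    entries +
    '\n</urlset>';
}

console.log(buildSitemap(['https://example.com/', 'https://example.com/about']));
```

In a real script you'd also emit `<lastmod>` and skip pages you don't want indexed; the point is that the generator should be driven by the same database that decides which pages exist.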

Once you have a strategy in place, then figure out what pages you want included in the sitemap, what pages to noindex, and what pages to exclude bots from crawling. This is something best left to experienced SEOs. If you're just getting started, we could help you with some ideas, or answer questions you may have, but I urge you to put a lot of thought into what goes here, because you could easily shoot yourself in the foot, and wind up deindexing all your pages from Google.
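For illustration only, a hand-written robots.txt that pairs with that strategy might look like this (the paths are placeholders, not a recommendation):

```
User-agent: *
Disallow: /admin/
Disallow: /search
Sitemap: https://example.com/sitemap.xml
```

Keep in mind that Disallow only blocks crawling; to keep an already-discovered page out of the index, you need a noindex directive on the page itself.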

That's why I am very much against using automated tools as AndreRet suggests. I think that hand-crafting your sitemap.xml and robots.txt files is super important, and no line in either should be taken lightly.

AndreRet commented: You are totally correct Dani, if the OP gave us more information and some sign of effort, I would have elaborated as well. +14
Dani 4,653 The Queen of DaniWeb Administrator Featured Poster Premium Member

The article you linked to seems to be different from what I've read elsewhere in other articles. It says jsDelivr is inconsistent, but other articles I've come across have said they're the most consistent. Sigh.

Dani 4,653 The Queen of DaniWeb Administrator Featured Poster Premium Member

OK, the filter is fixed now.

  • It now only shows a maximum of two items instead of six
  • It previously showed editorial content from the past year in forum listings but only from the past month in tag listings; now it shows only the past month in both
  • It now hides editorial content that you have already seen
rproffitt commented: Have to write thank you for the intense efforts here. +0
Dani 4,653 The Queen of DaniWeb Administrator Featured Poster Premium Member

I'm marking this question solved, since the problem has been fixed in the time since it was asked.

Dani 4,653 The Queen of DaniWeb Administrator Featured Poster Premium Member

Looks like you have now purchased an SSL certificate and have a secure website.

This question was asked on March 19th and it looks like an SSL certificate was purchased April 8th.

Dani 4,653 The Queen of DaniWeb Administrator Featured Poster Premium Member

What is the value of Session["Items"]?

Dani 4,653 The Queen of DaniWeb Administrator Featured Poster Premium Member

Glad it worked. I'm going to mark this question solved.

Dani 4,653 The Queen of DaniWeb Administrator Featured Poster Premium Member

So I have discovered how to do it :)

Since Googlebot ignores Google Analytics, you have to do this server-side. In my case, I created a new Property in my Analytics account to handle Googlebot traffic.

Then, you need to use Google Analytics Measurement Protocol, which is an HTTP request to https://www.google-analytics.com/collect with parameters of your choosing.

Documentation is available at https://developers.google.com/analytics/devguides/collection/protocol/v1/reference and a reference for all the parameter options is at https://developers.google.com/analytics/devguides/collection/protocol/v1/parameters
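As a rough sketch, a Measurement Protocol pageview hit is just a URL with query parameters, something like this (the property ID, client ID, and page URL are placeholders):

```javascript
// Assemble a Measurement Protocol v1 pageview hit.
// 'UA-XXXXX-Y' is a placeholder property ID.
const params = new URLSearchParams({
  v: '1',                          // protocol version
  tid: 'UA-XXXXX-Y',               // tracking/property ID
  cid: '555',                      // anonymous client ID
  t: 'pageview',                   // hit type
  dl: 'https://example.com/page'   // document location
});

const hit = 'https://www.google-analytics.com/collect?' + params.toString();
console.log(hit);
```

Sending an HTTP GET or POST to that URL records the hit; parameters such as uip (user IP) and dt (document title) ride along the same way.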

In my specific case, I use Cloudflare CDN, which caches a lot of my pages on the edge for users who are not logged in. Since that's the case, I can't implement the HTTP request from within my PHP code, because it would only execute when the page is not cached.

Instead, I took advantage of Cloudflare Workers to send the HTTP request directly from the edge. In my case, I created a new worker with the following script:

const analyticsId = 'UA-98289-3'

addEventListener('fetch', event => {
    event.passThroughOnException()
    event.respondWith(handleRequest(event))
})

/**
 * Check request object for Googlebot UA to send tracking data
 * @param {Event} event
 */
async function handleRequest(event) {

    let request = event.request
    let userAgent = request.headers.get('user-agent')
    let response = await fetch(request)

    if (userAgent)
    {
        // If Googlebot, then track hit in Analytics
        if (userAgent.includes('Google')) {

            event.waitUntil(analyticsHit(
                {
                    uip: request.headers.get('CF-Connecting-IP'),
                    dl: request.url,
                    dt:
                        response.status + ' ' +
                        response.statusText + ' ' +
                        userAgent
                }
            ))
        }
    }

    // Return the original content
    return response

}

/**
 * Send bot tracking data …
alan ingram commented: Dani, Its great to see this information is will be helpful in future. +0
Dani 4,653 The Queen of DaniWeb Administrator Featured Poster Premium Member

Otherwise, you can change lines 7-9 to be something like:

<?php if ($i == 0): ?>
    <article> 6th Element </article>
<?php endif; ?>
<article>
    Element
</article>

That will inject the 6th element in at the beginning, before we increment $i.

fecapeluda commented: Dani, that worked pertect! Gracias! +0
Dani 4,653 The Queen of DaniWeb Administrator Featured Poster Premium Member

In the context of web development, how did you come across it? What are you trying to do that makes you think you need to work with it?

Dani 4,653 The Queen of DaniWeb Administrator Featured Poster Premium Member
rproffitt commented: Thank you for the reports. +0
Dani 4,653 The Queen of DaniWeb Administrator Featured Poster Premium Member

Yes, you can get in trouble. There are very strict guidelines that dictate which entities are permitted to store credit cards in their database, or even under which circumstances you can collect credit card information.

A set of standards called PCI DSS (Payment Card Industry Data Security Standard) specifies who, what, where, when, and why businesses may collect and store credit card information.

You may consult an attorney (and pay for it!) if you feel like you need more help understanding. However, if you are a business that accepts credit cards, it is your responsibility to be aware and keep yourself updated of all of the PCI DSS requirements and to ensure that your business is taking the necessary steps.

In my case, my credit card merchant audits me on a regular basis to ensure I'm in compliance.

This is not something you should need to hire a business / finance attorney for. Even if you are having difficulty understanding the PCI website, there are people who work at your credit card merchant who should be able to work with you to explain the exact requirements you need to be in compliance.

You can get more information at https://www.pcisecuritystandards.org/pci_security/maintaining_payment_security

If you have questions, I would begin by reaching out to the PCI Compliance division of your credit card merchant. (Stripe and Braintree are the two popular ones for online processing.)

Dani 4,653 The Queen of DaniWeb Administrator Featured Poster Premium Member

Sorry, I’m not understanding what you’re asking.

Dani 4,653 The Queen of DaniWeb Administrator Featured Poster Premium Member

We will snip personally identifying information from within your posts that you didn't mean to post, such as your full name or email address.

However, outside of PII, we do not edit or delete posts unless they explicitly violate our rules.

Yusuf_13 commented: really? rproffitt says I can delete comments as a moderator. If it's yours I can take such a request and get it done +0
Dani 4,653 The Queen of DaniWeb Administrator Featured Poster Premium Member

I hope they start rolling out algorithm updates even more frequently! We got clobbered by the May 2020 core update and I’m hoping there will be another core update in a few months to give us a chance to recover.

Dani 4,653 The Queen of DaniWeb Administrator Featured Poster Premium Member

Why should we do your homework for you?

It’s fine if you need help, but we won’t just do it for you. Please show us the code you have so far and where you’re stuck or what errors you’re getting.

Dani 4,653 The Queen of DaniWeb Administrator Featured Poster Premium Member

Hi there and welcome to DaniWeb!

Dani 4,653 The Queen of DaniWeb Administrator Featured Poster Premium Member

Hi there! Nice to meet you.

Dani 4,653 The Queen of DaniWeb Administrator Featured Poster Premium Member

SimonIoa commented: what do you mean with div above the columns? +5

Exactly what I posted in my code snippet. Where the top and bottom divs are not part of columns.

SimonIoa commented: i did that doesnt work +0
Dani 4,653 The Queen of DaniWeb Administrator Featured Poster Premium Member

What's the purpose of it being column-count 1 instead of just a regular div above the columns?

SimonIoa commented: what do you mean with div above the columns? +5
Dani 4,653 The Queen of DaniWeb Administrator Featured Poster Premium Member

Oh, sorry, I just saw now this is for mobile app development.

What you're probably looking for is something like this:

https://masteringionic.com/blog/2017-12-04-creating-a-css-masonry-layout-for-an-ionic-application/

Dani 4,653 The Queen of DaniWeb Administrator Featured Poster Premium Member

I guess I'm not understanding what you're looking for. Why not do something like this?

<div class="full-width">
  Foo
</div>
<div class="columns">
  <div>
    Foo
  </div>
  <div>
    Bar
  </div>
  <div>
    Baz
  </div>
  <div>
    Bat
  </div>
</div>
<div class="full-width">
  Hello World
</div>

And then for the CSS:

div
{
  margin-bottom: 1rem;
  border: 1px solid black;
  padding: 1rem;
}
.full-width
{
  background-color: pink;
}
.columns
{
  columns: 2;
  border: 0;
  padding: 0;
}
.columns div
{
  background-color: green;
}

Something else that comes to mind, if you want to start involving Javascript, is to use a JS library such as Masonry.

https://masonry.desandro.com/

Dani 4,653 The Queen of DaniWeb Administrator Featured Poster Premium Member

Buying followers will not make you more popular. However, it’s a way to portray yourself to others as more popular than you already are. It’s then up to you to use that perception within your marketing strategy to the fullest.

Similarly, no one wants to participate in an empty forum. If you come here and the site appears dead, every visitor you paid to acquire will leave, and that marketing spend is wasted.

But if you come to what appears to be a thriving site, you want to be a part of it. Your marketing budget immediately stretches much farther.

rproffitt commented: Why does this make me think about the current POTUS? I wish it didn't. +15
Dani 4,653 The Queen of DaniWeb Administrator Featured Poster Premium Member

Hello there and welcome to DaniWeb!

Dani 4,653 The Queen of DaniWeb Administrator Featured Poster Premium Member

There actually were some more timely answers but they were all deleted for spamming. Better late than never.

However, since Shivya is not exactly correct, I'll elaborate.

Google crawls the web, following links, looking for pages with good content. As mentioned, you can restrict bots from crawling specific pages via your robots.txt file, and you can recommend which pages of your site you care about most in your sitemap file.

Google crawls your site based on a crawl budget. Based on how much clout your site has in terms of incoming backlinks, your site's speed, server response time, and many other factors, Google determines a unique crawl budget for each site that dictates how deep into your site they'll crawl. When they crawl, they read and understand all the content on each page.

Once they've crawled your site, only pages that make the cut get into the Google index, where they become searchable. Duplicate content, low-quality pages, thin pages, spammy pages, etc. won't make the cut. If you have too many of those kinds of pages relative to the pages worth indexing, making Google waste its crawl budget, you'll be hit with a Panda penalty.

rproffitt commented: Shivya had 7 posts mostly to very old discussions. Only way to learn is to comment like this. +15
Dani 4,653 The Queen of DaniWeb Administrator Featured Poster Premium Member

Good luck with it!!!

Dani 4,653 The Queen of DaniWeb Administrator Featured Poster Premium Member

I see you got rid of the keyword stuffing at the bottom of every page. Good stuff.

I still highly recommend you get rid of the "What is Teletappie" paragraphs at the bottom of every page to avoid duplicate content. If you want, just keep the one paragraph unique to each page.

Create an FAQ page, move all the other paragraphs there, and link to it from the bottom of every page.

Dani 4,653 The Queen of DaniWeb Administrator Featured Poster Premium Member

Just to clarify, I'm not referring to the "About this page" section. I'm referring to the "Topics that are covered" section.

As far as the "About this page" section is concerned, it's fine to have a "What is this specific page about" blurb, but you will get hit with a duplicate content penalty for repeating the "What is Teletappie", "Why was it created", etc. paragraphs at the bottom of every page. Google wants to see unique content on each and every page. You can have a shared header (logo, navigation, etc.), but the body text of the page cannot be repeated across pages.

rproffitt commented: I know about TeleTubbies, will have to look what that other thing is soon. +0
Hotkeysk commented: Thank you very much Dani, I will fix that :) +0
Dani 4,653 The Queen of DaniWeb Administrator Featured Poster Premium Member

As someone with 25 years of experience in SEO, this is a clear example of keyword stuffing that you will get penalized for:

Topics that are covered
Teletappie new mobile mmo, mmorpg. Top mobile games online
Mobile moba games. Most popular multiplayer moba games
mobile games to play with friends. Top mobile mmorpg
top mobile mmo
top mobile moba

top mobile online games

Please don't do this.

rproffitt commented: Thank you. So many do that. +15
Dani 4,653 The Queen of DaniWeb Administrator Featured Poster Premium Member

As of this morning, the Speed Report (Experimental) in Google Search Console has been replaced by a more robust Core Web Vitals report.

It still is broken up into a Mobile and Desktop version. However, instead of just monitoring FCP (First Contentful Paint) and FID (First Input Delay), it now seems to monitor CLS and LCP ... or at least those are the issues it's flagging for my site.

The issue types correspond to the various measures:

LCP (Largest Contentful Paint): how long it takes the page to render its largest visible element
FID (First Input Delay): how long it takes the page to start responding to user actions
CLS (Cumulative Layout Shift): how much the page UI shifts during loading

Speed issues are assigned separately for desktop and mobile users.

As you may recall, the experimental Speed Report was disallowing any revalidation because of an alert message saying that it would be changing soon. Well it looks like it's here!

Is anyone else seeing this yet?

Dani 4,653 The Queen of DaniWeb Administrator Featured Poster Premium Member

Hi there and welcome to DaniWeb!

I would like to politely request that you please stop adding useless text to the bottom of each page. You'll get severely penalized for keyword stuffing. Doing what you're doing is like 10000X worse than having a poor, yet natural, keyword density.

Write a useful article about each keyword to add content to your site instead.

Good luck!

Hotkeysk commented: By useless text i don't mean it's actually useless. The text describes each page, described the way i rate games, review points etc etc. +0
Dani 4,653 The Queen of DaniWeb Administrator Featured Poster Premium Member

Hi Luka and welcome to DaniWeb. Trying my best but the days just blend into each other ...

Dani 4,653 The Queen of DaniWeb Administrator Featured Poster Premium Member

Can you give a more in-depth description of what you mean by attendance registry? What features and functionality are you looking for? MongoDB is a database, Node.js uses JavaScript as a server-side language, and I'm assuming you want to use JavaScript on the front-end as well. That's a full stack, but I'm not sure why you picked those particular technologies. Why MongoDB with Node.js? That stack works well with big data. MongoDB is a NoSQL database engine; for something like an attendance registry, you might be better off with a SQL-based database such as MySQL.

Dani 4,653 The Queen of DaniWeb Administrator Featured Poster Premium Member

You’re confusing Google Ads and Google organic.

rproffitt commented: I wonder if Brad is buying or selling. They haven't been writing much. +0
Dani 4,653 The Queen of DaniWeb Administrator Featured Poster Premium Member

Sounds like millions to me.

You might think so, but a technical SEO's salary is sub-$100K. A decent technical SEO should be able to rank in top placement for a related non-broad term query.

rproffitt commented: I bet Amazon spends this and more every year. They rank in the top 3 often. +0
Dani 4,653 The Queen of DaniWeb Administrator Featured Poster Premium Member

I think the OP is saying they were presented with this code, either by their professor or found online, and they’re having a hard time following it. They want to learn, so they’re asking whether someone can walk them through the code step by step. I guess by commenting it??

That’s my guess. I did a lot of that back when I used to tutor.

rproffitt commented: Thanks Dani. Once in a while I get legacy apps to fix. Then I have to reverse engineer to see what the intent was. Here, the OP gets to explain more. +15
Dani 4,653 The Queen of DaniWeb Administrator Featured Poster Premium Member

So what I did to test was I opened a page of my site with an ad at the bottom using Chrome. I then right clicked and selected Inspect to open the Developer Tools and I went to the Elements tab. I searched for where in the DOM the ad would be, and I found the code:

<div id="div-gpt-ad-1321481451196-0">
    <script>
    googletag.cmd.push(function() { googletag.display('div-gpt-ad-1321481451196-0'); });
    </script>
    <div id="google_ads_iframe_/86406173/ArticleLeaderboard_0__container__" style="border: 0pt none; width: 728px; height: 90px;"></div>
</div>

You can see it looks just like the way it was initially copied/pasted from GAM, but with the following empty <div> added:

<div id="google_ads_iframe_/86406173/ArticleLeaderboard_0__container__" style="border: 0pt none; width: 728px; height: 90px;"></div>

Then, I slowly scrolled down the page while keeping an eye out on that particular line. After about scrolling halfway down the page, that line morphed into:

<div id="div-gpt-ad-1321481451196-0" data-google-query-id="CPrnqODByukCFcYXrQYdZ0YOTA">
    <div id="google_ads_iframe_/86406173/ArticleLeaderboard_0__container__" style="border: 0pt none;"><iframe id="google_ads_iframe_/86406173/ArticleLeaderboard_0" title="3rd party ad content" name="google_ads_iframe_/86406173/ArticleLeaderboard_0" width="728" height="90" scrolling="no" marginwidth="0" marginheight="0" frameborder="0" srcdoc="" data-google-container-id="2" style="border: 0px; vertical-align: bottom;" data-load-complete="true"></iframe></div>
</div>

I'm just giving you the summary because the <iframe> was now populated with the full third-party ad copy.

Dani 4,653 The Queen of DaniWeb Administrator Featured Poster Premium Member

Ah, thanks so much for the catch! I copied the code from the sample code at https://developers.google.com/doubleclick-gpt/samples/lazy-loading and didn't realize that they were repeating it multiple times for clarity and not for production. I'll fix it right now. Thanks!

Dani 4,653 The Queen of DaniWeb Administrator Featured Poster Premium Member

Oh, and if I was guaranteed a top rank for a search phrase of my choosing, I would not just pick it entirely based on traffic. Relevance is most important ... another factor to consider is that people genuinely searching for that search term have a low bounce rate when landing on your page. That's how you keep the spot once you get it.

Dani 4,653 The Queen of DaniWeb Administrator Featured Poster Premium Member

It absolutely depends on the industry and what keywords you're trying to rank for. You can rank for your brand name with barely any effort at all. Does the site already have backlinks? If there's a strong link profile, then use one of the many keyword tools (Moz, Ahrefs, etc.) to search for relevant keywords with the most amount of traffic, and inject those keywords into strategic places ... factoring in the right keyword density and all. Broad term keywords are another matter, of course.

Dani 4,653 The Queen of DaniWeb Administrator Featured Poster Premium Member

This seems like a homework problem. What have you done so far? What have you tried? What have you learned in class that could relate to this? Where are you stuck? Do you have any code so far?

It currently seems as if you’ve been tasked with writing this complicated C++ program but you don’t know C++ at all. We will help you figure it out, but we won’t just do your work for you from scratch.

Jin En commented: #include "Resistor.h" #include <cstdlib> void Resistor::setResistance(double r1) { r=r1; } double Resistor::getResistance() { return r; } v +0
Dani 4,653 The Queen of DaniWeb Administrator Featured Poster Premium Member

Sorry, if ($date_obj > $now_obj) should be if ($date_obj < $now_obj)

Dani 4,653 The Queen of DaniWeb Administrator Featured Poster Premium Member

As rproffitt points out, it's possible that $birthdate is in an incorrect format and you think you're retrieving a year, but you're really not. For example, if my $birthdate were set to 11-11-82, then the code would retrieve 1-82 instead of 1982.

You should make sure that $birthdate is in an appropriate format before calculating age.

What I would suggest doing is changing your code to something like this:

// find age
if (! function_exists('Age'))
{
    function Age($birthdate)
    {
        // Break the date in the format YYYY-MM-DD apart into a year, month, and day
        list($year, $month, $day) = explode('-', $birthdate);

        if (checkdate($month, $day, $year))
        {
            // The birthdate is a valid date

            // Create a PHP DateTime object for the birthday
            $date_obj = new DateTime("$year-$month-$day");

            // Create a PHP DateTime object for the current date
            $now_obj = new DateTime();

            // Check to make sure that the birthday is a date older than today
            if ($date_obj > $now_obj)
            {
                return date("Y") - substr(trim($birthdate), -4, 4);   //// THIS IS THE LINE FOR THE SAID ERROR
            }
            else
            {
                // Birthdate is in the past
                return 0;
            }
        }
        else
        {
            // Invalid birthdate specified
            return 0;
        }
    }
}

Doing something like that not only ensures that the $birthdate variable is in the correct format, but it also ensures that a user doesn't specify an invalid date for their birthday, which means either a future date, or a date that never existed such as February 29th in a non-leap year or September 31st (September …

Dani 4,653 The Queen of DaniWeb Administrator Featured Poster Premium Member

Don’t. Just don’t. Basically you’re talking about a whole bunch of things that Google is now easily able to detect and throw the book at you for.

Don’t use PBNs anymore. They haven’t worked since last year, supposedly. Have you had success with PBNs more recently than that? If so, let us know! I’d be very curious.

Secondly, Google is constantly getting better at being able to detect automated, spun content. Google Panda still exists.

rproffitt commented: Listen to Dani! +15
DeForseti commented: I had no experience in creating networks. But I read that creating them is still real. Unfortunately, they do not disclose how they do it. I have litt +0
Dani 4,653 The Queen of DaniWeb Administrator Featured Poster Premium Member

Peleg,

I'm confident that what needs to happen is that the signup page's referer URL needs to be passed as a form variable to the confirmation PHP page, which can then accept it and redirect to it. The problem is this takes custom coding, which might be out of the scope of what the OP can manage on their own right now.

Dani 4,653 The Queen of DaniWeb Administrator Featured Poster Premium Member

This sounds like an odd question.

I think what he's getting at is that the basics of SEO differ depending on what type of site you have. For example, what Google looks for in an e-commerce site is very different from what it looks for in a news site.

For an e-commerce site, I would begin by using Product schema. There's some more information about it here: https://developers.google.com/search/docs/data-types/product

However, before I can really give you a thorough answer, I need to know what framework you're using (Magento, Shopify, etc.) and how technical you are.
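For reference, a minimal Product structured-data block looks something like this (all values are placeholders; see the Google doc above for the required fields per rich-result type):

```json
{
  "@context": "https://schema.org/",
  "@type": "Product",
  "name": "Example Widget",
  "image": "https://example.com/widget.jpg",
  "description": "A placeholder product description.",
  "offers": {
    "@type": "Offer",
    "priceCurrency": "USD",
    "price": "19.99",
    "availability": "https://schema.org/InStock"
  }
}
```

It gets embedded on the product page inside a `<script type="application/ld+json">` tag, either hand-coded into your template or emitted by a framework plugin.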

Dani 4,653 The Queen of DaniWeb Administrator Featured Poster Premium Member

I no longer use reCAPTCHA because I find the "find the road signs" puzzles designed to help their self-driving car research extraordinarily annoying.

I don't want to explain too much about how what I'm doing works here in this public forum, as we get a lot of spam bots that are designed specifically for DaniWeb. But I'll run you through the basics ...

Basically what we do is use JavaScript to inject a hidden form input field upon pressing the Submit button. A lot of bots don't evaluate JavaScript on a page; they just send the HTTP request to submit the form without actually evaluating the page the form is on or loading the page's external JavaScript file, so they never trigger it. If the input field doesn't exist, the server knows it's a bot. Secondly, the input field contains an encoded string that only stays active for a couple of minutes and is unique to the user. Therefore, a human spammer can't submit the form manually, inspect the HTTP request being sent out, and then reuse the string value from that one manual submission for all future bot requests.