Dani 4,653 The Queen of DaniWeb Administrator Featured Poster Premium Member

It's 2019 so can you share what changed from 2017 to 2018 then 2019?

LOTS! The SEO industry is primarily about keeping up with the latest algorithm shifts that Google makes. As a result, techniques differ year over year. Last year was the year of infographics. IMHO, they don't bring as much benefit in 2019. Longer blog articles work better now, from what I hear. Just my opinion. But I'm the nutcase who is constantly googling, "best seo techniques in October 2019" ... "best seo techniques in November 2019" etc.

Dani 4,653 The Queen of DaniWeb Administrator Featured Poster Premium Member

PowerBasic?
Visual Basic for Applications?
Microsoft Access?

I THINK I know what you’re talking about but the name escapes me. It sounded something like PowerBasic but not that?

rproffitt commented: Excel back then with VBA could make some nice (for the era) reports and systems. To me Excel and some VBA seems to match what is asked here. +15
Dani 4,653 The Queen of DaniWeb Administrator Featured Poster Premium Member

Looks like I was right on about breadcrumb enhancements recently showing up in my GSC. The official Google Webmasters twitter account announced it went live as of today => https://twitter.com/googlewmc/status/1174693878835875840

So basically what that means is log into your GSC, ask Google to crawl one of your pages that has breadcrumb schema, and you'll see if Google has a specific issue with your implementation, or if they understand it correctly.

Dani 4,653 The Queen of DaniWeb Administrator Featured Poster Premium Member

Hi,

Sorry for taking so long to see this thread.

Please supply exactly what is broken. Error messages along with what you expected to happen.

They did.

Good luck with getting Google to change this.

Knowing what to change to encourage Google's algorithms to change things is the point of this forum. Google is all about you helping them to help you, and they go a long way towards providing the tools to make that happen. Schema is just one of many ways.

That being said, are you using Google Search Console? In the Enhancements section of Google Search Console, are they picking up on your schema? I noticed that the Breadcrumbs enhancements just very recently showed up in my GSC.

Dani 4,653 The Queen of DaniWeb Administrator Featured Poster Premium Member

Hi,

I'm not sure I fully understand your question. I browsed your site and most pages do not have meta tags. For example, I'm on your Zalgo Text Generator page and I don't see meta keywords or meta description.

Every unique page of your site needs to have at least four things:

  • A unique title different from every other page on the site
  • A unique meta description different from every other page on the site
  • Unique meta keywords different from every other page on the site
  • A canonical tag that indicates it is the primary version of the page

A <link rel="canonical" href="https://www.example.com/page.html"> tag tells Google, and other search engines, the URL of the primary version of the page. If you have more than one different URL that has the same, or similar, content on it, you can use the same canonical URL in the meta tags to tell Google which page you want showing up in the search engines, and not to penalize the other pages for duplicate content.

If you have two or more pages that have near-identical content but different URLs, then you can use the same title, description, and keywords as long as there is the same canonical tag as well.
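Put together, the head of a unique page might look like this minimal sketch (every value below is a placeholder, not advice for any particular site):

```html
<head>
  <!-- Unique per page -->
  <title>Zalgo Text Generator - Example Site</title>
  <meta name="description" content="Turn plain text into zalgo (glitch) text.">
  <meta name="keywords" content="zalgo, glitch text, text generator">
  <!-- Tells search engines which URL is the primary version of this page -->
  <link rel="canonical" href="https://www.example.com/zalgo-text-generator">
</head>
```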

Dani 4,653 The Queen of DaniWeb Administrator Featured Poster Premium Member

There are many more than that!

There is author, viewport, refresh, generator, etc.

Dani 4,653 The Queen of DaniWeb Administrator Featured Poster Premium Member

This is an English only forum. However, you are missing a semicolon at the end of the line that begins $usuario =

LUIS JESUS commented: but it still doesn't work for me either +0
Dani 4,653 The Queen of DaniWeb Administrator Featured Poster Premium Member

Depending on what you want to use the computer for, there will always be limitations based on the amount of RAM. Your processor capabilities determine how fast your computer can work, and RAM basically limits how many things your computer can keep track of at one time.

In my experience, no reasonable amount of RAM is going to hit diminishing returns if you're doing something like running virtual environments, graphic design, etc.

MickeyD commented: OK, this helps a lot. As I said, I don't do gaming but I do keep a lot of windows open at one time drawing from all of them and several things are goi +0
Dani 4,653 The Queen of DaniWeb Administrator Featured Poster Premium Member

At this time, we only email you about updates to specific articles you've specified you want to watch.

However, there are some other tools at your disposal:

You can fetch an RSS feed of all new topics tagged with java. You can receive notifications by plugging this RSS feed into an RSS reader. Or, you can use something like IFTTT to send you alerts, text messages, tweets, facebook posts, or even flicker the lights in your smart home, etc., each time the RSS feed has been updated.

Additionally, if you want to do something creative and roll your own RSS parser, our RSS feeds use PubSubHubbub, which means you can get notified for updates via PUSH notifications instead of having to poll.

You can use our JSON feed to fetch all topics tagged with java, sorted by when they were last posted in. You will, unfortunately, have to write your own parser / reader / notification system (or perhaps one already exists somewhere on the web, probably a Github repository somewhere that would only need slight modifications to make it work).
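As a starting point for rolling your own, here's a minimal sketch that parses an RSS payload with PHP's SimpleXML. The feed XML below is a made-up sample, not DaniWeb's actual output; a real poller would fetch the tag feed URL instead:

```php
<?php
// Made-up sample payload standing in for the real tag feed.
$rss = <<<XML
<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>DaniWeb topics tagged java</title>
    <item>
      <title>Help with ArrayList iteration</title>
      <link>https://www.example.com/threads/1</link>
    </item>
  </channel>
</rss>
XML;

$feed = simplexml_load_string($rss);
foreach ($feed->channel->item as $item) {
    // Hand each new topic off to your notifier of choice (email, IFTTT webhook, etc.)
    echo $item->title, ' => ', $item->link, "\n";
}
```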

Our complete API documentation is available at https://www.daniweb.com/welcome/api

John_165 commented: Thanks,but this sound complicated to me >< +0
Dani 4,653 The Queen of DaniWeb Administrator Featured Poster Premium Member

Hi there. Welcome to DaniWeb!!

Dani 4,653 The Queen of DaniWeb Administrator Featured Poster Premium Member

I use CodeIgniter, a loosely MVC-based PHP framework where models are "optional".

When I began coding the current iteration of DaniWeb, I began with some of the basics of the infrastructure. I first focused on the model layer and built up the essentials of what a post looks like, what a thread looks like, what a forum looks like, and what a member looks like.

From there, I created a bare-bones controller, and focused on the view. As I built up the UI, each time I needed a new getter or setter, I built that functionality into the model layer. Each time the UI called for some other type of behind-the-scenes behavior or functionality, etc., I built it into the controller.

I did that for the top three pages: Forum listing, forum thread, and member profile.

By the time those pages were done, which means they had a usable UI and basic functionality, I had a pretty sophisticated back-end infrastructure to support them. From there, I was able to build up the rest of their features as well as all the other pages focusing mostly on adding controllers and views.
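As a toy illustration of that workflow, where getters are added to the model as the UI needs them, here's a sketch. The class names are invented for the example, not DaniWeb's actual code:

```php
<?php
// Model: owns the data and grows a getter whenever a view first needs one.
class PostModel
{
    private array $posts = [
        1 => ['author' => 'Dani', 'body' => 'Hello world'],
    ];

    public function get_post(int $id): ?array
    {
        return $this->posts[$id] ?? null;
    }
}

// Controller: glues model data to the (here, trivial) view.
class ThreadController
{
    private PostModel $model;

    public function __construct(PostModel $model)
    {
        $this->model = $model;
    }

    public function show(int $id): string
    {
        $post = $this->model->get_post($id);
        return $post ? "{$post['author']}: {$post['body']}" : 'Not found';
    }
}

echo (new ThreadController(new PostModel()))->show(1), "\n"; // Dani: Hello world
```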

Dani 4,653 The Queen of DaniWeb Administrator Featured Poster Premium Member

Sorry about that. That was a bug from the migration we discovered late last night. Unfortunately, there were some serious repercussions as a result of it, where everyone who attempted to post yesterday was flagged as a spammer. It should be under control now.

rproffitt commented: Thanks for the explanation. Otherwise I would think these folk were all spammers. +0
Dani 4,653 The Queen of DaniWeb Administrator Featured Poster Premium Member

I said let me know if anything isn’t working. I never said anything about being able to fix anything.

Thank you for your bug reports. They will be looked into.

rproffitt commented: Nothing new broken. Seems as good as before. About climate change, that's all of us doing our part. +0
Dani 4,653 The Queen of DaniWeb Administrator Featured Poster Premium Member

Sorry for the intermittent downtime over the past few days. We think we were being DDoSed, but it's unclear.

However, we've upgraded to new web servers tonight, to bring HTTP/2 support and finally move from PHP 5.x to PHP 7. A few years late, I know.

Let me know if anything isn't working as it should!

Dani 4,653 The Queen of DaniWeb Administrator Featured Poster Premium Member

Just stop with md5. It's completely useless.

Not that you're doing it, but MD5() with both a salt and a pepper is a little better, because it makes rainbow tables useless. But I would use password_hash(), because not only is it more secure, it isn't any harder to use.
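For example, a minimal sketch of the password_hash() / password_verify() pair:

```php
<?php
// password_hash() generates a random salt and embeds it in the returned
// hash string, so no separate salt column is needed.
$hash = password_hash('correct horse', PASSWORD_DEFAULT);

// password_verify() re-hashes the input using the embedded salt and
// compares in constant time.
var_dump(password_verify('correct horse', $hash)); // bool(true)
var_dump(password_verify('wrong guess', $hash));   // bool(false)
```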

Dani 4,653 The Queen of DaniWeb Administrator Featured Poster Premium Member

Thank you for clarifying. The one in the email should be the current password. I’m not sure why it doesn’t work for you. I’ll investigate a little bit later today.

Dani 4,653 The Queen of DaniWeb Administrator Featured Poster Premium Member

Sorry, I'm misunderstanding what you're saying. What you described is the expected behavior, no? What appears to be broken?

Dani 4,653 The Queen of DaniWeb Administrator Featured Poster Premium Member

Hiring a team from the beginning is a horrible and excruciatingly expensive business strategy. First find product-market fit. This means creating a minimum viable product and proving that there is interest in using it. If you don’t have the skills to go it alone, outsource to a small dev team or consulting firm. But, please, do not hire a team. Your business is too immature to know exactly which aspects will be most time consuming, the difficulty level of each component, who the first hire should be, or even what skills to look for. Otherwise you may find yourself hiring a bookkeeper only to find out months later that invoicing and accounts receivable only take an hour a week.

Dani 4,653 The Queen of DaniWeb Administrator Featured Poster Premium Member

I see you marked this thread as solved. Were you able to figure out the problem? What was it?

Dani 4,653 The Queen of DaniWeb Administrator Featured Poster Premium Member

Are you sure you aren't accidentally running the cron job from multiple users on the server? Two users might accidentally have the same crontab file. Just a thought.

Dani 4,653 The Queen of DaniWeb Administrator Featured Poster Premium Member

Firstly, every page in the Google search results you linked to is from one specific forum. That's kind of weird, since Google recently announced that, except in very extreme cases, they changed their algorithm to ensure that there's not more than one result per domain in the top 10 results.

That being said, though, you said that the ideas you had were in the first ten hits, but that's a lot of forum discussions to sift through. Could you maybe point out some specific posts or post the solution(s) here instead?

Dani 4,653 The Queen of DaniWeb Administrator Featured Poster Premium Member

I'm a user of Ahrefs and I really like them. Obviously they're a leader in the SEO industry. The only competitor I know of is Moz. The thing is, as others have pointed out, they aren't an analytics company. You need both: Google Analytics and Ahrefs. They aren't competitors.

Dani 4,653 The Queen of DaniWeb Administrator Featured Poster Premium Member

You can try https://codecanyon.net/ which is an Envato Market site (same parent company as ThemeForest) but designed for scripts and plugins instead of templates. Perhaps you were just trying to post within the wrong marketplace type?

Dani 4,653 The Queen of DaniWeb Administrator Featured Poster Premium Member

Also, aren't you meaning to do cin >> s1.age and not just cin >> age?

Dani 4,653 The Queen of DaniWeb Administrator Featured Poster Premium Member

Are they marketing their site for sale or to the consumers?

I think it's clear in the OP's question that they are trying to promote their blog posts within social media.

Also the folk I know always check out these sites (bankrate or similar) then go direct to the seller to see if the deal is the same or better.

I'm the opposite. I am of the variety that is willing to pay for convenience. Whether it's buying a movie ticket straight from Fandango where I can browse multiple theaters at once, doing grocery shopping from multiple stores at once on Instacart, or ordering food delivery on DoorDash, I am someone who buys into the upcharge from sites that curate across multiple stores. To each their own :)

Dani 4,653 The Queen of DaniWeb Administrator Featured Poster Premium Member

rproffitt, may I ask why you would recommend that a small business hire a digital strategist? IMHO, this should be the CEO/CMO's job until the company is very well established, and certainly not one of the first handful of hires. It's often much more cost effective to outsource to an SEO firm or digital agency.

That being said, in terms of promotion on social media, I think you're right on track with recent blog posts on the homepage. Images are a help as well, in addition to the actionable content that you seem to have. What is clearly missing, however, are share buttons for people who read each article. Readers should be encouraged to share your content to their social network.

I get what you're saying about facebook and instagram not being the right place where consumers might go to consume content about loans and finances. Instagram, for sure. However, not everyone uses Facebook to look at memes or pictures of their friends. My facebook network is almost exclusively people I know in the digital space who are posting articles about the latest news in the SEO industry.

Again, this brings us back to having those share buttons front and center in your content. Create blog content targeted to experts in the field, and those experts will share the content with their network of other industry experts.

Also, LinkedIn and Twitter are much more relevant to you than Facebook and Instagram. Use appropriate hashtags with Twitter. Share content on LinkedIn, …

Dani 4,653 The Queen of DaniWeb Administrator Featured Poster Premium Member

Can you please provide the code that generated this error so we can help to debug?

Dani 4,653 The Queen of DaniWeb Administrator Featured Poster Premium Member

Please show us the MySQL query causing this error.

Dani 4,653 The Queen of DaniWeb Administrator Featured Poster Premium Member

An oldie but a goodie, for you Canadians out there.

I thought I'd make the solution provided within this thread a bit easier to discover for those still suffering from the problem.

To switch between English and French keyboards, press Shift+Control.

Dani 4,653 The Queen of DaniWeb Administrator Featured Poster Premium Member

This is the code behind the demo at https://www.daniweb.com/connect/oauth/demo

Dani 4,653 The Queen of DaniWeb Administrator Featured Poster Premium Member

Newbies already get a warning message when attempting to post in threads that have been inactive for a long time. The thing is, I am against blatantly disallowing it because I do believe that sometimes it can be helpful. If someone stumbles across a thread from a Google search and has an answer that has not yet been posted, that could be useful to people in the future.

Dani 4,653 The Queen of DaniWeb Administrator Featured Poster Premium Member

Hi,

My apologies for taking so long to reply to this.

DaniWeb itself is written in CodeIgniter, and I love it. I find that it's a very lightweight PHP framework, and I used it to roll my own ORM, etc.

CodeIgniter has a session management library, but you would still need to write all the login functionality, lost password reset, etc. yourself (unless it's built into EA).

How does your sign in work? Do you use native PHP sessions? It should be reasonably easy to implement CodeIgniter's session management into an existing login class within CodeIgniter.

Dani 4,653 The Queen of DaniWeb Administrator Featured Poster Premium Member

Yeah, traffic has been poor recently, as has activity. There was a strong uptick that lasted for a few months after our relaunch back in October, then it plateaued, and now the past few weeks have seen a decline.

I'm not giving up on user matching quite yet. I'm actually doubling down on it right now. We'll see what happens.

Dani 4,653 The Queen of DaniWeb Administrator Featured Poster Premium Member

I just want to comment that I have it set to show Errors, Warnings, Info, and Verbose, so everything. I also have Log XMLHttpRequests checked.

vikashsharma commented: so, have you fixed it ? +0
Dani 4,653 The Queen of DaniWeb Administrator Featured Poster Premium Member

You might be familiar with the dreaded blank page when your PHP script doesn't work.

Here's how to spit out errors to the screen, instead of getting just a blank page, as well as logging errors to a file.
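A minimal sketch (the log path is just an example; in production you'd typically leave display_errors off and rely on the log):

```php
<?php
// Show errors in the browser while developing...
ini_set('display_errors', '1');
ini_set('display_startup_errors', '1');
error_reporting(E_ALL);

// ...and also append them to a log file (path is an example).
ini_set('log_errors', '1');
ini_set('error_log', '/var/log/php/app-errors.log');
```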

Dani 4,653 The Queen of DaniWeb Administrator Featured Poster Premium Member

I'm not quite sure what happened, but at some point fairly recently, Chrome DevTools stopped showing things in the Console.

As a web developer, I obviously frequently use this to debug Javascript. Now, instead, there will be a little red circle with an error count in the top right corner of the DevTools window (as always), so it will recognize when there's an error, but the Console will be empty instead of spitting out what's wrong.

Even if I do console.log('foo'); from within a Javascript file, it won't write to the console.

I'm using what's currently the latest version of Chrome, Version 74.0.3729.169 (Official Build) (64-bit).

Help much appreciated as I don't like Firefox or Safari, but not having access to the Console is not sustainable.

Dani 4,653 The Queen of DaniWeb Administrator Featured Poster Premium Member
DaniWebUser_1 commented: Perfect. Thank you. +0
Dani 4,653 The Queen of DaniWeb Administrator Featured Poster Premium Member

A lot of forums and blogs have links at the bottom of the article to jump to the Previous Post or the Next Post, by way of various forum and blog system PrevNext plugins. More recently, Q&A platforms have been shifting to show a sidebar listing of other similar questions asked. DaniWeb goes this route.

I was wondering if anyone out there has found either of these methods very helpful? When stumbling across a question or topic as a result of a Google search, what is the likelihood that you would have an interest in the question that just so happened to have been asked before or after it?

I understand that the goal here is to increase time on site and user retention, but surely there are some better ways of going about achieving that. You know, ways that actually take the user's tastes or interests into consideration. Or, perhaps, I'm just way too much about data mining.

I guess the benefits are not so much for a Q&A site, but more for a discussion forum about a topic where there might be interest in browsing one topic after the next. What about going the route of infinite scrolling if that's the case?

Out of curiosity, what do the interwebs here think about news sites that do infinite scrolling on articles, such that you click on one article, and just keep scrolling infinitely down seeing one article after the next?

A problem that we definitely suffer from here at …

Dani 4,653 The Queen of DaniWeb Administrator Featured Poster Premium Member

Sorry, I don’t know how to do that off the top of my head. If the page contents have changed, you can use a sitemap file. But I don’t think Googlebot wants your sitemap to contain dead pages. I think you just have to wait for them to naturally come around and recrawl you.

Dani 4,653 The Queen of DaniWeb Administrator Featured Poster Premium Member

Very few new posts if any and rarely a reply to a discussion.

I'm under the impression that this is a reflection of DaniWeb's activity level, as opposed to being a comment on the algorithm itself. The homepage originally just showed the cards we thought you'd have a keen interest in, but there were barely any. So then we changed the algorithm to show seemingly random cards, but at least populate it with something versus nothing.

What would be most helpful is if you were to link me to a few forum threads and say, "Hey, I stumbled across these, they weren't shown on the homepage, and I wish they had been."

Dani 4,653 The Queen of DaniWeb Administrator Featured Poster Premium Member

It would help if you gave us some more information about what it is about Big O notation that you find confusing. Where are you stuck? What, specifically, is confusing you that we can help with?

Zainab_7 commented: how to compute the run time of nested for loop ? +0
Dani 4,653 The Queen of DaniWeb Administrator Featured Poster Premium Member

That robots.txt file looks correct to me. But will not deindex from what I've read.

I think you've misunderstood what I was saying. A robots.txt file, alone, will not deindex. It was imperative that userunfo was able to get all the pages to return a 410 Gone HTTP status. The advantage of robots.txt is that Googlebot won't be able to crawl spammy URLs on a domain, consider them spammy, and negatively affect the quality score of the domain as a whole. Therefore, it helps preserve the integrity of the domain (something that can take months or years to recover) while you figure out how to 410.
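One way to serve that 410 without touching every page is at the web-server level. A hypothetical sketch, assuming Apache with mod_rewrite, and using foo= as a stand-in for the actual spammy query parameter (which was snipped from the original post):

```apache
# .htaccess sketch: answer 410 Gone for any URL carrying the spammy parameter.
RewriteEngine On
RewriteCond %{QUERY_STRING} (^|&)foo= [NC]
RewriteRule ^ - [G,L]
```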

Dani 4,653 The Queen of DaniWeb Administrator Featured Poster Premium Member

And unfortunately that brings us back to how the original post was snipped to remove a crucial part of the question. It had some foul language as well as links to spammy pages, so I'll try to use example URLs instead:

Ex:
https://www.example.com?foo=bar+html
https://www.example.com/?foo=bar.html + many more

I want somehow to block indexing all links from:
"foo="

Check out https://geoffkenyon.com/how-to-use-wildcards-robots-txt/ and you can see there that you can do something like:

User-agent: *
Disallow: *?foo=

(At least, I think so. Please correct me if I'm wrong.)

rproffitt commented: That robots.txt file looks correct to me. But will not deindex from what I've read. +15
Dani 4,653 The Queen of DaniWeb Administrator Featured Poster Premium Member

"using a robots.txt won’t remove pages from Google’s index." was his point and again why I wrote no.

rproffitt, your quote is taken out of context. robots.txt will not remove valid pages from Google's index. If you have a webpage that you want visitors to be able to access, but you simply don't want it to appear in Google's index, then adding it to your robots.txt file won't remove it from the index.

However, when dealing with the issue the OP is suffering from, this is not the solution to the problem. He needs to make sure all the spammy pages no longer exist on his domain, and then use robots.txt immediately in order to do damage control so as to not get hit by a Google algorithm update that doesn't like domains with lots of spammy pages.

Dani 4,653 The Queen of DaniWeb Administrator Featured Poster Premium Member

rproffitt, I still believe that robots.txt is the best solution here.

It seems as if malware has created many spammy pages which have subsequently been indexed by Google. The article you linked to suggests the best way to deindex pages that you want Google to still be able to crawl and want visitors to access. In such a case, I would agree with John Mueller, who is Google's ambassador to the SEO community.

However, I would not recommend that strategy in this case. Basically, the strategy involves manually adding a <meta name="robots" content="noindex"> tag to every page, and then updating the sitemap to tell Google to recrawl the page soon to notice the change.

The problem with doing that, in this case, is firstly, I would hope that the spammy pages have already been permanently removed. Secondly, if for some reason they haven't been, manually modifying every spammy page doesn't seem like a reasonable solution ... if it were that easy, one would just as easily be able to remove the pages altogether or change their content to be non-spammy.

Instead, a robots.txt file is the best way to quickly tell Google not to crawl that section of the site. This is imperative when the pages are spammy, and you don't want Googlebot to ding you for having spammy pages on your domain. If the pages no longer exist, they'll eventually fall out of the index over time; don't worry about that.

Dani 4,653 The Queen of DaniWeb Administrator Featured Poster Premium Member

I think he’s trying to show the format of the links to see if there is regex or something that can be used to mass deindex them. I believe wildcard characters are included. I can’t provide more advice without seeing the format of the original links that were snipped unfortunately.

Dani 4,653 The Queen of DaniWeb Administrator Featured Poster Premium Member

Lol. Spammy.

rproffitt commented: For a moment I thought we were going back to the distant past, to a scene in "The Jolson Story." Not P.C. +15
Dani 4,653 The Queen of DaniWeb Administrator Featured Poster Premium Member

Sorry for not updating the database. I investigated the issue, and it looks like I already had a little comment in my code saying that it was inefficient to do the extra database lookup to undo reputation on-the-fly, and instead we are just letting the cron counters recalculate this number daily.

John_165 commented: Noted :) +0
Dani 4,653 The Queen of DaniWeb Administrator Featured Poster Premium Member

I’ve found press release websites don’t perform as well as they used to.

One good quality backlink can be worth 100 poor backlinks. For top-tier links (e.g. editorial backlinks from sites like the Huffington Post), look into responding to HARO queries on a daily basis.

Dani 4,653 The Queen of DaniWeb Administrator Featured Poster Premium Member

Darn, yes! I'll get to this tomorrow. Promise.