Dani

I've found it's really hard to create Reddit backlinks unless you already have a strong reputation in the subreddit. If one sticks, though, it's definitely valuable!

Olu, what’s your trick with Reddit?

Dani

I think you're referring to storage (hard drive space) and not random access memory (RAM). Using up all of your storage can make your phone run slower. RAM is transient; it's just the memory used by the apps you currently have open. The amount of RAM in use goes down when you close apps or restart your phone.

Dani

Oh, I forgot to mention: from within a member's business card, you can always bypass paying the introductory fee if you know the member's email address. There should be a link there that says something along the lines of, "Do you already know this member? Unlock this member with their email."

Dani

You can message a member by clicking on the Business Card button in their member profile.

If you have never interacted with this member before, and our patented algorithm has detected that you are not a likely match, then there is a small barrier to entry (typically a few cents) to introduce yourself to them. This helps prevent spam.

To find a member, you can search for their username in the top search bar, and a link to their profile should appear in the dropdown.

Alternatively, you can browse our list of Active Members or Top Members.

Dani

Hello and welcome!!

My husband was really into Red Matter on the Oculus as well. Unfortunately, it gave me a bad headache after just a few minutes.

Dani

A friend in the SEO industry (Jim Boykin) used to have a DeLorean he bought off eBay with the license plate TIMECAR. He's since sold it, unfortunately.

Dani

I think I see a running theme here of no one remembering what they did to arrive at a particular website on a particular random day of their life 20 years ago.

Dani

Sounds more than a little heavy-handed but I can sympathize.

It stems from a place of just being very proud of the work he does (he leads the team that is responsible for battery life across all Apple devices), and wanting his household to use the products that he took an active role in working on.

Unfortunately, when I moved to the Bay Area, I left my Dell workstation tower back in NY. For the first bunch of years after moving across the country, I was renting, and just using a MacBook hooked up to external LG monitors. It wasn't until we bought a house a handful of years later, and settled down here, that I brought the majority of my belongings over. At that point I brought over my well-loved Dell, only to be sorely disappointed that it was now outdated, and that my recently purchased M1 MacBook was more performant for my daily tasks. Soooo, I gave the Dell to blud, because I knew he would be able to make really good use of it.

Dani

That was in the mid-1990s so it's a bit fuzzy now.

Nonono. You must be mistaken. DaniWeb was founded in 2002.

Reverend Jim commented: Obviously an early early early adopter.
rproffitt commented: Sorry, the co-conspirator was met in the mid-90s. Their invite was years later.
Dani

What brought you here? How did you find out about DaniWeb?

(It was a Google search, wasn't it?)

Dani

I guess AI is replacing traditional search engine queries?

ChatGPT traffic still doesn't surpass Google, but it's definitely way up there. I believe it's heading in that direction, yes.

Dani

I used to have a Dell Precision workstation that I loved. My husband, however, is an engineering manager at Apple, and he declared that we have an Apple-only household. Alas, the Dell started really showing its age, and my MacBook is actually more performant. I ended up giving the Dell to DaniWeb's sysadmin, blud, who promptly upgraded nearly every component except for the motherboard and CPUs.

Dani

Inquiring minds want to know. What are you working with?

I currently have a 14" MacBook with the M1 Max (2021 model).

Dani

I think people are not understanding what I'm saying here. Please allow me to demonstrate:

Looking at our Google Analytics right now, I can see that, aside from the top search engines such as Google, Bing, and DuckDuckGo, the next biggest place we get traffic from is ChatGPT. Moreover, the average engagement time per session for visitors finding DaniWeb through ChatGPT is more than double that of visitors finding DaniWeb from all other sources.

We publishers are very aware that ChatGPT plagiarizes our content, and we don't like it. Similarly, we are aware that Google plagiarizes our content, and we don't like that either. But ultimately it's a symbiotic relationship: in return, ChatGPT gives us a good amount of quality web traffic we can't get from anywhere else, and Google gives us nearly all of our web traffic.

Poisoning ChatGPT isn't going to solve any problems. Rather, put your energy toward finding a way for publishers like DaniWeb to earn an income without being dependent on ChatGPT and Google.

SCBWV commented: Wow! I find it surprising most of your traffic comes from ChatGPT. I guess AI is replacing traditional search engine queries?
Dani
  • Self-driving cars
  • Robotics
Dani

Hello and welcome to DaniWeb! :)

Dani

If you used it only a few days ago, how do you know it works? Did you see your Google traffic increase so quickly after that?

Dani

Hello!!

Dani

Firstly, I'd suggest you edit your post to remove your real email address from the code.

I have replaced their email address with a dummy email.

Sanitise your subject and/or message. You can't in general feed whatever was typed into your dialog directly into mail.

Salem is referring to passing each of those variables through htmlspecialchars() to make sure there isn't any HTML or JavaScript embedded in the strings that could potentially be executed. Alternatively, you can use PHP's filter_var() function to sanitize user input.
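As a minimal sketch of both approaches (the $_POST field names here are hypothetical):

    <?php
    // Hypothetical form fields, for illustration only.
    $subject = $_POST['subject'] ?? '';
    $email   = $_POST['email'] ?? '';

    // htmlspecialchars() escapes HTML special characters so any embedded
    // markup or JavaScript is rendered inert when echoed back into a page.
    $safeSubject = htmlspecialchars($subject, ENT_QUOTES, 'UTF-8');

    // filter_var() validates the input; it returns false if the email is invalid.
    $safeEmail = filter_var($email, FILTER_VALIDATE_EMAIL);
    if ($safeEmail === false) {
        exit('Invalid email address.');
    }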

Dani

Google has said that they plan to remove the tool because it's consistently done more harm than good for the majority of sites that have used it.

Dani

There's a simple answer: DaniWeb doesn't have an account on Bluesky because I run DaniWeb and I've never heard of Bluesky.

We do, however, have an account on X. I have been on Twitter since 2008, and on Facebook since even before then. Aside from those two, which we've had forever, I don't see the point in having accounts across all the different social media platforms. Way too much to keep track of :)

Dani

Hello and welcome to DaniWeb!

Dani

Hello and welcome to DaniWeb!! Thanks for joining. What got you interested in opening an eCommerce store?

Dani

Something tells me that you are going to post a recommendation that meets those exact criteria. Please don't. We don't tolerate promotional content here.

Dani

Hello and welcome!! Thank you for joining us.

Dani

Hi Jon Wilson as well!

Dani

As to updates, is there a way to disconnect this from fascist run companies like X (twitter)?

Use Mastodon?

rproffitt commented: For the moment it's Bluesky.
Dani

I would begin by making sure that you sanitize input passed in via $_POST. This ensures that someone can't pass in something like $_POST['oneway'] = '</p>Foo!<strong>Blah</strong>' and completely screw up your HTML, or, worse yet, inject JavaScript into your HTML.

    echo '<p><b>One Way:</b> ' . htmlspecialchars($oneway) . '</p>';
    echo '<p><b>Return:</b> ' . htmlspecialchars($return) . '</p>';

Can you please clarify what isn't working as intended? I don't see anything in your code related to PHPMailer, so I'm a bit confused.

Why do you have <form action="#errors">? Typically, the form action would be a PHP page that processes the form.

What is the form trying to accomplish? Currently, we have these two POST fields being passed into this PHP page, and we're using them to set the values of a form that loads on the page. For what purpose? Is there a different form that is used to pass in the values of oneway and return?

Dani

I heard about the "ignore all previous instructions" trick when ChatGPT first came out, but I don't know how effective it is anymore; I suspect not very. I haven't heard of anything related to Tiananmen Square or Holocaust Remembrance Day, and I don't see how those would be effective at all.

rproffitt commented: I've tried all 3 methods on deepchat, deepai and they work fine. That is, reveal what state is involved in the software.
Dani

That is the area in Google Search Console where you can check whether you have been hit by a manual penalty. A manual penalty is when a Google employee specifically looks at your website, and all the backlinks pointing to it, to make sure your site is not engaged in any funny business. In other words, they check whether you're violating any Google policies, such as paying for backlinks or using obvious black-hat techniques.

However, just because you have no manual penalties doesn't necessarily mean that Google loves, or even likes, your site. It's very rare to get a manual penalty, because almost all "penalties" are algorithmic, meaning they're factored into Google algorithm updates rather than a human manually flagging your website for review. Google does not let you know if you've been hit by an algorithmic penalty. They also don't let you know if your website has simply been knocked down in the rankings by a core update, or by one of their other 300+ algorithm updates per year. You may have heard of the Google Panda and Google Penguin algorithm "penalties"; today, those are built into Google's core updates.

MasoodDidThat commented: So the gist is there is no method to check for an algorithmic penalty from Google... one can only infer it from ranking and traffic changes on their website
Dani

Hello. :-P

Dani

I think I'll add that, when it comes to core web vitals, the most important thing is to ensure you have fast-loading JavaScript, stylesheets, images, etc. It doesn't really matter what language is on the backend, as long as it's coded efficiently and loads fast. You might also wish to use a performance CDN such as Cloudflare or Fastly to cache your content.
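As a minimal illustration of the caching side (this assumes the asset is routed through PHP; the file path is hypothetical), long-lived Cache-Control headers are what let a CDN serve your static assets from the edge:

    <?php
    // Hypothetical example: serve a stylesheet with long-lived caching headers
    // so a CDN such as Cloudflare or Fastly can cache it at the edge for a year.
    header('Content-Type: text/css');
    header('Cache-Control: public, max-age=31536000, immutable');
    readfile(__DIR__ . '/assets/styles.css');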

MasoodDidThat commented: Thanks, that was helpful
Biiim commented: Basically what I would say; Node.js is FAST when I was playing with it. Java I hate, but I don't think it would have to be slow if you program it right.
Dani

This system doesn't allow replies in the way that Reddit and such do. Odd, but you own this.

DaniWeb was started back in 2002, at a time when flat forums were far more popular than threaded ones; threaded discussions were more closely associated with bulletin board systems and Usenet. Even though DaniWeb actually began as a threaded forum, we found that early users were confused by anything beyond a simple question followed by a flat list of responses.

Fast forward a couple of decades, and sites like Reddit have successfully introduced threaded discussions to Gen Z. Still, it didn't seem quite right for us to follow suit and reintroduce threaded discussions here, especially as we're primarily a Q&A site. I would say flat forums sit somewhere between Stack Overflow, where one question is followed by answers that are never meant to engage with one another, and Reddit, where one question is followed by follow-up questions and answers of infinite depth.

rproffitt commented: Dare I? sure. ok boomer.
Dani

OpenAI rips content, no one bats an eye. DeepSeek does the same: "They are ripping off our work."

I don't know why you think that. In the SEO publishing industry, we publishers have been complaining, very vocally, for at least two years now that OpenAI, Google, etc. have been stealing our content.

I think the difference is that, as I pointed out in my previous post here, we publishers have a symbiotic/codependent relationship with OpenAI, Google, etc., because those services send us the majority of our web traffic.

When it comes to some random Chinese company that we aren't relying on for our own business model, we can take action to shoo them away without repercussions. We can't afford to do that with OpenAI.

Turning away AI spiders isn't a technical problem at all, which is why I don't understand your whole poisoning-with-gibberish idea. For us publishers, it's a business problem, not a technical one.

rproffitt commented: Also: OpenAI Claims DeepSeek Plagiarized Its Plagiarism Machine
Dani

I understand that's what you deployed, rproffitt. My question was directed to Masood.

rproffitt commented: This system doesn't allow replies in the way that Reddit and such do. Odd, but you own this.
Dani

Bitcoin and Ethereum are the biggies.

Here's a comprehensive list.

Dani

I think you might be confusing Java with JavaScript. Is your website built with JavaScript technologies such as Node.js?

rproffitt commented: "Tomcat, web server and servlet container that's used to host Java-based web applications." That's what we deployed. +17
MasoodDidThat commented: Yes Dani, By Java i meant Javascript, my website is hosted by Vercel server +0
Dani 4,653 The Queen of DaniWeb Administrator Featured Poster Premium Member

"Kiss my shiny metal ***"

Seriously?!

rproffitt commented: OpenAI rips content, no one bats an eye. DeepSeek does the same, "They are ripping off our work."
Dani

To Pebble's point, I genuinely believe that the **** spewed in the first post of this thread is no more sophisticated than those chain messages circulating on Facebook that tell you to copy and paste a sentence like "I don't give Facebook the authority to blah or the copyright to blah" into a post, thinking it will be legally binding.

rproffitt commented: Today it's clear that "Rule Of Law" is fantasy south of Canada
Dani

As Harry points out, my first guess would be to check robots.txt and ensure you aren't blocking any pages from Semrush. Also make sure you aren't using a CDN or proxy like Cloudflare that is blocking it on their side.
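For example, a leftover rule like this in robots.txt (a hypothetical snippet) would block Semrush's crawler, SemrushBot, site-wide:

    # Hypothetical robots.txt rule that blocks Semrush's crawler entirely
    User-agent: SemrushBot
    Disallow: /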

Dani

Hi and welcome to DaniWeb! Thanks for joining.

Dani

Many places ban or remove AI-generated content.

We are one of them! :)

Dani

If Biim provided a working solution, I'll mark this question as solved.

Dani

The creator of Nepenthes says that it is ineffective against OpenAI which I take to mean that OpenAI is ignoring robots.txt.

As mentioned, Nepenthes uses the spoofing technique. Spoofing does not rely whatsoever on bots following robots.txt.

Dani

The OpenAI bot appears to be a bad bot.

Specifically, I would bet quite a large sum of money that the people complaining they can't get OpenAI to respect their robots.txt file either have a syntax error in the file or aren't naming the correct user agents. I've seen people mistakenly try to reference a user agent called "OpenAI"! https://platform.openai.com/docs/bots/
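For reference, a minimal robots.txt that names OpenAI's documented user agents (GPTBot, ChatGPT-User, and OAI-SearchBot, per the page linked above) might look like this:

    # Block OpenAI's model-training crawler
    User-agent: GPTBot
    Disallow: /

    # Block the on-demand fetcher used when ChatGPT users request a page
    User-agent: ChatGPT-User
    Disallow: /

    # Block the crawler behind ChatGPT's search features
    User-agent: OAI-SearchBot
    Disallow: /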

Dani

The OpenAI bot appears to be a bad bot.

This is not my experience. OpenAI respects my robots.txt file perfectly. I do want to add, though, that robots.txt files are very finicky, and I have seen people blame the bots many, many times when the problem lies with a syntax or logic error in their robots.txt.

Nepenthes and Iocaine do not spew garbage across the web. They feed garbage to bots that access the protected sites.

The technique you're referring to is called spoofing: serving one set of content to certain user agents or IP addresses, and a different set of content to everyone else. It's still spewing garbage across the web. That garbage is being fed into Google. Into Wikipedia. Into the Internet Archive. Into ChatGPT. And, ultimately, it will end up being consumed by innocent users of the web.
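A minimal sketch of what that spoofing looks like in practice (the file names and bot pattern are hypothetical, and this is an illustration of the technique, not a recommendation):

    <?php
    // Spoofing, as described above: serve different content depending on
    // the requesting user agent. File names here are hypothetical.
    $ua = $_SERVER['HTTP_USER_AGENT'] ?? '';

    if (preg_match('/GPTBot|CCBot|Bytespider/i', $ua)) {
        readfile(__DIR__ . '/generated-garbage.html'); // what the targeted bots see
    } else {
        readfile(__DIR__ . '/real-article.html');      // what everyone else sees
    }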

The creator of Nepenthes says that it is ineffective against OpenAI which I take to mean that OpenAI is ignoring robots.txt.

I would say it's ineffective against OpenAI because OpenAI can detect that the content thrown at it is nonsensical, and/or that it's being served spoofed content, and it chooses to actively ignore it.

Dani

When you price and design a site for an expected human load, and then you get overwhelmed by bots, you can throw more money at it or you can take action against the bots.

It's true that the majority of websites on the Internet today spend more bandwidth on bots than on human visitors. However, there are both good bots and bad bots, and they are not created equal.

In my meagre understanding of all things web related, robots.txt is supposed to specify which pages of a website should be crawled or not crawled by bots.

This is true. The primary difference between good bots and bad bots is that good bots respect your robots.txt file, which dictates which parts of your site a specific bot is allowed to crawl and how often it may crawl them, while bad bots tend to ignore the file.

However, that doesn't mean it's impossible to tame bad bots. Bad bots (and even good bots) can easily be tamed by serving them the appropriate HTTP status code. Instead of a 200 OK, you would send a 429 Too Many Requests to indicate a temporary block, or a 403 Forbidden if your intent is to permanently block the bot.

Good bots (and even most bad bots) tend to understand the intent of the status codes (e.g. 429 means try again later, but at a slower crawl speed), and, either way, you …
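A minimal sketch of that status-code approach (both helper functions are hypothetical stand-ins for your own block list and rate-limit check):

    <?php
    // Taming bots with HTTP status codes, as described above.
    // is_blocked_bot() and is_over_crawl_budget() are hypothetical helpers.
    if (is_blocked_bot($_SERVER['HTTP_USER_AGENT'] ?? '')) {
        http_response_code(403);      // 403 Forbidden: permanently block this bot
        exit;
    }
    if (is_over_crawl_budget($_SERVER['REMOTE_ADDR'])) {
        http_response_code(429);      // 429 Too Many Requests: temporary block
        header('Retry-After: 3600');  // well-behaved bots retry later, more slowly
        exit;
    }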

Dani

Sorry, I'm confused. Do you want to decode the JSON in PHP or in JavaScript?

Dani

If you're not a part of the solution, you're a part of the precipitate.

I think this sounds terrible. The global population is relying more and more on AI to serve up accurate answers. There's already the gigantic problem of hallucinations, as well as AI consistently spewing out false information that sounds entirely believable, and thereby spreading misinformation.

How is making the problem worse going to help with your mission of turning the world into a better place?

rproffitt commented: AI appears to be making things worse. Better for the robber barons, not so much for us.
Dani

I’m not nearly as much of a conspiracy theorist. I also don’t think that spamming Facebook with nonsensical posts is going to make the world a better place.