Start by reading SearchEngineWatch.com and SearchEngineLand.com
Join and participate in the forums over at WebmasterWorld.com
Those are a great start :)
Anyone out there use Hootsuite or Buffer or one of the others? Are you an agency or just manage your own social?
I have no idea what type of counting code you need, nor how that would be related to Bootstrap (the CSS framework). Please take more time to help us help you.
It sounds like the Kadence theme has its own additional CSS that is overriding some of the CSS you're using for your custom header.
IMHO all businesses in this day and age should be using some form of digital marketing, even if it's just being on Yelp or having a website. Are you posing this question to the community, or just sharing what digital marketing is? I'm confused by the point of your post.
Python works as a first language, but I would also recommend Javascript just because it is being used just about everywhere nowadays!
However, I guess the bigger question is, as a math major, what's your objective in learning programming? If you want to make web apps, you might take a very different path than if you're doing some backend calculations for something or other.
Either way, have you taken any computer science courses? Specifically, I would recommend an intro to data structures course.
Are you implying that a single person would constantly rotate across positions involving software architect, coder, writing documentation, and doing QA? I'm not a fan of this strategy. Firstly, different people have different strengths and weaknesses. You want to put each person in a position in which they will thrive at doing what they're best at. A good coder isn't necessarily a good software architect. A good software architect isn't necessarily good at English grammar and writing documentation. I think, instead, you want each single person you hire to do one specific thing, excel at that one specific thing, and be better than anyone else at that one specific thing.
I have coded and maintained DaniWeb by myself for twenty years. I have always tried to write code that adheres closely to standards. I try to make my code easy to read, and document it well enough that, should I need to change something, I can easily figure out why I did it the way I did 3 or 5 years ago, even if I no longer remember.
Begin by going to Search Console and setting up a new property, or bringing up an existing property you have already set up.
Click on Sitemaps in the left sidebar, and add a new sitemap URL. My recommendation is to create a different sitemap file for each section of your website, and then submit each of the sitemap files individually. This way you can keep track of indexing for each section of your site, and determine if and why any sub-sections of your site are performing inadequately in Google Search. I would also create a sitemap index file that lists out all of the sitemap files you're using, and then add that to the bottom of your website's robots.txt file like so:
Sitemap: https://www.mysite.com/sitemap.xml
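For reference, a sitemap index file is just a short XML file pointing at your individual sitemaps, something like this (the file names here are placeholders):

<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
    <sitemap>
        <loc>https://www.mysite.com/articles-sitemap.xml</loc>
    </sitemap>
    <sitemap>
        <loc>https://www.mysite.com/forums-sitemap.xml</loc>
    </sitemap>
</sitemapindex>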
Your errors are showing undefined variables on lines 821 and 825 of VK.php, and then a fatal error on line 267 of fpdf.php. What code is on and around those lines?
I’m glad you were able to recover your important files.
Sounds like a lot of work either way.
Digital marketing involves both a lot of work and a lot of skill. That's why it's a multi-billion dollar industry. I did forget to add that, separate from SEM, another huge arm of digital marketing is social media marketing. That, as well, has both organic and PPC branches, but involves a skillset much different from the one involved in SEM. I have a lot of experience with SEM, but very, very minimal experience with social media marketing.
Organically means that the traffic came from Google's search results organically, as opposed to paid or sponsored with Google Ads, Facebook Ads, etc.
With paid ads, you're pretty much on a hamster wheel constantly having to fund the beast to keep up your website's traffic levels. On the other hand, building traffic organically means figuring out a way to rank highly in search engines for a particular set of keywords. Typically this involves a significant amount of upfront cost and effort, in exchange for long term gains. However, it's important to keep on top of it, and continue to invest in your strategy/strategies over time. It's definitely playing the long game, and can take anywhere from 6 months to 2 years to see a return on your investment. With paid ads, there's definitely a lot of upfront effort involved there as well, plus a lot of skill required, but you can start seeing gains almost immediately. You'll also be continually paying into it. The amount you will be paying will be proportional to your skill and effort invested.
This isn't really the place for any of this, though, since your question is essentially Digital Marketing 101, and this is the digital marketing forum, so if you have additional questions, please start a new thread.
However, as a very basic general overview:
Digital Marketing: Using technology to further your marketing efforts, typically on the Internet
Search Engine Marketing (SEM): Using search engines to further your marketing efforts
Search Engine Optimization …
Hello there! I can try my best to help you as I'm pretty good with this kind of thing. However, I'm a bit confused about what you're referring to. I'm on your website's homepage, and I don't know what swipe effect you're referring to. I'm on a macOS laptop using Chrome.
I wish I had the answer to your question as well!!
Here's an article that should help you: https://www.geeksforgeeks.org/how-to-generate-pdf-file-using-php/
Please let me know if there's something specific in that article that is confusing you, and I can try my best to explain it in better detail.
Ugh, I'm so sorry that happened. I've had that happen with two different external hard drives over the years ... both were LaCie drives from a very long time ago. Unfortunately, I never took any steps to try to recover them, and lost the data on them at the time. I hope you're able to figure something out in this case.
OK, so I have some more time right now to explain in greater detail. When you post a tweet, your app is basically acting on behalf of an end-user. Twitter, or X, or whatever they call themselves now, needs your end-user to go through the process of authenticating and then authorizing your app to tweet on their behalf ... kinda like Sign In with Facebook or Sign In with Google type of thing.
So an end-user of your app will click a link within your app to connect your app to Twitter. They will then be redirected to Twitter to authenticate (i.e. log into their Twitter account, if they aren't already logged in). They'll then be asked to authorize your app to tweet on their behalf. They'll click accept, and it will redirect back to your app along with an access token. Your app can then use that access token in the cURL request to tweet.
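As a rough sketch of that final step (assuming you've already obtained $accessToken through the OAuth flow described above), posting the tweet is just a JSON POST to the v2 endpoint:

<?php
// Post a tweet on behalf of the user who authorized your app.
// $accessToken is assumed to be the OAuth 2.0 user access token from the flow above.
$ch = curl_init('https://api.twitter.com/2/tweets');
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, json_encode(['text' => 'Hello from my app!']));
curl_setopt($ch, CURLOPT_HTTPHEADER, [
    'Authorization: Bearer ' . $accessToken,
    'Content-Type: application/json',
]);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$response = curl_exec($ch);
curl_close($ch);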
It seems as if you have to use OAuth to authenticate on behalf of your user (or whomever you want your application to post tweets on the behalf of). How did you generate the authorization bearer_token? It seems as if that's just a token to authenticate on behalf of your app, not on behalf of an end-user of your app/Twitter, which is what you need to do here.
When I go here I see that they offer four different authentication methods. It looks like OAuth 1.0a as well as OAuth 2.0 Authorization Code Flow with PKCE are the two that allow you to authenticate on behalf of a Twitter account.
Do you have experience with using OAuth to authenticate on behalf of an end-user?
Does the web server have an updated SSL certificate installed?
I wholeheartedly agree with cored0mp when it comes to library support. As a PHP'er, it's been a lifesaver for me knowing that for anything I might need, and any popular API I might want to use, I can find documentation and extensive support. That has held true for everything I've tried to do over the past 20 years. I'm not so sure I could say the same for Python.
If you hadn't said anything, I would have assumed an ice-free driveway meant one of those heated driveways that melts the ice and snow so you don't have to shovel your car out. But that's the New Yorker in me!
I say 2024 is going to be the year of even more AI.
Happy new year!!
Enjoy them in good health :)
What is a good way to make sure all the pages are in Google?
Create a sitemap XML file that lists all of the pages you want indexed in Google. Use noindex meta tags for thin content (navigational pages, login pages, etc.) that wouldn't be a great experience for Google searchers, or that don't have enough high quality, unique content on them. Build backlinks from other reputable sites in your niche to various pages on your site.
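For reference, the noindex tag is just a one-liner in the <head> of each thin page:

<!-- Tells crawlers not to include this page in their index -->
<meta name="robots" content="noindex">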
I briefly answered your question here by saying:
To prevent cross-site scripting attacks, you similarly want to make sure that all text derived from user-generated input is HTML escaped. If using PHP, you can use the htmlspecialchars() function.
It really is that simple, if you're using PHP. I do see you tagged this code Javascript, so here's a quick JS function I found on this page of SO that you can use:
function escape(s) {
    let lookup = {
        '&': "&amp;",
        '"': "&quot;",
        '\'': "&#39;",
        '<': "&lt;",
        '>': "&gt;"
    };
    return s.replace(/[&"'<>]/g, c => lookup[c]);
}
console.log(escape("<b>This is 'some' text.</b>"));
As you can see, all it takes to escape HTML code is to replace &, ", ', < and >. Ampersand, double quote, single quote, less than, and greater than are kinda like reserved characters in HTML. Once you escape those, no one can inject user-generated content that executes HTML or JS you don't want on your page, which is what an XSS attack is.
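And for the PHP route, it really is a one-liner (assuming $userInput holds the user-generated string):

// ENT_QUOTES escapes single quotes in addition to &, ", < and >
echo htmlspecialchars($userInput, ENT_QUOTES, 'UTF-8');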
It has to do with the scope of a javascript variable. It means that the variable is hoisted (aka “lifted up”) to the top of the code block it’s in.
This has nothing to do with web hosting, and is only relevant if you are a javascript developer.
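A quick contrived sketch of what hoisting looks like in practice:

console.log(x); // logs undefined (no error) because the declaration of x is hoisted
var x = 5;
console.log(x); // logs 5

// The above behaves as if it were written:
// var x;
// console.log(x);
// x = 5;
// console.log(x);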
Google now considers my robots.txt as valid. Same one I had from the beginning.
Your .htaccess file would look something like:
RewriteEngine On
RewriteRule ^([a-z]+)-([a-z]+)-([a-z]+)$ news.php?slug=$1-$2-$3 [L]
This is untested, but it should work.
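On the PHP side, news.php can then grab the slug out of the query string. A minimal sketch, where get_article_by_slug() is just a hypothetical placeholder for however you look up your articles:

<?php
// news.php -- the rewrite rule above passes the URL path in as ?slug=...
$slug = $_GET['slug'] ?? '';

// get_article_by_slug() is hypothetical; substitute your own lookup code
$article = get_article_by_slug($slug);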
If I were you, I would also test having only one sitemap index file in robots.txt, and removing the comments (maybe something weird happens in their URL parser, so I would test once with the file as simple as possible). Of course, making small edits and rechecking takes time because you have to request a recrawl.
Fortunately, requesting a recrawl through Search Console takes only moments. Unfortunately, removing the comments didn't work, having just one sitemap file didn't work, and changing that one sitemap file to point to a sitemap instead of a sitemap index didn't work either.
Why do you have your sitemaps in your robots.txt file? Don't you declare them explicitly in Google Search Console?
Sitemaps are a valid part of the robots.txt protocol, and used not just by Google, but also by Bing, DuckDuckGo, and other smaller engines.
The best way to prevent SQL injection attacks is to make sure that all strings passed into SQL queries, especially if they are derived from user-generated input, are properly escaped. If using MySQL with PHP, you can use the mysqli::real_escape_string() function. Other databases have equivalent functions.
To prevent cross-site scripting attacks, you similarly want to make sure that all text derived from user-generated input is HTML escaped. If using PHP, you can use the htmlspecialchars() function.
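For example, a minimal sketch with mysqli, assuming $mysqli is your existing database connection and the username comes straight from a form:

<?php
// Escape the user-generated string before interpolating it into the query
$username = $mysqli->real_escape_string($_POST['username']);
$result = $mysqli->query("SELECT * FROM users WHERE username = '$username'");

(Prepared statements with bound parameters are another way to accomplish the same thing.)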
The new robots.txt tester that is now built into Google Search Console is finding errors with the last two rows of my robots.txt file:
Sitemap: https://www.daniweb.com/latest-sitemap.xml
Sitemap: https://www.daniweb.com/sitemap.xml
The error is: Invalid sitemap URL detected; syntax not understood
Any idea what could be wrong here?
Use a CDN to load your images and fonts. I recommend Cloudflare, which has a free version, but you can also use Fastly or one of the other popular options out there. Also, limit the amount of heavy Javascript you're using.
I don't understand what Alicia means by unique characters. What makes those characters unique? Are they reserved?
What does SAP stand for India?
Probably the same thing it stands for in other countries.
Hi Julia and welcome to DaniWeb!
You can link to your project from within your post signature, which can be set up here.
We used to have a Share Your Projects forum a long time ago, but unfortunately it became a huge spam fest, where people didn't really care about receiving feedback; they just wanted to quickly advertise or get a link for SEO. It ended up becoming increasingly frustrating for members, because people would take the time to reply, and the poster would never return, since they didn't really care about being a part of the community once they got their link. So that's why we ultimately got rid of it and made DaniWeb completely a no-soliciting space (whether your website is paid or free). Sorry for the rant :)
Soooo, on that note, I do have a question. Are you a dev yourself? When you said you assembled a squad of devs, did you hire a consulting/dev shop? A bunch of freelancers? What was the hiring process like? How did you decide what the initial dev cost investment should be before even having a product?
Another solution might be to just show smaller, low resolution versions of images, and only show the full resolution if the person has paid for it.
It appears this Google bug has been resolved. For me, anyways.
Aside from everything posted above, sometimes large changes to a complex website can take 6-9 months to be recognized by Google.
Manu_18, can you show your code? Better yet, start a new topic and show your code and the error message. This way your question doesn't get lost inside a 2 year old topic.
Do you do drop shipping?
While Google officially states that they don't follow 'nofollow' links
I'm actually going to correct you there for anyone else who stumbles upon this thread. Google announced a year or so ago that "nofollow" went from being a directive that they will always follow to instead being a hint that they will take into consideration, but might ignore (and therefore follow the link) if there are other signals indicating it might be a good URL to crawl.
The best way to ensure they don't follow your internal nofollow links is to ensure that all internal links to the URL you don't want them to crawl are nofollow'ed, and that there are no external links on any other sites pointing to the URL either.
Basically it's the time that the web browser spends converting the front-end HTML, javascript, and CSS into something that the end-user sees. You want to make sure not to block it with javascript plugins and such that take time away from the web browser showing something to the user.
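For example, marking your script tags as deferred lets the browser keep rendering while the script downloads (analytics.js is just a placeholder here):

<!-- defer: download in parallel, execute only after the HTML is fully parsed -->
<script src="analytics.js" defer></script>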
Hi everyone new!
Does anyone have anecdotal evidence for or against Google following internal nofollow links? (Irrespective of what they claim)
Do you find that most of your clients are at least already using a CSS framework or Sass, etc?
I, personally, use Bootstrap.
The only thing I changed 2 weeks ago (per this thread) was Google Analytics from async to deferred.
Now, Desktop is passing and Mobile is failing (the opposite of before). I really don't think it has anything to do with the change I made. As I alluded to before, it is constantly in flux based on the geographic locations of the visitors Google sends to us that month.