About 1,000 results for robots.txt - Page 1
“WHEN WILL I GET MY ROBOT?!”
Community Center
1 Week Ago
by Johannes C.
…his ['Trumpet Player Automaton'](https://cyberneticzoo.com/robots/1810-automaton-trumpet-player-friedrich-kaufmann-german/) … on YouTube, software constraints still prevent robots from operating autonomously in unstructured environments where… less efficient than biological muscles. This limits robots' ability to perform precise, fluid movements. …
Re: “WHEN WILL I GET MY ROBOT?!”
Community Center
1 Week Ago
by Reverend Jim
Never mind the robots. They've been promising me flying cars since 1950. Where is my flying car? ![Flying_Cars.jpg](https://static.daniweb.com/attachments/3/6e2832c60bf9dbca44507231ce074d07.jpg)
Re: Anti-Hotlinking Script for WP on Apache (.htaccess) - Linkspam Prevention
Programming
Web Development
3 Weeks Ago
by amastyteam
Thank you for such a useful post!
Re: Robots for agricultural harvesting
Community Center
1 Month Ago
by QuintinFields
That's pretty cool stuff with agricultural robots! It's amazing how these machines are changing the game … better over time! Therefore, I hope that in the future robots will replace humans in all complex physical jobs.
Re: Do backlinks to disallowed URLs help the domain overall?
Digital Media
Digital Marketing
Search Engine Strategies
1 Month Ago
by Chris Hüneke
… signals work, particularly regarding URLs disallowed by robots.txt: Impact of Backlinks to Disallowed URLs: URLs disallowed… by robots.txt can still accumulate PageRank from incoming backlinks. …to the disallowed URLs. While URLs disallowed via robots.txt can't directly pass PageRank to other pages …
Anti-Hotlinking Script for WP on Apache (.htaccess) - Linkspam Prevention
Programming
Web Development
1 Month Ago
by Chris Hüneke
… integrated into this script to put the **whitelist inside a .txt file**. This also was easier for the client, as he… nohotlink.html to the main page, tested with a noindex, nofollow robots tag, but it was all a mess. If anybody is…
Extract and Count Reviews/AggregateRating Script (Python)
Digital Media
Digital Marketing
Search Engine Strategies
1 Month Ago
by Chris Hüneke
…. Output: The total reviews per category are saved in result.txt. ## Requirements ## Python libraries: requests, beautifulsoup4, re. ## Special Notes ## Review…(f"{category_name}: {total_reviews} reviews\n") with open('result.txt', 'w', encoding='utf-8') as file: file.writelines(results)…
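The output step quoted in the snippet above can be reconstructed as a small runnable sketch. The category names and counts below are made-up placeholders, not data from the original script; only the formatting and `result.txt` writing follow the snippet.

```python
# Sketch of the script's output step: totals per category are formatted
# one per line and written to result.txt. The totals dict is hypothetical.
totals = {"Laptops": 128, "Phones": 342}

results = [f"{category_name}: {total_reviews} reviews\n"
           for category_name, total_reviews in totals.items()]

with open("result.txt", "w", encoding="utf-8") as file:
    file.writelines(results)
```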
Extract Schema.org Data Script (Python)
Digital Media
Digital Marketing
Search Engine Strategies
1 Month Ago
by Chris Hüneke
… prompted. 3. Output: The extracted data is saved in schema_data.txt. ## Features ## * Extracts JSON-LD data from webpages. * Identifies and…file def save_to_file(url, title, schema_types, formatted_data, field_count, filename='schema_data.txt'): try: with open(filename, 'w', encoding='utf-8') as …
Python Script for scraping Schema.org JSON Code from a specific URL
Digital Media
Digital Marketing
Search Engine Strategies
1 Month Ago
by Chris Hüneke
… prompted. 3. Output: The extracted data is saved in schema_data.txt. ## Features ## * Extracts JSON-LD data from webpages. * Identifies and counts…
Re: Do backlinks to disallowed URLs help the domain overall?
Digital Media
Digital Marketing
Search Engine Strategies
1 Month Ago
by sgtamilan
No, backlinks to disallowed URLs do not benefit your domain. Search engines ignore these links, so focus on building quality backlinks to allowed pages for better overall domain authority and ranking.
Re: Should we noindex our JSON files?
Digital Media
Digital Marketing
Search Engine Strategies
1 Month Ago
by Chris Hüneke
Yes, it’s generally a good practice to set JSON files that are not meant to be displayed as standalone content to noindex. This prevents these files from being indexed, still allowing Google to see and index the rendered content generated by these JSON files. This way, you ensure that users find the relevant content without having irrelevant …
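One way to apply noindex to JSON files (an illustrative sketch, assuming an Apache server with mod_headers enabled, as in the .htaccess threads elsewhere in these results — robots meta tags can't be placed inside JSON itself) is the X-Robots-Tag response header:

```apache
# Hypothetical Apache snippet (requires mod_headers): send an X-Robots-Tag
# header so standalone .json files stay out of the index, while the rendered
# pages that consume them remain indexable.
<FilesMatch "\.json$">
    Header set X-Robots-Tag "noindex"
</FilesMatch>
```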
Re: Extract and Count Reviews/AggregateRating Script (Python)
Digital Media
Digital Marketing
Search Engine Strategies
1 Month Ago
by Chris Hüneke
Sorry, I got that wrong. I can't post scripts in the Digital Marketing section, right?
Re: Extract and Count Reviews/AggregateRating Script (Python)
Digital Media
Digital Marketing
Search Engine Strategies
1 Month Ago
by Dani
No, just in Web Development, but you can use the seo tag.
Re: Extract and Count Reviews/AggregateRating Script (Python)
Digital Media
Digital Marketing
Search Engine Strategies
1 Month Ago
by abuislam
Great script! It’s a practical solution for aggregating review counts across categories. Automating the process for accurate Schema.org markup can definitely improve SEO and visibility. Thanks for sharing!
Re: How to enable gpedit on Windows 10 & 11
Hardware and Software
Microsoft Windows
1 Month Ago
by abuislam
If Windows 11 update KB5023706 is causing BSODs, you can enable the Group Policy Editor (gpedit.msc) on Windows Home with a script. Save it as enable-gpedit.bat, run it as admin, and reboot to manage updates more effectively.
Re: Long Paths/File names/Ext's
Programming
Software Development
1 Month Ago
by rory jackson
Hi Mr Freaky, yeah, it can be a real headache. I tried the LongPath Tool program to good effect. Thanks, Rory
Re: Extract Schema.org Data Script (Python)
Digital Media
Digital Marketing
Search Engine Strategies
1 Month Ago
by Chris Hüneke
Sorry, I got that wrong. I can't post scripts in the Digital Marketing section, right?
Re: Python Script for scraping Schema.org JSON Code from a specific URL
Digital Media
Digital Marketing
Search Engine Strategies
1 Month Ago
by suwaidi
API response as a JSON string. Finally, that string can be parsed into a Python dictionary through json.
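The step described in the snippet above can be shown in a minimal sketch: a JSON string (for example, an API response body) is parsed into a Python dictionary with `json.loads`. The sample string below is a made-up Schema.org-style fragment.

```python
# Minimal sketch: parse a JSON string into a Python dict with json.loads.
import json

response_text = '{"@type": "Product", "aggregateRating": {"ratingValue": 4.5}}'
data = json.loads(response_text)

print(data["@type"])                           # Product
print(data["aggregateRating"]["ratingValue"])  # 4.5
```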
Robots.txt
Digital Media
Digital Marketing
Search Engine Strategies
17 Years Ago
by SuperTodd
I don't want any of the pages to be blocked from the search engines. Do I need to place a blank robots.txt file? Please advise. Thanks
Re: Robots.txt
Digital Media
Digital Marketing
Search Engine Strategies
17 Years Ago
by stymiee
No. Only place a blank robots.txt if you don't want 404 errors in your logs/stat software. Otherwise there is no need for one.
Re: Robots.txt
Digital Media
Digital Marketing
Search Engine Strategies
17 Years Ago
by Harry smith
If you want each and every page of your site to be indexed by every spider, you can use a simple two-line code in your robots.txt: User-agent: * Disallow: Here * indicates all spiders are allowed to crawl, and the Disallow section is left empty; this indicates every portion of the site is free for them to crawl.
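The effect of that two-line "allow everything" file can be checked with Python's standard-library `urllib.robotparser` (a sketch; the URL is a placeholder):

```python
# Verify that "User-agent: *" with an empty Disallow permits any crawler
# to fetch any URL, using the stdlib robots.txt parser.
import urllib.robotparser

ALLOW_ALL = """\
User-agent: *
Disallow:
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(ALLOW_ALL.splitlines())

# An empty Disallow value blocks nothing, so every path is crawlable.
print(rp.can_fetch("Googlebot", "https://www.example.com/any/page.html"))  # True
```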
Re: Robots.txt
Digital Media
Digital Marketing
Search Engine Strategies
17 Years Ago
by Dani
... or you can just not have a robots.txt file to achieve the same effect.
Re: Robots.txt
Digital Media
Digital Marketing
Search Engine Strategies
17 Years Ago
by BamaStangGuy
A robots.txt file is for blocking content from search engines. This is helpful if you do not want search engines to "waste" their time attempting to index things that: [list]Won't give you any traffic from SERPs[/list] [list]Are only accessible by members[/list]
Robots.txt
Hardware and Software
Networking
11 Years Ago
by tomibaw
How is **Robots.txt** to be placed on the web site?
Re: Robots.txt
Hardware and Software
Networking
11 Years Ago
by DarkMonarch
mine is here: ../var/html/www/robots.txt
robots.txt and sitemap.xml
Digital Media
Digital Marketing
Search Engine Strategies
10 Years Ago
by cmps
… like to know what happens if I wrote in the robots.txt: User-agent: * Disallow: / Sitemap: http://www.example.com/sitemap.xml…... In this case, as I understand what robots.txt and sitemap.xml do; First, robots.txt disallow Search Engine (Let's talk about Google… and check it or it should also appear on the robots.txt? Thank you
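What the robots.txt quoted in that question actually does can be checked with the standard-library `urllib.robotparser` (a sketch; `site_maps()` requires Python 3.8+, and the URLs are the example.com placeholders from the question):

```python
# "Disallow: /" blocks every compliant crawler from every path, yet the
# Sitemap line is still parsed — a contradictory combination.
import urllib.robotparser

BLOCK_ALL = """\
User-agent: *
Disallow: /
Sitemap: http://www.example.com/sitemap.xml
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(BLOCK_ALL.splitlines())

print(rp.can_fetch("Googlebot", "http://www.example.com/page.html"))  # False
print(rp.site_maps())  # ['http://www.example.com/sitemap.xml']
```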
Re: robots.txt and 302 redirects
Digital Media
Digital Marketing
Search Engine Strategies
13 Years Ago
by joeyoungblood
blocking the URL in the robots.txt doesn't do much good these days. Google will still … want and rank it for what they want. The noindex meta robots tag is far more useful. You say it worked, but… it. In late June Google posted about using robots.txt vs noindex and stated that robots.txt was no longer their endorsed method. [url]http…
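The noindex meta robots tag that post recommends is a one-line HTML fragment in the page's head; note that the page must remain crawlable (not blocked in robots.txt) for crawlers to see the tag at all:

```html
<!-- Meta robots tag asking search engines not to index this page -->
<meta name="robots" content="noindex">
```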
Robots.txt explanation with Video.
Digital Media
Digital Marketing
Search Engine Strategies
14 Years Ago
by stephen186
Those who are still confused about robots.txt: here is a link from mattcutts. [url]http://www.mattcutts.com/blog/robots-txt-remove-url/[/url]
Re: Robots.txt explanation with Video.
Digital Media
Digital Marketing
Search Engine Strategies
14 Years Ago
by mahgobb
[QUOTE=stephen186;1010704]Those who are still confused about robots.txt: here is a link from mattcutts. [url]http://www.mattcutts.com/blog/robots-txt-remove-url/[/url][/QUOTE] Thanks for sharing this link; indeed, it cleared up some doubts!
Robots.txt Checker
Programming
Web Development
9 Years Ago
by hacxx
If a site has the file "robots.txt", it lists all the directories and pages on the website that they wish to keep hidden from the search engine spiders. This may lead to hidden pages. With Robots.txt Checker you can access the site directly or check if it is indexed in Google.
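The "checker" idea described above can be sketched in a few lines of Python: list the Disallow paths a robots.txt declares. This version parses a string; fetching the live file (e.g. with `urllib.request`) is left out, and the sample rules are made up.

```python
# Extract the Disallow paths declared in a robots.txt document.
def disallowed_paths(robots_txt: str) -> list[str]:
    paths = []
    for line in robots_txt.splitlines():
        line = line.split("#", 1)[0].strip()      # drop comments
        if line.lower().startswith("disallow:"):
            path = line.split(":", 1)[1].strip()
            if path:                              # an empty Disallow blocks nothing
                paths.append(path)
    return paths

sample = """\
User-agent: *
Disallow: /admin/
Disallow: /private/
Disallow:
"""
print(disallowed_paths(sample))  # ['/admin/', '/private/']
```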
© 2024 DaniWeb® LLC