It depends on the type of website that is trying to gain traffic. For the most part, though, content sites are pretty frustrated that AI bots scrape all of their content and then use it to directly answer a searcher's question, without the searcher ever having to visit the website. This is especially tough on content-based websites with high-quality, well-researched content that generate revenue through advertising. Money and effort were put into creating that content, and it's being served up to searchers for free.
However, many of these content-based websites, DaniWeb included, are in a Catch-22: we still want to rank in search engines, especially those of us whose business models rely on almost all of our traffic being driven from search. To appear in Google's search results, we're required to let Googlebot scrape all of our content, which Google then also uses for its AI engine.