Some of my users are getting the following error message:

500 Internal Server Error - The size of the response header is too large. Contact your ISA server administrator. (12216) Internet Security and Acceleration Server

Could this be from a cookie that is too big?


Apache error? The error you listed appears to be generated by a Microsoft Windows ISA server.

Yes, it seems to be a browser-generated error message. Our web server is Apache-based.

My confusion was that you posted this in the Linux forum regarding an error with ISA. So is the problem on your end because you run Windows ISA servers (firewall/reverse proxy), or is it with users who connect to you from within an organization whose ISA servers provide proxy services for them?

The problem is on the user's end. Sorry for the confusion.

I dropped support for ISA back with v2004. I'd have to make a few calls to some colleagues. I will report back if I find some useful info.

The end users are sitting behind an older ISA running as an HTTP proxy. They get this error as a result of a setting in the ISA server.

They can fix their issue by modifying a couple of registry values on the ISA server.

Look up this key:
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\W3Proxy\Parameters]

Change the entry for MaxRequestHeadersSize to be dword:00032000
Change the entry for MaxResponseHeadersSize to be dword:00032000 as well.

These settings can be increased even more if needed.

Once they do that and restart ISA services, they should be ok.
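Assuming the key and values above, a registry export applying those settings would look like this (the dword values are hex, so 00032000 is 0x32000, about 200 KB):

```
Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\W3Proxy\Parameters]
"MaxRequestHeadersSize"=dword:00032000
"MaxResponseHeadersSize"=dword:00032000
```

Importing a .reg file like this (or using `reg add` with `/t REG_DWORD`) and then restarting the ISA services should apply the change.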

Or they could just upgrade to a current proxy ;)

Thanks! :)

Unfortunately, the problem is that we are not in a constrained environment where we can expect our visitors to make any changes in order to view our site. Is there anything we can do on our end to fix this issue? What is causing it?

So, I got some initial feedback back as well. The registry settings that CimmerianX provided are documented on a few forums out there. Unfortunately, with regard to the cause of the problem, I wasn't able to get much information back. In a scenario such as this, I would typically open a premier support case with Microsoft so that a capture could be analyzed by a support engineer to determine the cause of the issue.

This issue could be specific to the version of ISA the visitor is passing through, and may have been fixed in newer versions of ISA.

Unfortunately, there isn't a clear documented explanation on the Internet for the cause of this issue; or at least, I couldn't find one, and my ISA contacts have not encountered it as of yet.

If they can't increase the size of the headers they can receive, and if it is important enough, can you reduce the size of the headers you send? Is there an acceptable size they can be reduced to?

As you can see, I'm a little out of my depth here. :-)

Let me rephrase my question: Would sending smaller cookies resolve the problem?

The typical maximum size for a cookie is 4096 bytes; are DaniWeb's cookies much bigger than this?

Going off some data from MS, and cross-checking with other browsers, 4096 bytes is a safe figure. It is also not recommended to send more than 50 cookies per domain (Opera doesn't like more than 30).

Please double-check these figures. Although I've had a good rummage around, it's not always easy to identify how current these statistics are.

This site, Cookie Limits Tests, has some interesting info and stats, as well as a cookie tester.

Sorry if this is a little off-topic, but this site, BrowserShots, will visit your website using a vast array of Browsers and take screenshots for you. It may be just a little helpful in your testing.

The largest cookie we have is about 2KB. We have about 5 cookies in total.
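As a rough sanity check of those numbers, here is a small sketch that totals the bytes each cookie would occupy on the wire and compares them against the ~4096-byte per-cookie limit discussed above. The cookie names and values are hypothetical placeholders, not DaniWeb's actual cookies:

```python
# Rough sanity check: measure each cookie's name=value size in bytes and
# compare against the commonly cited ~4096-byte per-cookie browser limit.
# The cookies below are hypothetical placeholders for illustration only.
cookies = {
    "csrf_cookie": "d79d17617ac5f0432075c0a336080fe6",
    "session": "x" * 2000,  # ~2 KB, like the largest cookie mentioned above
}

PER_COOKIE_LIMIT = 4096  # bytes; a safe figure across major browsers

for name, value in cookies.items():
    size = len(f"{name}={value}".encode("utf-8"))
    print(f"{name}: {size} bytes, within limit: {size <= PER_COOKIE_LIMIT}")

total = sum(len(f"{n}={v}".encode("utf-8")) for n, v in cookies.items())
print(f"total cookie payload: {total} bytes")
```

With five cookies of ~2 KB or less, the total stays well under what even a conservative proxy should tolerate, which is another hint that the cookies themselves may not be the culprit.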

This wiki article is an interesting read, and it specifically discusses header fields and responses. I know you won't need to read most of it because you work with this every day. But I just wondered if the problem could possibly be related to any of the details sent as "Common Non-standard Response Headers".

Finally, is any data, of whatever description, requesting to be cached on their server (or vice versa) and being rejected? A response header there is no place for is, in effect, a response header that is too long. Caching is discussed in the same article, under "Avoiding Caching".

Based on what is reported online, this issue can be resolved by tuning the configuration of the ISA server. I would not suggest modifying anything on the DaniWeb side unless you are getting reports of an issue being experienced by users in various scenarios. If the only users who have reported this are behind an ISA server, I think it would be appropriate to figure out specifically what it is that this ISA server does not like. For all you know, the users behind this ISA server are experiencing the same issue when visiting other sites.

I agree with JorgeM. Don't fix their problem by changing your code, which works for 99.9% of the visitors out there. The issue is documented to be a misconfigured ISA. I would politely point that fix out to them as being "in their best interest to implement".

> I would politely point that fix out to them as being "in their best interest to implement".

But the problem is that we are not in a closed environment. Someone comes in from a Google search. The page doesn't load. They leave, never to be heard from again. Thankfully, I was able to hear about one such documented case of this happening, where the person was able to troubleshoot for me, but for every one time I find out about it, it happens 100,000 other times that I don't know about.

Understood that as a web admin, you want to make the site as compatible for as many people as possible. But you will always have issues with end users who don't configure their systems correctly.... but I suppose that's obvious.

The issue here is that the HTTP header is refused by ISA. Nothing in the HTML is causing this. So you would need to look at the fields you have defined and try to reduce the overall data being sent.

Start here, looking at the response fields:
https://en.wikipedia.org/wiki/List_of_HTTP_header_fields

Then use that knowledge to look at your Apache configs and try to trim out any unnecessary data.
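For instance, a minimal sketch of trimming optional response headers in an Apache config might look like this (assuming mod_headers is enabled; verify each header is actually unused before removing it):

```apache
# Requires mod_headers (a2enmod headers on Debian/Ubuntu).
# Drop the PHP version advertisement header
Header unset X-Powered-By

# Reduce Server header detail (set in the main server config)
ServerTokens Prod
ServerSignature Off
```

These particular directives only shave off a few dozen bytes, but the same `Header unset` pattern applies to any nonessential header you identify.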

We looked at that link here. Using your knowledge and experience, what would you say could be the likely culprit in that list if DaniWeb chose to make changes in its site handling to accommodate older configurations?

I captured the HTTP as I just browsed around the site a bit. Most of what we receive is pretty standard:

HTTP/1.1 200 OK
Date: Sat, 19 Jan 2013 19:37:48 GMT
Server: Apache/2.2
X-Powered-By: PHP/5.3.10
Set-Cookie: csrf_cookie=d79d17617ac5f0432075c0a336080fe6; expires=Sat, 19-Jan-2013 21:37:48 GMT; path=/; domain=www.daniweb.com
Vary: Accept-Encoding
Content-Encoding: gzip
Content-Length: 5786
Keep-Alive: timeout=10, max=400
Connection: Keep-Alive
Content-Type: text/html; charset=UTF-8
--followed by the compressed items--

That's just a quick sample. Nothing I saw is screaming "I'm too large" at me. The largest header size was 32,215 bytes, when requesting GET /js/FusionCharts/Charts/FusionCharts.js HTTP/1.1.
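To put a number on what the proxy actually has to buffer, you can measure everything up to and including the blank line (CRLF CRLF) that terminates the header block. A sketch, using a shortened version of the sample capture above (the body is a placeholder):

```python
# Measure the size of an HTTP response header block: all bytes up to and
# including the blank line (\r\n\r\n) separating headers from the body.
raw_response = (
    b"HTTP/1.1 200 OK\r\n"
    b"Date: Sat, 19 Jan 2013 19:37:48 GMT\r\n"
    b"Server: Apache/2.2\r\n"
    b"X-Powered-By: PHP/5.3.10\r\n"
    b"Vary: Accept-Encoding\r\n"
    b"Content-Encoding: gzip\r\n"
    b"Content-Length: 5786\r\n"
    b"Content-Type: text/html; charset=UTF-8\r\n"
    b"\r\n"
    b"...compressed body placeholder..."
)

# Header block = everything through the terminating blank line.
header_size = raw_response.index(b"\r\n\r\n") + 4
print(f"header block: {header_size} bytes")
```

Running this kind of measurement against captures from the failing clients would show whether any response actually approaches the ISA limit (0x32000, about 200 KB, after the registry fix above).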

The real way to look at this would be to try to replicate the failing environment. But we would need to know the version of ISA, its patch level, and ideally have a packet capture on that side at both the client and the server.
