
Hello,

I admit I’m not sure which forum to post my question to.

Anyway, here’s my "problem" - not so much a problem (my website is served all right) as something that bugs me.

My website (www.rosebrossut.fr) was created in XHTML with the character encoding us-ascii.

In the meta tags both the MIME type and the character encoding are declared accordingly (application/xhtml+xml and us-ascii), but when I look in Firefox they are identified as text/html and windows-1252, respectively. Trying to validate the code with Validator.nu yields the same result, and so the XML markup together with the subsequent us-ascii declaration produces a fatal error message.

I tried two ways of forcing the correct MIME declaration (not that either of them convinced me), but without any success; both are sketched below.

1st idea: adding header('Content-type: application/xhtml+xml'); as first line to my PHP code.
2nd idea: adding AddType application/xhtml+xml .php to my .htaccess file.
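
For the record, here are both attempts sketched out (the header() call goes at the top of the PHP file, the AddType line in the .htaccess; the comments are mine):

    <?php
    // 1st idea: send the MIME type from PHP itself,
    // before any other output is produced.
    header('Content-Type: application/xhtml+xml');
    ?>

    # 2nd idea: have Apache label .php files with the XHTML MIME type.
    AddType application/xhtml+xml .php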

Can anyone explain this behaviour and (even better) give me an idea of how to solve this issue, please!?

Thank you very much!



us-ascii is effectively a subset of windows-1252 (the two agree on all ASCII bytes), so the tools you've tried may just be substituting someone's preferred choice for labelling the charset identifier. The document MIME type doesn't (directly) affect the character encoding of HTML/XHTML documents, so neither of the two ideas you posted would have any effect.
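
To make that concrete: for pages served over HTTP, the charset parameter in the Content-Type header takes precedence over any <meta> declaration. So if your server (or a PHP configuration default) sends something like the following illustrative response header, windows-1252 is what Firefox and Validator.nu will report, whatever the markup says:

    HTTP/1.1 200 OK
    Content-Type: text/html; charset=windows-1252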

When I ran your page through the W3C validator it complained because you had non-ASCII characters in the comments at the end of the document. So my advice would be to try changing the <meta> 'http-equiv' to 'charset=utf-8', which I think may be your best choice in the long run.
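
Something along these lines, sketched for both places where XHTML can declare its encoding (adapt it to your own markup; the content value is only illustrative):

    <?xml version="1.0" encoding="utf-8"?>
    ...
    <meta http-equiv="Content-Type" content="application/xhtml+xml; charset=utf-8" />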


Thanks for responding.

Largely identical is not exactly identical. Validator.nu breaks off the validation with "Fatal Error: Changing encoding at this point would need non-streamable behavior."

Changing the character encoding would be a way to get around the problem, but it does not solve it. The W3C Validator does not seem to have this problem.

I prefer to keep us-ascii to better control the character display: in UTF I would have to mix literal UTF characters with HTML entities, because I want specific spaces and similar non-visual characters (how else would one input &#8197; in UTF?). With us-ascii a validator tells me whenever I miss a character (e.g. a literal à instead of &agrave;).
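
To illustrate with a made-up snippet: in a us-ascii document every non-ASCII character must be an entity, so a validator flags any literal one that slips in; declared as utf-8, both lines below would pass silently.

    <p>d&eacute;j&agrave;&#8197;vu</p>  <!-- valid in us-ascii: all non-ASCII as entities -->
    <p>déjà vu</p>                      <!-- flagged in us-ascii, accepted in utf-8 -->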

Unrelated, though possibly connected, remains the issue of why the output is text/html and not application/xhtml+xml. (Yes, it will not affect the character encoding, but both issues affect the validation process.)


Okay, one issue solved - or rather: it was not an issue at all.

Adding header('Content-type: application/xhtml+xml'); as the first line of the PHP file does produce the right MIME type! (I use Aptana 3 for some of the work and I had managed to edit the local file instead of the file on the server. Stupidity is a bug that DaniWeb will have difficulty solving, right?)

Remains the character encoding…
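
If the same mechanism extends to the encoding, the header() call can carry the charset as well; a sketch, assuming it runs before any output:

    <?php
    // One header declares both the MIME type and the character encoding;
    // header() must run before anything is echoed, or it fails with a warning.
    header('Content-Type: application/xhtml+xml; charset=us-ascii');
    ?>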
