META CONTENT-TYPE seems to be ignored
When a web site specifies <meta http-equiv="content-type" content="text/html; charset=Windows-1254"> in the page header and I load that page, I see that View -> Text Encoding is set to Unicode, and the page appears broken. Shouldn't old web sites like this "appear" normally if they specified the charset in the header? If I manually select "Automatic" in the View -> Text Encoding menu, then that page appears fine, but as soon as I click something else on the same site, the story repeats: back to Unicode and broken characters, even though every page of the web site specifies Windows-1254 as the encoding.
All Replies (2)
I have an old page that specifies
<meta http-equiv="content-type" content="text/html; charset=iso-8859-1">
and Firefox shows View > Text Encoding > Western as expected. However, Western is also my default for U.S. English, so that doesn't prove that Firefox followed the tag.
Can you provide a link to a page that demonstrates the problem? Posting a clickable link can delay the appearance of your reply while the post sits in the link spam moderation queue. You can break the link before the .com (or other top level domain) so the link isn't clickable.
Note that it is possible that the server declares the files as Unicode (UTF-8) in its HTTP Content-Type header, as is common these days. In that case the server's header likely prevails and the meta tag can be ignored.
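If you want to see what the server actually sends, one quick way is to request only the response headers. Here is a minimal sketch in Python; the URL is a placeholder, so substitute the page you are testing (and note that a few servers don't answer HEAD requests):

import urllib.request

# Placeholder URL; replace with the page you are testing.
url = "http://example.com/page.html"

# HEAD request: fetch only the response headers, not the page body.
req = urllib.request.Request(url, method="HEAD")
with urllib.request.urlopen(req) as resp:
    # If this header includes a charset, Firefox will generally use it
    # in preference to any <meta http-equiv="content-type"> tag in the page.
    print(resp.headers.get("Content-Type"))

If the output is something like "text/html; charset=utf-8", the server header is what the browser is honoring, not the Windows-1254 declared in the meta tag.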