@Kian: What bothers me is mainly the usage of the paging file, not the CPU. FrontPage 2003 is nothing like its predecessors, and I like it better because it's simpler/lighter and it supports Hebrew well, even if it has fewer features.
quote:
Originally posted by Choli
what is that Hebrew web char. encoding? any special way of encoding??
I don't know what you know and don't know about this business, but I'll explain it anyway...
The actual character encoding I'm talking about is plain 1-byte (extended) ASCII: the upper half of the table (bytes 128-255) is reserved for international symbols, and each character set assigns different letters to that same range. If I type some Hebrew words in Notepad, save them as ASCII and send the file to you, you would see Latin characters instead, because your system maps those same byte values to a different character set (no matter what OS we have, the file contains the same ASCII bytes). That's why this method is problematic, and why Unicode replaced it.
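To make this concrete, here's a small sketch (assuming PHP with the iconv extension, which isn't part of what I described above) showing how the very same byte turns into different letters depending on which character set you decode it with:

<?php
// One raw byte on disk: 0xE0.
// Under windows-1255 (Hebrew) it is the letter alef; under windows-1252 it is "à".
$byte = "\xE0";

echo iconv('windows-1255', 'UTF-8', $byte) . "\n"; // prints: א
echo iconv('windows-1252', 'UTF-8', $byte) . "\n"; // prints: à
?>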
Unicode shows exactly the same thing everywhere; it really is a universal code. So why not use Unicode on the Web? I suppose it might not work on pre-Unicode systems (I've never tried it). But it's also a waste of space and bandwidth, as Unicode characters take two bytes each (so they can cover all of the letters in the world).
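Just to illustrate the size difference, here's a rough PHP sketch (again using iconv, purely for demonstration):

<?php
// "Shalom" - four Hebrew letters, stored in this source file as UTF-8.
$text = "שלום";

echo strlen(iconv('UTF-8', 'windows-1255', $text)) . "\n"; // 4 bytes: one per letter
echo strlen(iconv('UTF-8', 'UTF-16LE', $text)) . "\n";     // 8 bytes: two per letter
?>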
So what's the solution? Using good old ASCII, and specifying a character set so the Web browser knows what letters to display. So I just save the PHP file as plain ASCII, and then Patchou could add the following line inside the <head> section of the Web pages which display the translation:
<META HTTP-EQUIV="Content-Type" CONTENT="text/html; charset=windows-1255">
Despite the name HTTP-EQUIV, the Web server doesn't actually do anything with this tag; the Web browser reads it as if the same information had arrived in the HTTP header, and then knows which character set to use to display the page. This is the method used by most Hebrew Web sites, and all the main browsers support it.
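As a side note, since the pages are PHP anyway, the same information could also be sent in the real HTTP header instead of (or in addition to) the META tag. Just a possible alternative, not something Patchou has to do:

<?php
// Must be called before any HTML output is sent.
header('Content-Type: text/html; charset=windows-1255');
?>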