'W' stands for the Wide-character (Unicode) version, 'A' for the ANSI version. The difference is in how they handle strings (note that this can be an internal thing, like with the GetWindowLong API; it doesn't necessarily mean the API needs string parameters).
To know the exact difference, read up on the Unicode vs. ANSI subject and how characters are used and encoded in both. Way too much to properly explain here, but some important articles to get started:
Unicode
Unicode in the Windows API
Unicode and ANSI Functions in Windows
Unicode in Win32s and Windows 95
Also, if you look up APIs in the MSDN Library, you will always see whether a Wide-character version and an ANSI version are available (in the Requirements section). If they are available, you should always use the Unicode version unless you have a specific reason not to (e.g. you're programming for a 16-bit system, or for legacy purposes, etc.).
This is one more reason why you should always look APIs up in the MSDN Library when using them, even if they seem trivial and easy to use.
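For example, here is a minimal sketch of explicitly calling a Wide version from script (MessageBoxW is only an illustration; JScript strings are UTF-16, which is exactly what the W functions expect):
Javascript code:
// Call the Wide version explicitly; the A version would misread
// the UTF-16 string data as one-byte ANSI characters.
Interop.Call("user32", "MessageBoxW", 0, "Hello world", "Example", 0);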
---------------
Important: SetWindowLong isn't correct either, it should also be SetWindowLongW for the same reasons.
---------------
Even more importantly:
Javascript code:
var Get = Interop.Call("user32", "GetWindowLong", PlusWnd.Handle, -20); // -20 for extended window styles
Get += (PlusWnd.Button_IsChecked("ChkTop") ? 262016 : -262016); // toggle APPWINDOW / TOOLWINDOW
Interop.Call("user32", "SetWindowLong", PlusWnd.Handle, -20, Get); // set new flag
You should never ever do the above to toggle a flag setting! It will never work properly.
You should use boolean (bitwise) arithmetic with the bitmask of the flags you wish to set/remove, not normal arithmetic! Otherwise, if the style flags already include the specific flag you wish to set, you will calculate the wrong value when you add the bit's value using normal arithmetic. The same goes for removing a flag by simply subtracting the bit value instead of doing a boolean operation.
Say the initial value of the styles is 15 (binary 1111), and you wish to set the 3rd bit (which is already set, but you don't know that).
If you do A = A + 4 the result will actually be 19 (binary 10011), not 15 (binary 1111).
So, instead you should do A = A OR 4 (4 in binary is 0100)
...which will result in 15 (binary 1111) if the initial value was already 15 (binary 1111).
...which will result in 6 (binary 0110) if the initial value was 2 (binary 0010).
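The same check in Javascript, where | is the boolean OR operator (plain numbers, nothing assumed beyond the examples above):
Javascript code:
var A = 15;   // binary 1111
A = A | 4;    // still 15: the 3rd bit (0100) was already set
var B = 2;    // binary 0010
B = B | 4;    // now 6 (binary 0110): the 3rd bit is added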
Vice versa, if you wish to remove the 3rd bit and your initial value already has the 3rd bit cleared, for example value 10 (binary 1010):
If you do A = A - 4 the result will actually be 6 (binary 0110), not the expected 10 (binary 1010).
Instead you should do A = A AND 11 (11 is the mask where all bits are set except the ones you want to remove; in binary it is 1011, thus the inverse of 4, see above)
...which will result in 10 (binary 1010) if the initial value was already 10 (binary 1010).
...which will result in 9 (binary 1001) if the initial value was 13 (binary 1101).
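Again in Javascript, where & is the boolean AND operator:
Javascript code:
var A = 10;   // binary 1010
A = A & 11;   // still 10: the 3rd bit was already clear (mask 1011)
var B = 13;   // binary 1101
B = B & 11;   // now 9 (binary 1001): the 3rd bit is removed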
Thus if you want to set a specific flag:
Result = Original OR flagvalue
If you want to remove a specific flag:
Result = Original AND (NOT flagvalue)
So, instead you should use an AND or OR operation. E.g.:
To set a flag (in this case WS_EX_APPWINDOW) use OR with the flag's value:
Get = Get | 0x40000
To remove a flag (in this case WS_EX_APPWINDOW) use AND with a bitmask of the bits you wish to keep (otherwise known as the inverse of the flag value):
Get = Get & ~0x40000
or
Get = Get & 0xFFFBFFFF
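Putting it all together, a corrected sketch of the quoted snippet (same PlusWnd.Handle and ChkTop names as the original; 0x40000 is WS_EX_APPWINDOW, 0x80 is WS_EX_TOOLWINDOW):
Javascript code:
var Get = Interop.Call("user32", "GetWindowLongW", PlusWnd.Handle, -20); // -20 = GWL_EXSTYLE
if (PlusWnd.Button_IsChecked("ChkTop"))
    Get = (Get | 0x40000) & ~0x80; // set WS_EX_APPWINDOW, remove WS_EX_TOOLWINDOW
else
    Get = (Get & ~0x40000) | 0x80; // remove WS_EX_APPWINDOW, set WS_EX_TOOLWINDOW
Interop.Call("user32", "SetWindowLongW", PlusWnd.Handle, -20, Get); // store the new styles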