ASCII, Unicode, UTF-8 and UTF-16
How many bits are used to represent Unicode, ASCII, UTF-16, and UTF-8 characters?
✍: FYIcenter
Unicode code points were originally defined in 16 bits; the current standard extends to 21 bits (code points up to U+10FFFF). ASCII requires only 7 bits, although ASCII characters are usually stored in 8-bit bytes. UTF-8 represents characters using 8-, 16-, 24-, or 32-bit patterns (1 to 4 bytes). UTF-16 uses either a single 16-bit unit or a pair of 16-bit units (a surrogate pair, 32 bits in total) per character.
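To see these sizes directly, here is a minimal Java sketch (the class name EncodingSizes is ours, just for illustration) that prints how many bytes each sample character occupies under UTF-8 and UTF-16:

import java.nio.charset.StandardCharsets;

public class EncodingSizes {
    public static void main(String[] args) {
        String ascii = "A";                   // U+0041, fits in 7 bits
        String accented = "\u00E9";           // U+00E9 'é', needs 2 bytes in UTF-8
        String emoji = "\uD83D\uDE00";        // U+1F600, outside the 16-bit range

        for (String s : new String[] { ascii, accented, emoji }) {
            // UTF_16BE is used so no byte-order mark is added to the count
            System.out.printf("%s  UTF-8: %d bytes, UTF-16: %d bytes%n",
                s,
                s.getBytes(StandardCharsets.UTF_8).length,
                s.getBytes(StandardCharsets.UTF_16BE).length);
        }
    }
}

Running it should show "A" taking 1 byte in UTF-8 and 2 in UTF-16, "é" taking 2 bytes in both, and the emoji taking 4 bytes in both (UTF-16 encodes it as a surrogate pair).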
2007-03-03