1. How many bits are used to represent Unicode, ASCII, UTF-16, and UTF-8 characters in Java?

Answer»

In Java, a char is 16 bits wide (one UTF-16 code unit), while ASCII requires only 7 bits. Although the ASCII character set uses only 7 bits, it is usually represented as 8 bits (one byte). UTF-8 represents characters using 8-, 16-, 24-, or 32-bit patterns (1 to 4 bytes). UTF-16 uses a single 16-bit unit for most characters and a pair of 16-bit surrogate units (32 bits) for supplementary characters such as emoji.
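You can verify these sizes yourself. Here is a minimal sketch (the class name EncodingDemo and the sample strings are illustrative choices, not part of the original answer) that uses String.getBytes with StandardCharsets to print how many bytes each encoding needs:

    import java.nio.charset.StandardCharsets;

    public class EncodingDemo {
        public static void main(String[] args) {
            // "A"  is ASCII:            1 byte in UTF-8, 2 bytes in UTF-16
            // "€"  (U+20AC) needs       3 bytes in UTF-8, 2 bytes in UTF-16
            // "😀" (U+1F600) is outside the BMP: 4 bytes in UTF-8,
            //      and a surrogate pair (two chars, 4 bytes) in UTF-16
            String[] samples = { "A", "€", "😀" };
            for (String s : samples) {
                System.out.printf("%s  chars=%d  UTF-8 bytes=%d  UTF-16 bytes=%d%n",
                    s,
                    s.length(),                                    // UTF-16 code units
                    s.getBytes(StandardCharsets.UTF_8).length,
                    s.getBytes(StandardCharsets.UTF_16BE).length); // BE variant avoids the BOM
            }
        }
    }

Running this shows "A" taking 1 UTF-8 byte, "€" taking 3, and the emoji taking 4 bytes in both encodings while occupying two Java chars, because String.length() counts 16-bit code units rather than characters.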


