In the previous articles, I explained Data Types and Literals in Java.
In this article, I am going to explain Type Casting in Java.
Java for Testers – What is Type Casting?
As explained in the previous articles, there are different data types and also different literals in Java.
When the Data Type of the Variable and Literal Type of the Value to be assigned to the variable match
If the data type of the variable and the literal type of the value match, there is no need to convert the literal value before assigning it to the variable.
Example:
int a = 123;
In the above example, int is the data type of the variable ‘a’ and 123 is an integer literal value. As the data type of the variable ‘a’ and the literal type of the value 123 match, Java will directly assign the literal value 123 to the variable ‘a’ of int data type.
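To see this end to end, below is a minimal runnable sketch (the class name TypeMatchDemo is just an illustrative choice):

// Minimal sketch: the variable's data type and the literal type match
public class TypeMatchDemo {
    public static void main(String[] args) {
        int a = 123; // int literal assigned directly to an int variable
        System.out.println(a); // prints 123
    }
}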
When the Data Type of the Variable and the Literal Type of the value don’t match
But there may be some situations where the data type of the variable is different from the literal type of the value to be assigned to the variable.
In this case, one of the two things below may happen:
- Java will automatically convert the literal value to the variable’s data type
- Java will give a compiler error, as the conversion is not possible between the two different types (i.e. the variable’s data type and the value’s literal type)
I would like to give examples for both of the above cases.
Java automatically converts the literal value to the variable’s data type
For example, when the literal value assigned to the variable is of character type, say ‘p’, and the data type of the variable is int, as shown below:
int a = 'p';
In the above example, the variable ‘a’ is of int data type and the literal value assigned to the variable is of character type, i.e. ‘p’.
In this case, Java will automatically convert the character literal value ‘p’ to the corresponding ASCII integer value 112 (refer to an ASCII table) and then assign it to the variable ‘a’ of int data type.
Below is a minimal runnable sketch of this behaviour (the class name CharToIntDemo is just an illustrative choice):
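// Minimal sketch: Java implicitly widens a char literal to int
public class CharToIntDemo {
    public static void main(String[] args) {
        int a = 'p'; // 'p' is converted to its ASCII integer value 112
        System.out.println(a); // prints 112
    }
}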
Java will give a compiler error as the conversion is not possible between the literal value type and the variable’s data type
For example, when the literal value assigned to the variable is of decimal type, say 123.456, and the data type of the variable is int, as shown below:
int a = 123.456;
In the above example, the variable ‘a’ is of int data type and the literal value assigned to the variable is of decimal type, i.e. 123.456.
In this case, Java will not automatically convert the value and will give a compiler error, as illustrated by the minimal sketch below (the exact error message may vary slightly across Java compiler versions):
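// Minimal sketch: assigning a double literal to an int variable does not compile
public class NarrowingErrorDemo {
    public static void main(String[] args) {
        int a = 123.456; // compiler error: incompatible types:
                         // possible lossy conversion from double to int
        System.out.println(a);
    }
}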
Explicitly Type Cast when Java is not automatically Type Casting the literal value type to the variable’s data type
This may not work for every combination of literal type and variable data type, but there are a few cases where we can explicitly type cast.
For example, we can explicitly type cast the decimal literal value to int type before assigning it to the variable of int data type, as shown below:
int a = (int) 123.456;
In the above example, the decimal value 123.456 is explicitly type cast to int using (int), and the converted value 123 is assigned to the variable ‘a’ of int type.
Below is a minimal runnable sketch of this (the class name ExplicitCastDemo is just an illustrative choice):
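// Minimal sketch: explicit (int) cast truncates the decimal part
public class ExplicitCastDemo {
    public static void main(String[] args) {
        int a = (int) 123.456; // fraction is truncated towards zero
        System.out.println(a); // prints 123
    }
}

Note that the (int) cast truncates towards zero rather than rounding, so 123.999 would also become 123.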
This concludes the article on Type Casting in Java.
In the next article, I will explain more about the Type Casting in Java using different examples.
Next Steps:
- To learn more about Java, continue to the next post.
- Check the complete Java Tutorial contents here.
Please leave your questions/comments/feedback below:
Happy Learning!
Arun Motoori (www.QAFox.com)
On a mission to help the Testing Community in all possible ways.