Load hive table into spark using Scala

Requirement

Assume you have a Hive table named reports and need to process this dataset in Spark. Once the Hive table's data is in a Spark data frame, we can transform it further as the business needs require. So let's try to load the Hive table into a Spark data frame.


Solution

Please follow the steps below:

Step 1: Sample table in Hive

Let's create the table "reports" in Hive. I am using the bdp schema, in which I am creating the table.
Enter the Hive CLI and use the commands below to create the table.

create schema bdp;
create table bdp.reports(id int,days int,year int);
INSERT INTO TABLE bdp.reports VALUES (121,232,2015),(122,245,2015),(123,134,2014),(126,67,2016),(182,122,2016),(137,92,2015),(101,311,2015);

Please refer to the screenshot below.

Step 2: Check table data

Please enter the command below to see the records which you inserted.

select * from bdp.reports;
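Given the rows inserted in Step 1, the query should return these seven rows (Hive prints the columns tab-separated):

```
121	232	2015
122	245	2015
123	134	2014
126	67	2016
182	122	2016
137	92	2015
101	311	2015
```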

Please refer to the screenshot below for reference.

Step 3: Data Frame Creation

Start the Scala shell for Spark using the command below:

spark-shell

Please check whether a SQL context with Hive support is available.
In the screenshot below, you can see at the bottom: "Created SQL context (with Hive support).
SQL context available as sqlContext." This means you can use the sqlContext object to interact with Hive.
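If your shell does not report Hive support, a Hive-enabled context can be built by hand. A minimal sketch for the Spark 1.x API, assuming `sc` is the SparkContext that spark-shell already provides:

```scala
// Build a Hive-enabled SQL context explicitly (Spark 1.x API).
// `sc` is the SparkContext created automatically by spark-shell.
import org.apache.spark.sql.hive.HiveContext

val sqlContext = new HiveContext(sc)
```

Note that this requires Spark to have been built with Hive support and a reachable Hive metastore.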

Now create a data frame named hiveReports using the command below.

val hiveReports = sqlContext.sql("select * from bdp.reports")

You have to pass your Hive query to it. Whatever data the query returns will be available in the data frame.

Step 4: Output

Check whether the reports dataset has been loaded into the data frame hiveReports using the commands below.

To check the schema:
hiveReports.printSchema()
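For the reports table defined in Step 1, the printed schema should look roughly like this (Hive int columns map to Spark's integer type, nullable by default):

```
root
 |-- id: integer (nullable = true)
 |-- days: integer (nullable = true)
 |-- year: integer (nullable = true)
```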

To see the data:
hiveReports.show()

It will show the same output which we got in Step 2.
Please refer to the screenshot below.

You can use this data frame further to join with other datasets, filter it, or perform transformations as per your needs.
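As a sketch of such follow-up transformations (Spark 1.x DataFrame API, continuing in the same spark-shell session; column names match the reports table above):

```scala
// Keep only the 2015 reports.
val reports2015 = hiveReports.filter(hiveReports("year") === 2015)
reports2015.show()

// Total days per year.
import org.apache.spark.sql.functions.sum
hiveReports.groupBy("year").agg(sum("days")).show()

// Register the data frame as a temp table to keep querying it with SQL.
hiveReports.registerTempTable("reports_df")
sqlContext.sql("select year, count(*) from reports_df group by year").show()
```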

Keep learning.
