Requirement
You have two tables named A and B, and you want to perform all types of joins in Spark using Python. This will help you understand how joins work in PySpark.
Solution
Step 1: Input Files
Download files A and B from here, and place them in a local directory.
Files A and B are comma-delimited; please refer below:
I am placing these files into a local directory named ‘sample_files’.
cd sample_files
ls -R *
Step 2: Loading the files into Hive.
To load the files into Hive, let’s first put them into an HDFS location using the commands below.
hadoop fs -mkdir -p bdps/sample_files
hadoop fs -mkdir bdps/sample_files/A
hadoop fs -mkdir bdps/sample_files/B
hadoop fs -put A/A.txt bdps/sample_files/A/
hadoop fs -put B/B.txt bdps/sample_files/B/
You can check the files in HDFS using the command below.
hadoop fs -ls -R hdfs://sandbox-hdp.hortonworks.com:8020/user/root/bdps/sample_files/
Now let’s create two Hive tables, A and B, one for each file, using the commands below:
CREATE SCHEMA IF NOT EXISTS bdp;

CREATE EXTERNAL TABLE IF NOT EXISTS bdp.A (id INT, type STRING)
ROW FORMAT DELIMITED
FIELDS TERMINATED BY ','
STORED AS TEXTFILE
LOCATION 'hdfs://sandbox-hdp.hortonworks.com:8020/user/root/bdps/sample_files/A';

CREATE EXTERNAL TABLE IF NOT EXISTS bdp.B (id INT, type STRING)
ROW FORMAT DELIMITED
FIELDS TERMINATED BY ','
STORED AS TEXTFILE
LOCATION 'hdfs://sandbox-hdp.hortonworks.com:8020/user/root/bdps/sample_files/B';
Let’s check whether the data populated correctly using the commands below:
select * from bdp.A;
select * from bdp.B;
Step 3: Loading Tables in PySpark
Now enter PySpark using the command below:
pyspark
Note: I am using Spark version 2.3.
Use the commands below to load the Hive tables into DataFrames:
A = spark.table("bdp.A")
B = spark.table("bdp.B")
and check the data using the commands below:
A.show()
B.show()
Let’s understand each join one by one.
A. Inner Join:
Sometimes only the records common to two datasets are required. We have two tables, A and B, and we join them on the key column, which is id.
So the output contains only the records whose id matches in the other dataset; the rest are discarded.
Use the command below to perform the inner join.
inner_df = A.join(B, A.id == B.id)
Expected output:
Use the command below to see the output set.
inner_df.show()
Please refer to the screenshot below for reference.
As you can see, only the records with matching ids (1, 3, and 4) are present in the output; the rest have been discarded.
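The inner-join semantics above can be sketched in plain Python. The sample rows here are hypothetical (the real data comes from the downloaded files); the point is only to show which rows survive the join.

```python
# Hypothetical sample rows shaped like the (id, type) columns of tables A and B.
A_rows = [(1, "a"), (2, "b"), (3, "c"), (4, "d")]
B_rows = [(1, "x"), (3, "y"), (4, "z"), (5, "w")]

# Inner join on id: keep only pairs whose id exists in both datasets.
inner = [(a_id, a_type, b_type)
         for (a_id, a_type) in A_rows
         for (b_id, b_type) in B_rows
         if a_id == b_id]

print(inner)  # ids 1, 3 and 4 survive; 2 and 5 are discarded
```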
B. Left Join
This type of join is performed when we want to look something up in another dataset; the best example would be fetching an employee’s phone number from another dataset based on the employee code.
Use the command below to perform the left join.
left_df = A.join(B, A.id == B.id, "left")
Expected output
Use the command below to see the output set.
left_df.show()
Now we have all the records of the left table A and the matched records of table B; A’s unmatched rows carry nulls in B’s columns.
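The left-join behavior can be sketched the same way in plain Python (same hypothetical sample rows as before): every row of the left dataset survives, and rows with no match get None, the way Spark fills in null.

```python
# Hypothetical sample rows shaped like the (id, type) columns of tables A and B.
A_rows = [(1, "a"), (2, "b"), (3, "c"), (4, "d")]
B_rows = [(1, "x"), (3, "y"), (4, "z"), (5, "w")]

# Index the right side by id for the lookup.
b_index = {b_id: b_type for b_id, b_type in B_rows}

# Left join: keep every row of A; unmatched ids get None for B's column.
left = [(a_id, a_type, b_index.get(a_id)) for a_id, a_type in A_rows]

print(left)  # id 2 has no match in B, so its B column is None
```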
C. Right Join
This type of join is performed when we want all the data of the look-up (right) table with only the matching records of the left table.
Use the command below to perform the right join.
right_df = A.join(B, A.id == B.id, "right")
Expected output
Use the command below to see the output set.
right_df.show()
Now we have all the records of the right table B and the matched records of table A; B’s unmatched rows carry nulls in A’s columns.
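The right join is simply the mirror image: every row of the right dataset survives. A plain-Python sketch with the same hypothetical sample rows:

```python
# Hypothetical sample rows shaped like the (id, type) columns of tables A and B.
A_rows = [(1, "a"), (2, "b"), (3, "c"), (4, "d")]
B_rows = [(1, "x"), (3, "y"), (4, "z"), (5, "w")]

# Index the left side by id for the lookup.
a_index = {a_id: a_type for a_id, a_type in A_rows}

# Right join: keep every row of B; unmatched ids get None for A's column.
right = [(b_id, a_index.get(b_id), b_type) for b_id, b_type in B_rows]

print(right)  # id 5 has no match in A, so its A column is None
```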
D. Full Join
When we need all the matched and unmatched records out of two datasets, we can use a full join. All data from both the left and the right dataset will appear in the result set; non-matching records will have null values in the respective columns.
Use the command below to perform the full join.
full_df = A.join(B, A.id == B.id, "full")
Expected output
Use the command below to see the output set.
full_df.show()
Now we have all the matched and unmatched records in the output, as shown below.
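The full join is the union of both behaviors: every id from either side appears once, with None wherever a side has no match. A plain-Python sketch with the same hypothetical sample rows:

```python
# Hypothetical sample rows shaped like the (id, type) columns of tables A and B.
A_rows = [(1, "a"), (2, "b"), (3, "c"), (4, "d")]
B_rows = [(1, "x"), (3, "y"), (4, "z"), (5, "w")]

a_index = {a_id: a_type for a_id, a_type in A_rows}
b_index = {b_id: b_type for b_id, b_type in B_rows}

# Full join: every id from either dataset appears; missing sides become None.
all_ids = sorted(set(a_index) | set(b_index))
full = [(i, a_index.get(i), b_index.get(i)) for i in all_ids]

print(full)  # ids 2 and 5 appear with None on the side that lacks them
```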
Wrapping Up
Joins are important when you have to deal with data that is spread across more than one table. In real scenarios we receive files from many sources that are related to each other, so to get meaningful information from these datasets we need to join them and work with the combined result.
Don’t forget to subscribe. Keep learning.