How to calculate rank in a dataframe using Python, with example

Requirement :

You have the marks of all the students of a class, and you want to find the rank of each student using Python.

Given :

A pipe separated file which contains the roll number and total marks of each student. Below are the sample values (columns: R_no and marks):

  101|389
  102|412
  103|435

Please download the sample file, marks.txt.

Solution :

Step 1 : Loading the raw data into a Hive table

As you can see, we have our raw data in a pipe separated file. I am keeping the raw file in the class8 directory, which we create first. Use the below commands:

put file into local directory

  cd bdp/projects/
  mkdir class8
  cd class8
  cat marks.txt
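
If the file is in place, cat marks.txt should print the pipe separated rows:

sample file content

  101|389
  102|412
  103|435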

Now I will put this file into HDFS, after creating a stu_marks directory at the HDFS location.

Use the below commands:

put local file into hdfs

  hadoop fs -mkdir -p bdps/stu_marks
  hadoop fs -put marks.txt bdps/stu_marks/
  hadoop fs -ls bdps/stu_marks/

Now we will load this file into a Hive table. Please refer to this post if you have trouble loading a file into a Hive table. You need to give pipe (|) as the delimiter.

I have created one Hive table named "class8_marks" (under the bdp schema, matching the DDL below) on top of this data.

It has two columns, and both of them are of INT type.

Please use the below commands one by one to create a Hive table on top of it.

Create hive table

  CREATE SCHEMA IF NOT EXISTS bdp;
  CREATE EXTERNAL TABLE IF NOT EXISTS bdp.class8_marks
  (roll_no INT,
  ttl_marks INT)
  ROW FORMAT DELIMITED
  FIELDS TERMINATED BY '|'
  STORED AS TEXTFILE
  LOCATION 'hdfs://sandbox-hdp.hortonworks.com:8020/user/root/bdps/stu_marks';

As you can see above, the LOCATION I gave is the HDFS path where my file is present.

Step 2 : Loading the Hive table into Spark using Python

First, open the pyspark shell using the below command:

pyspark shell

  pyspark

Once the CLI is opened, use the below command to load the Hive table:

Load hive table in spark

  stu_marks = spark.table("bdp.class8_marks")

Here you can see stu_marks is the dataframe which contains the data of the Hive table. You can see the data using the show command:

to see the data

  stu_marks.show()
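
With our three sample rows loaded, the output should look roughly like this (values depend on your actual file):

expected output

  +-------+---------+
  |roll_no|ttl_marks|
  +-------+---------+
  |    101|      389|
  |    102|      412|
  |    103|      435|
  +-------+---------+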

Step 3 : Assign Rank to the students

Now let's come to the actual logic for finding the rank of the students.

Please use the below commands to import the required functions:

imports

  from pyspark.sql.functions import rank
  from pyspark.sql.window import Window
  from pyspark.sql.functions import col

assigning ranks

  ranked = stu_marks.withColumn("rank", rank().over(Window.orderBy(col("ttl_marks").desc())))

In the above command, I am using the rank function over the marks column. Since we want to give a higher rank to the student who has scored higher marks, we sort in descending order using desc.
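
One thing to keep in mind: rank() leaves gaps after ties. Here is a minimal sketch using made-up data (roll 104 and its marks are not in our sample file) to show the behaviour:

tie handling (illustration)

  # Hypothetical data: roll 104 has the same marks as roll 102.
  from pyspark.sql.functions import rank, col
  from pyspark.sql.window import Window
  demo = spark.createDataFrame(
      [(101, 389), (102, 412), (103, 435), (104, 412)],
      ["roll_no", "ttl_marks"])
  demo.withColumn("rank", rank().over(Window.orderBy(col("ttl_marks").desc()))).show()
  # 103 gets rank 1, 102 and 104 both get rank 2, and 101 gets rank 4;
  # rank 3 is skipped. Use dense_rank() instead if you don't want gaps.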

The below command will give you the expected result, in which a rank is assigned against each roll number:

to see the final output

  ranked.show()
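
With our three sample rows, the final result should look roughly like this:

expected final output

  +-------+---------+----+
  |roll_no|ttl_marks|rank|
  +-------+---------+----+
  |    103|      435|   1|
  |    102|      412|   2|
  |    101|      389|   3|
  +-------+---------+----+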

You can save the result as per your requirements.
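
For example, here is a minimal sketch that writes the ranked result back to HDFS as pipe separated text; the output path is only illustrative:

save the result (illustration)

  # Write the result to an assumed HDFS path, pipe separated like the input.
  ranked.write.mode("overwrite").option("sep", "|").csv("bdps/stu_marks_ranked")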

Wrapping Up:

Here we have understood how the rank function works and what a simple use case of it looks like. We can assign ranks partition-wise as well; for that, you have to use partitionBy in the window specification (the over clause), as sketched below.
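
Here is a sketch of partition-wise ranking. It assumes a hypothetical "section" column, which our sample file does not have:

partition wise rank (illustration)

  # Hypothetical data with an assumed "section" column, only to
  # illustrate partition-wise ranking; marks.txt has no such column.
  from pyspark.sql.functions import rank, col
  from pyspark.sql.window import Window
  demo = spark.createDataFrame(
      [(101, 389, "A"), (102, 412, "A"), (103, 435, "B")],
      ["roll_no", "ttl_marks", "section"])
  sec_window = Window.partitionBy("section").orderBy(col("ttl_marks").desc())
  demo.withColumn("rank", rank().over(sec_window)).show()
  # Ranks restart within each section: 102 is rank 1 in section A,
  # 103 is rank 1 in section B.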

Rank is also handy if you want to find the nth rank holder: you can simply filter the ranked dataframe on the required rank, as shown below. If you are looking for the same code in Scala instead of Python, please read this blog post.
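
For example, to fetch the second rank holder from the ranked dataframe (2 here is just an example value; col comes from the imports above):

filter on a rank

  # Filter the already-ranked dataframe on the required rank value.
  ranked.filter(col("rank") == 2).show()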

Don't forget to subscribe to our blog.
