Load multi character delimited file into hive

Requirement

You have a file which is delimited by multiple characters (%$) and you want to create a Hive table on top of it.

 

Solution

Step 1: Sample File

Create a sample file named sample_1.txt. (You can skip this step if you already have a file; just place it in a local directory.)
Put content in the file, with fields delimited by %$. If you created the file on Windows, transfer it to your Linux machine via WinSCP.

I have a local directory named /root/local_bdp/posts/load-multi-character-delimited-file-into-hive, so I have placed sample_1.txt in that directory.
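If you prefer to create the sample file directly on the Linux machine, a heredoc works too. The rows below are purely illustrative (any %$-delimited lines with three fields matching the id, u_type, s_code layout used later will do), and the /tmp path is an assumption; use your own directory:

```shell
# Create a hypothetical sample_1.txt; the quoted 'EOF' stops the shell
# from expanding anything inside the heredoc, so %$ is written literally.
mkdir -p /tmp/local_bdp/posts/load-multi-character-delimited-file-into-hive
cat > /tmp/local_bdp/posts/load-multi-character-delimited-file-into-hive/sample_1.txt <<'EOF'
1%$new%$200
2%$returning%$404
3%$new%$301
EOF
```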

Step 2: Copy File to HDFS

Run the commands below in the shell for the initial setup.
First, create an HDFS directory named hv_mlt_chr:

hadoop fs -mkdir hv_mlt_chr

Put the file into HDFS:

hadoop fs -put /root/local_bdp/posts/load-multi-character-delimited-file-into-hive/sample_1.txt hv_mlt_chr/

Check whether the file is available in HDFS:

hadoop fs -ls hv_mlt_chr/

 

NOTE: For me, the default HDFS directory is /user/root/.
You can check the content of the file:

hadoop fs -cat hv_mlt_chr/sample_1.txt
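Before defining the table, it can be worth sanity-checking that every row really has the expected number of fields. A minimal awk sketch, run against a local copy of the file (the path and rows below are assumptions; you could equally pipe `hadoop fs -cat hv_mlt_chr/sample_1.txt` into awk):

```shell
# Hypothetical local copy of the data, written here only to make the
# example self-contained.
printf '1%%$new%%$200\n2%%$returning%%$404\n3%%$new%%$301\n' > /tmp/sample_1.txt

# Split each line on the literal string '%$' (the [$] keeps the dollar
# sign from acting as a regex anchor) and flag rows without 3 fields.
awk -F '%[$]' 'NF != 3 { print "bad row " NR ": " $0; bad = 1 } END { exit bad }' /tmp/sample_1.txt
```

A non-zero exit code means at least one malformed row was found.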

Step 3: Create Hive Table and Load data

Now that the file is in HDFS, you just need to create an external table on top of it.
Use the Hive script below to create an external table userTrck in the schema bdp. Run it in the Hive CLI.

 
 
CREATE SCHEMA IF NOT EXISTS bdp;
CREATE EXTERNAL TABLE bdp.userTrck
(id STRING,
u_type STRING,
s_code STRING)
ROW FORMAT SERDE 'org.apache.hadoop.hive.contrib.serde2.MultiDelimitSerDe'
WITH SERDEPROPERTIES ("field.delim"="%$")
LOCATION 'hdfs://sandbox.hortonworks.com:8020/user/root/hv_mlt_chr';

You can change the delimiter as per your requirement; in this case the delimiter is %$, so "%$" is set in the SERDEPROPERTIES. Make sure the SerDe class is MultiDelimitSerDe. On newer Hive releases the class may live at org.apache.hadoop.hive.serde2.MultiDelimitSerDe rather than in the contrib package.
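On some clusters the contrib SerDe is not on Hive's classpath by default, and the CREATE TABLE statement above typically fails with a ClassNotFoundException. Adding the hive-contrib jar in the same session is a common fix; the jar path below is an assumption, adjust it to your installation:

```sql
-- Hypothetical path; locate hive-contrib*.jar under your Hive install
ADD JAR /usr/hdp/current/hive-client/lib/hive-contrib.jar;
```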

Step 4: Verify data

Check whether the data shows up in the table:

select * from bdp.userTrck;

Wrapping Up

Data sometimes arrives from a system that produces multi-character delimited files, such as a website or another logging system. A multi-character delimiter is useful when the data may already contain every common special symbol, which makes separating one column from another with a single character unreliable. In that case, choosing a multi-character delimiter that cannot occur anywhere in the data as a value solves the problem.

Keep Learning.
