<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0"
             xmlns:atom="http://www.w3.org/2005/Atom"
             xmlns:dc="http://purl.org/dc/elements/1.1/"
             xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
             xmlns:admin="http://webns.net/mvcb/"
             xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
             xmlns:content="http://purl.org/rss/1.0/modules/content/">
        <channel>
            <title>BIG DATA PROGRAMMERS Discussion Board - Recent Topics</title>
            <link>https://bigdataprogrammers.com/qa-forum/</link>
            <description>Q-A Forum For All Big Data Programmers</description>
            <language>en-US</language>
            <lastBuildDate>Thu, 23 Apr 2026 22:33:40 +0000</lastBuildDate>
            <generator>wpForo</generator>
            <ttl>60</ttl>
				                    <item>
                        <title>Unable to parse XML data in Hive</title>
                        <link>https://bigdataprogrammers.com/qa-forum/hive/unable-to-parse-xml-data-in-hive/</link>
                        <pubDate>Thu, 22 Nov 2018 03:04:58 +0000</pubDate>
                        <description><![CDATA[Hi All, I am trying to parse some XML data with the xpath functions in Hive and I am receiving the error below. What I have done so far: Sample data: &lt;Store&gt;
    &lt;Version&gt;1.1&lt;/Version&gt;...]]></description>
                        <content:encoded><![CDATA[<p>Hi All,</p><p>I am trying to parse some XML data with the xpath functions in Hive and I am receiving the error below.</p><p>What I have done so far:</p><p>Sample data:</p><pre><code>&lt;Store&gt;
    &lt;Version&gt;1.1&lt;/Version&gt;
    &lt;StoreId&gt;16695&lt;/StoreId&gt;
    &lt;Bskt&gt;
      &lt;TillNo&gt;4&lt;/TillNo&gt;
      &lt;BsktNo&gt;1753&lt;/BsktNo&gt;
      &lt;DateTime&gt;2017-10-31T11:19:34.000+11:00&lt;/DateTime&gt;
      &lt;OpID&gt;50056&lt;/OpID&gt;
      &lt;Itm&gt;
        &lt;ItmSeq&gt;1&lt;/ItmSeq&gt;
        &lt;GTIN&gt;29559&lt;/GTIN&gt;
        &lt;ItmDsc&gt;CHOCALATE&lt;/ItmDsc&gt;
        &lt;ItmProm&gt;
          &lt;PromCD&gt;CM&lt;/PromCD&gt;
        &lt;/ItmProm&gt;
      &lt;/Itm&gt;
      &lt;Itm&gt;
        &lt;ItmSeq&gt;2&lt;/ItmSeq&gt;
        &lt;GTIN&gt;59653&lt;/GTIN&gt;
        &lt;ItmDsc&gt;CORN FLAKES&lt;/ItmDsc&gt;
      &lt;/Itm&gt;
      &lt;Itm&gt;
        &lt;ItmSeq&gt;3&lt;/ItmSeq&gt;
        &lt;GTIN&gt;42260&lt;/GTIN&gt;
        &lt;ItmDsc&gt; MILK CHOCOLATE 162GM&lt;/ItmDsc&gt;
        &lt;ItmProm&gt;
          &lt;PromCD&gt;MTSRO&lt;/PromCD&gt;
          &lt;OfferID&gt;11766&lt;/OfferID&gt;
        &lt;/ItmProm&gt;
      &lt;/Itm&gt;
    &lt;/Bskt&gt;
    &lt;Bskt&gt;
      &lt;TillNo&gt;5&lt;/TillNo&gt;
      &lt;BsktNo&gt;1947&lt;/BsktNo&gt;
      &lt;DateTime&gt;2017-10-31T16:24:59.000+11:00&lt;/DateTime&gt;
      &lt;OpID&gt;50063&lt;/OpID&gt;
      &lt;Itm&gt;
        &lt;ItmSeq&gt;1&lt;/ItmSeq&gt;
        &lt;GTIN&gt;24064&lt;/GTIN&gt;
        &lt;ItmDsc&gt;TOMATOES 2KG&lt;/ItmDsc&gt;
        &lt;ItmProm&gt;
          &lt;PromCD&gt;INSTORE&lt;/PromCD&gt;
        &lt;/ItmProm&gt;
      &lt;/Itm&gt;
      &lt;Itm&gt;
        &lt;ItmSeq&gt;2&lt;/ItmSeq&gt;
        &lt;GTIN&gt;81287&lt;/GTIN&gt;
        &lt;ItmDsc&gt;ROTHMANS BLUE&lt;/ItmDsc&gt;
        &lt;ItmProm&gt;
          &lt;PromCD&gt;TF&lt;/PromCD&gt;
        &lt;/ItmProm&gt;
      &lt;/Itm&gt;
    &lt;/Bskt&gt;
  &lt;/Store&gt;</code></pre><p>1) Create an external table to read the entire XML data into a single column:</p><pre>CREATE EXTERNAL TABLE poc_scanp_xml_single (xmldata STRING) LOCATION '/DEV/TEST/nanda_test';</pre><p>2) Create a view on top:</p><pre>CREATE VIEW xmldataview (version, storeid, basket_dtm, basket_num, till_number, item_sequence, gtin_number, item_desc, promo_code, offer_id)
AS SELECT
	xpath_string(xmldata, 'Store/Version/text()'),
	xpath_string(xmldata, 'Store/StoreId/text()'),
	xpath_string(xmldata, 'Store/Bskt/DateTime/text()'),
	xpath_string(xmldata, 'Store/Bskt/BsktNo/text()'),
	xpath_string(xmldata, 'Store/Bskt/TillNo/text()'),
	xpath_string(xmldata, 'Store/Bskt/Itm/ItmSeq/text()'),
	xpath_string(xmldata, 'Store/Bskt/Itm/GTIN/text()'),
	xpath_string(xmldata, 'Store/Bskt/Itm/ItmDsc/text()'),
	xpath_string(xmldata, 'Store/Bskt/Itm/ItmProm/PromCD/text()'),
	xpath_string(xmldata, 'Store/Bskt/Itm/ItmProm/OfferID/text()')
FROM POC_SRC_IGA_SCAN_BASKET_ITEM_PROMO_SINGLE;</pre><p>Error while reading the data from the view:</p><pre>Bad status for request TFetchResultsReq(fetchType=0, operationHandle=TOperationHandle(hasResultSet=True, modifiedRowCount=None, operationType=0, operationId=THandleIdentifier(secret='\x8c\x8c\xd4\xa6\xa8\x84F\x14\x80\xbb\x01\xc8$\x85\xaa\x9b', guid='~}\\\xef%\x0eI\x99\xbc\xff\x85p /\xbe5')), orientation=4, maxRows=100): TFetchResultsResp(status=TStatus(errorCode=0, errorMessage='java.io.IOException: org.apache.hadoop.hive.ql.metadata.HiveException: Unable to execute method public org.apache.hadoop.io.Text org.apache.hadoop.hive.ql.udf.xml.UDFXPathString.evaluate(java.lang.String,java.lang.String) on object org.apache.hadoop.hive.ql.udf.xml.UDFXPathString@6dfa6b18 of class org.apache.hadoop.hive.ql.udf.xml.UDFXPathString with arguments { &lt;Store&gt;:java.lang.String, Store/Version/text():java.lang.String} of size 2', sqlState=None, infoMessages=, statusCode=3), results=None, hasMoreRows=None)</pre>]]></content:encoded>
                        <category domain="https://bigdataprogrammers.com/qa-forum/"></category>
                        <dc:creator>Nanda kumar</dc:creator>
                        <guid isPermaLink="true">https://bigdataprogrammers.com/qa-forum/hive/unable-to-parse-xml-data-in-hive/</guid>
                    </item>
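Editor's note on the question above: the error's argument list shows the UDF receiving only the first physical line of the file (the opening Store tag) as its XML input. A plain TEXTFILE external table splits its input on newlines, so each value of xmldata holds one line, which is not a well-formed document on its own. Note also that Hive's xpath_string returns only the first matching node, so even with the whole document in a single row the view would yield one row, not one per Itm. As a hedged illustration only (standard-library Python, not the poster's Hive environment; the document is built programmatically here to keep this feed readable), flattening to one row per Itm looks like this:

```python
# Sketch under stated assumptions; element names taken from the sample above.
import xml.etree.ElementTree as ET

# Build a tiny version of the sample document.
store = ET.Element("Store")
ET.SubElement(store, "Version").text = "1.1"
ET.SubElement(store, "StoreId").text = "16695"
bskt = ET.SubElement(store, "Bskt")
ET.SubElement(bskt, "BsktNo").text = "1753"
for seq, desc in (("1", "CHOCALATE"), ("2", "CORN FLAKES")):
    itm = ET.SubElement(bskt, "Itm")
    ET.SubElement(itm, "ItmSeq").text = seq
    ET.SubElement(itm, "ItmDsc").text = desc

# Flatten: one output row per Itm, repeating the basket- and store-level
# fields, which is what the view's xpath expressions appear to be aiming for.
rows = []
for bskt in store.findall("Bskt"):
    for itm in bskt.findall("Itm"):
        rows.append((
            store.findtext("StoreId"),
            bskt.findtext("BsktNo"),
            itm.findtext("ItmSeq"),
            itm.findtext("ItmDsc"),
        ))

for row in rows:
    print(row)
```

The per-item explosion done by the nested loops is exactly what a single xpath_string call cannot do, since it collapses every match to the first one.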
				                    <item>
                        <title>Problem statement - help needed MapReduce Program</title>
                        <link>https://bigdataprogrammers.com/qa-forum/hdfs/problem-statement-help-needed-mapreduce-program/</link>
                        <pubDate>Tue, 06 Nov 2018 22:43:34 +0000</pubDate>
                        <description><![CDATA[An online song website has millions of records in files. I have been provided with 31 days of data in a flat file (songid, userid, timestamp). I have to pick the top 100 trending songs for days 25-31, where trending means the most fre...]]></description>
                        <content:encoded><![CDATA[<p>An online song website has millions of records in files.</p><p>I have been provided with 31 days of data in a flat file (songid, userid, timestamp).</p><p>I have to pick the top 100 trending songs streamed online during days 25-31, where trending means the most frequently played songs.</p><p>I need to develop a MapReduce program for this.</p><p>Thank you</p>]]></content:encoded>
                        <category domain="https://bigdataprogrammers.com/qa-forum/"></category>
                        <dc:creator>Shaikh Shahid</dc:creator>
                        <guid isPermaLink="true">https://bigdataprogrammers.com/qa-forum/hdfs/problem-statement-help-needed-mapreduce-program/</guid>
                    </item>
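Editor's note on the question above: the described task is a count-and-rank job that maps cleanly onto one MapReduce pass plus a ranking step. A minimal local sketch in Python (an illustration under assumptions, not a Hadoop job: day numbers stand in for the raw timestamps, and the field separator is assumed to be a comma):

```python
# Sketch under stated assumptions (comma-separated songid,userid,day lines).
import heapq
from collections import Counter

def map_phase(lines, days):
    """Mapper: emit (songid, 1) for each play that falls in the day window."""
    for line in lines:
        songid, userid, day = line.split(",")
        if int(day) in days:          # drop plays outside days 25-31
            yield songid, 1

def reduce_phase(pairs):
    """Reducer: sum the emitted counts per songid."""
    counts = Counter()
    for songid, one in pairs:
        counts[songid] += one
    return counts

def top_trending(lines, days=range(25, 32), n=100):
    counts = reduce_phase(map_phase(lines, days))
    # Rank by play count, highest first.
    return heapq.nlargest(n, counts.items(), key=lambda kv: kv[1])

plays = [
    "s1,u1,25", "s1,u2,26", "s2,u1,24",   # s2 falls outside the window
    "s3,u3,31", "s1,u3,27",
]
print(top_trending(plays, n=2))
```

In a real Hadoop job the mapper would emit the (songid, 1) pairs, the framework would shuffle by songid, and the reducer would sum; the top-100 ranking can then be a second job, or a single reducer keeping a bounded heap.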
							        </channel>
</rss>