The latest service changes have not yet been reflected in this content. We will update the content as soon as possible. Please refer to the Korean version for information on the latest updates.
Available in VPC
Hive external tables are tables created with the EXTERNAL keyword in the CREATE TABLE statement. Unlike a managed table, a Hive External Table does not store its data in the directory specified by the hive.metastore.warehouse.dir property; instead, it reads data from the path specified when the table is created.
Since external tables are created on top of data that already exists in Hadoop, you can create them by simply defining the schema.
You can also see examples related to Hive External Tables in the Use Hive guide.
This guide explains how to use CSV data stored in Object Storage by connecting it to Hive External Table provided by Cloud Hadoop.
Prepare sample data
Use the temperature data provided by the Korea Meteorological Administration Meteorological Data Open Portal as sample data to test Hive External Table.
The following describes how to prepare sample data.
- Download Seoul data and Busan data.
- The search conditions for the sample data are as follows:
  - Category: ground
  - Region/Branch: Seoul or Busan
  - Component: temperature
  - Period: select Day and set the range to 2011 to 2021
  - Condition: select the checkboxes for Month and Day
- Check the content of the downloaded CSV file. Counting from the top, the header is on line 12, and the temperature data starts at line 13.

Problems with viewing Korean characters can be solved by changing the encoding. For details, check the notes about problems with viewing Korean characters in Create Hive External Table.
- Remove the unnecessary header lines from the downloaded CSV files and save them under new names using the following commands.
$ sed '1,12d' extremum_20230620155750.csv > data1.csv
$ sed '1,12d' extremum_20230620155821.csv > data2.csv
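If sed is not available in your environment, the same header removal can be sketched in Python. This is only an equivalent of the sed commands above; the line count (12) comes from the CSV layout described earlier, and the example filenames follow the sed commands.

```python
def strip_header(src, dst, skip=12):
    """Drop the first `skip` lines (metadata and header) from a KMA CSV,
    mirroring the `sed '1,12d'` commands above. The files are EUC-KR encoded."""
    with open(src, encoding="euc-kr") as fin, \
         open(dst, "w", encoding="euc-kr") as fout:
        for i, line in enumerate(fin):
            if i >= skip:  # keep only the data rows (line 13 onward)
                fout.write(line)

# Example usage, matching the sed commands above:
# strip_header("extremum_20230620155750.csv", "data1.csv")
# strip_header("extremum_20230620155821.csv", "data2.csv")
```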
Upload data to Object Storage
The following describes how to upload data to Object Storage.
- Create a live-test-bucket bucket in Object Storage.
- Create the hivedata directory in the created bucket, and upload the sample data (CSV) files into it.

For more information on creation of buckets, see the Object Storage guide.
Create Hive External Table
The following describes how to create Hive External Tables.
- Connect to the edge node of the Cloud Hadoop cluster via SSH.
  - For more information about how to connect to cluster nodes via SSH, see the Connecting to cluster nodes via SSH guide.
- Create an external table using the Hive client command.
- Once the weather table is created, you can check the data by running SELECT queries.

CREATE EXTERNAL TABLE weather (
  no STRING,
  area STRING,
  day STRING,
  avg FLOAT,
  max FLOAT,
  maxTime STRING,
  min FLOAT,
  minTime STRING,
  diff FLOAT
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
STORED AS TEXTFILE
LOCATION 's3a://live-test-bucket/hivedata';

SELECT count(*) FROM weather;
+-------+
|  _c0  |
+-------+
| 7913  |
+-------+

SELECT * FROM weather WHERE day = '2011-01-01';
+-------------+---------------+--------------+--------------+--------------+------------------+--------------+------------------+---------------+
| weather.no  | weather.area  | weather.day  | weather.avg  | weather.max  | weather.maxtime  | weather.min  | weather.mintime  | weather.diff  |
+-------------+---------------+--------------+--------------+--------------+------------------+--------------+------------------+---------------+
| 159         | Busan         | 2011-01-01   | -1.1         | 4.1          | 14:55            | -5.8         | 06:40            | 9.9           |
| 108         | Seoul         | 2011-01-01   | -6.8         | -2.9         | 14:57            | -10.4        | 01:54            | 7.5           |
+-------------+---------------+--------------+--------------+--------------+------------------+--------------+------------------+---------------+
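Before loading, you can sanity-check locally that each prepared data row has the nine comma-separated fields the CREATE TABLE statement expects. A minimal sketch in Python; the column names come from the CREATE EXTERNAL TABLE statement above and the sample row from the query output, but the check itself is illustrative and not a required step of this guide.

```python
import csv
import io

# Column names follow the CREATE EXTERNAL TABLE statement above.
COLUMNS = ["no", "area", "day", "avg", "max", "maxTime", "min", "minTime", "diff"]

# One data row, as shown in the SELECT query output above.
sample = "159,Busan,2011-01-01,-1.1,4.1,14:55,-5.8,06:40,9.9"

# Parse the CSV row and pair each value with its column name.
row = dict(zip(COLUMNS, next(csv.reader(io.StringIO(sample)))))
assert len(row) == len(COLUMNS)  # every schema field is present
print(row["area"], float(row["min"]))  # -> Busan -5.8
```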
Problems with viewing Korean characters
Korean characters can't be displayed correctly because the data provided by the Korea Meteorological Administration is encoded in EUC-KR rather than UTF-8. If you change the encoding of the Hive external table to EUC-KR as shown below, the characters are displayed correctly.
ALTER TABLE weather SET TBLPROPERTIES('serialization.encoding'='euc-kr');
SELECT * FROM weather WHERE day = '2011-01-01';
+-------------+---------------+--------------+--------------+--------------+------------------+--------------+------------------+---------------+
| weather.no  | weather.area  | weather.day  | weather.avg  | weather.max  | weather.maxtime  | weather.min  | weather.mintime  | weather.diff  |
+-------------+---------------+--------------+--------------+--------------+------------------+--------------+------------------+---------------+
| 159         | Busan         | 2011-01-01   | -1.1         | 4.1          | 14:55            | -5.8         | 06:40            | 9.9           |
| 108         | Seoul         | 2011-01-01   | -6.8         | -2.9         | 14:57            | -10.4        | 01:54            | 7.5           |
+-------------+---------------+--------------+--------------+--------------+------------------+--------------+------------------+---------------+
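The encoding mismatch above can be reproduced locally. A small Python sketch showing why EUC-KR bytes turn into garbage when read as UTF-8 (the sample string "서울" means Seoul; this only illustrates the mismatch that the ALTER TABLE statement fixes):

```python
# "서울" (Seoul) encoded as EUC-KR, the encoding used by the KMA CSV files.
raw = "서울".encode("euc-kr")

# Interpreting those bytes as UTF-8 fails: they are not valid UTF-8 sequences.
try:
    raw.decode("utf-8")
except UnicodeDecodeError:
    print("not valid UTF-8")  # this branch is taken

# Decoding with the correct charset, as serialization.encoding tells Hive
# to do, recovers the original text.
print(raw.decode("euc-kr"))  # -> 서울
```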
Delete Hive External Table
The following describes how to delete Hive External Tables.
- Delete the Hive External Table you created using the following command.
  - Below is an example of deleting the Hive External Table (weather) created earlier.
DROP TABLE weather;
- Run a SELECT query on the deleted table. The query fails with an error, which confirms that the table has been successfully deleted.
SELECT * FROM weather;
Error: Error while compiling statement: FAILED: SemanticException [Error 10001]: Line 1:14 Table not found 'weather' (state=42S02,code=10001)

Even if the Hive External Table is deleted, the CSV files in Object Storage are not deleted and remain intact, because dropping an external table removes only its metadata.