Using Data Forest CLI

    Available in VPC

    Data Forest provides a command line interface (CLI). With Data Forest CLI, users can run the deep learning solutions and programs they want, with GPUs assigned dynamically. This guide explains how to use Data Forest CLI through a usage scenario.

    Preparations

    You must create a Data Forest account and a notebook, and configure the cluster environment.

    1. Connect to the console in NAVER Cloud Platform.
    2. Click the [Create Account] button from Services > Big Data & Analytics > Data Forest > Accounts menu.
    3. Create an account that will submit a Data Forest app.
    4. Click Services > Big Data & Analytics > Data Forest > Notebooks > [Create notebook], and create a notebook.
    5. Configure the environment so that access to a Data Forest cluster is enabled.

    Using Data Forest CLI

    Step 1. Check notebook information

    The following describes how to check the notebook details.

    1. From the NAVER Cloud Platform console, click the Services > Big Data & Analytics > Data Forest menu, in that order.

    2. Click the Notebooks menu on the left. You can see the list of the notebooks you've created.

      Name | Description
      Notebook name | Name of the created notebook
      Account | Account that owns the notebook
      Notebook type | Type of the created notebook; only the JupyterLab type is provided as of now
      Condition | Status of the notebook node
      Server specifications | Server specifications of the notebook node
      VPC | VPC in which the notebook was created
      Subnet | Subnet applied to the notebook node
      Creation time | Date and time when the notebook was created
    3. Click the button at the end of the notebook list entry to see the notebook details.

      Item | Description
      Account name | Account that owns the notebook
      Notebook type | Type of the created notebook; only the JupyterLab type is provided as of now
      Notebook name | Name of the created notebook
      Notebook ID | Unique notebook ID
      Server specifications | Server specifications of the notebook node
      VPC | VPC in which the notebook was created
      Subnet | Subnet applied to the notebook node
      ACG | ACG applied to the notebook node
      Additional storage | Additional storage information
      Domain | Domain assigned to the public IP
      Authentication key name | Name of the authentication key applied to the notebook
      SSH access account | OS account name for accessing the notebook node directly over SSH
      Set user | User settings applied to the notebook
      Bucket | Object Storage bucket information

    Access notebook web page

    The following describes how to access a notebook's web page:

    1. Before proceeding, ensure that JupyterLab's port 80 is added to the ACG of the notebook.
    2. From the Notebooks menu, click Go to domain from the notebook's details screen.
      • If the notebook nodes are created within a Public Subnet, the web page can be accessed directly through the public IP without additional tunneling settings.
    3. Once the login screen of the JupyterLab web page appears, enter your password to log in.
      • The password is the one set in the Access Password field on the user settings screen during the notebook creation process.
      • If you forgot your password or need to change it, click the [details/reset] button for user settings on the notebook details screen to reset it.

    Step 2. Downloading Data Forest CLI

    Open a terminal from the notebook.

    Download the Data Forest CLI executable file.

    $ wget http://dist.kr.df.naverncp.com/repos/df/notebook/static/dfctl
    $ chmod +x dfctl
    

    Step 3. Proceed with authentication and access approval

    1. Request login with the notebook ID found on the notebook details page (see Step 1).
    $ ./dfctl login -i {notebook ID}
    
    2. Check the authentication request information.

      Item | Description
      deviceCode | Temporary code identifying the CLI instance currently being approved
      userCode | Code used to verify the requesting user
      verificationUri | Temporary URL for carrying out authentication

    3. Open the verificationUri, confirm that the deviceCode matches the one shown in the CLI, and enter the userCode.

    4. Complete the NCP SSO authentication on the page that appears.

    Caution

    The login step may be skipped if a recent access history exists in the browser cookies.

    5. Click [Approve authentication] on the device access approval screen.

    6. Once access is approved, the CLI prints access allowed.

    Data Forest CLI

    The commands available in Data Forest CLI are as follows:

    Usage:

    $ ./dfctl [Command] [SubCommand]...
    

    Commands:

    • hdfs

    Data Forest CLI: hdfs

    chmod

    Usage:

    $ ./dfctl hdfs chmod [FLAGS] FILE
    

    Flags:

    • -R, --recursive: applies the change recursively to all directories and subdirectories.

    Output:

    • Changes the access permissions of the specified file or object.

    <example>

    [forest@2e11777d04cf ~][edge-df_beta]$ ./dfctl hdfs chmod 777 /user/test-df/dst_file
    
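The long-option name (--recursive) suggests chmod here follows the familiar coreutils convention. A minimal sketch combining a recursive and a single-file change; the paths are hypothetical examples:

```shell
# Sketch (hypothetical paths): open up a directory tree recursively,
# then tighten the permissions of a single file inside it.
$ ./dfctl hdfs chmod -R 755 /user/test-df/test
$ ./dfctl hdfs chmod 600 /user/test-df/test/secret_file
```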

    chown

    Usage:

    $ ./dfctl hdfs chown [FLAGS] FILE
    

    Flags:

    • -r, --recursive: applies the change recursively to all directories and subdirectories.

    Output:

    • Changes the ownership of a file or directory.

    <example>

    [forest@2e11000d04cf ~][edge-df_beta]$ ./dfctl hdfs chown -r forest /user/test-df
    

    get

    Usage:

    $ ./dfctl hdfs get [REMOTE_FILE] [LOCAL_FILE]
    

    Flags:

    • N/A

    Output:

    • Copies the file from remote to local.

    <example>

    [forest@2e11000d04cf ~][edge-df_beta]$ ./dfctl hdfs get /test/df /test/local
    

    ls

    Usage:

    $ ./dfctl hdfs ls [FLAGS] FILE
    

    Flags:

    • -a, --all: shows hidden files and directories.
    • -H, --human-readable: prints file sizes in human-readable units (K, M, G).
    • -l, --long: prints detailed information.
    • -R, --recursive: lists all directories and subdirectories recursively.

    Output:

    • Displays the file list of the current directory.

    <example>

    [forest@2e11777d04cf ~][edge-df_beta]$ ./dfctl hdfs ls
    default user dir: /user/test-df
    dst_file
    src_file
    test
    
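The flags above follow the familiar coreutils ls conventions and can be combined. A minimal sketch; the path is a hypothetical example:

```shell
# Sketch: long listing including hidden entries, with human-readable
# sizes, descending into subdirectories.
$ ./dfctl hdfs ls -a -l -H -R /user/test-df
```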

    mkdir

    Usage:

    $ ./dfctl hdfs mkdir [FLAGS] DIR
    

    Flags:

    • -p, --parents: creates the parent path as well.

    Output:

    • Creates a new directory.

    <example>

    [forest@2e11000d04cf ~][edge-df_beta]$ ./dfctl hdfs mkdir /user/test-df/test
    
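With -p, missing intermediate directories are created in one call, as with coreutils mkdir --parents. A sketch with a hypothetical path:

```shell
# Sketch: create a nested path, including any missing parent directories.
$ ./dfctl hdfs mkdir -p /user/test-df/a/b/c
```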

    mv

    Usage:

    $ ./dfctl hdfs mv [FLAGS] [SRC_FILE] [DST_FILE]
    

    Flags:

    • -n, --no-clobber: does not overwrite an existing file at the destination.
    • -T, --no-target-directory: treats the destination as a normal file rather than a directory.

    Output:

    • Moves a file or directory to another location, or renames it.

    <example>

    [forest@2e11777d04cf ~][edge-df_beta]$ ./dfctl hdfs mv /user/test-df/src_file /user/test-df/dst_file2
    [forest@2e11777d04cf ~][edge-df_beta]$ ./dfctl hdfs ls /user/test-df
    dst_file2
    
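The -n flag keeps an existing destination intact instead of overwriting it, matching the coreutils mv --no-clobber convention. A sketch with hypothetical paths:

```shell
# Sketch: the rename below leaves dst_file untouched if it already exists.
$ ./dfctl hdfs mv -n /user/test-df/src_file /user/test-df/dst_file
```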

    put

    Usage:

    $ ./dfctl hdfs put [LOCAL_FILE] [REMOTE_FILE]
    

    Flags:

    • N/A

    Output:

    • Copies the file from local to remote.

    <example>

    [forest@2e11000d04cf ~][edge-df_beta]$ ./dfctl hdfs put /test/local /test/df
    
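put and get together give a simple way to check a transfer end to end. A sketch; the file names and paths are hypothetical examples:

```shell
# Sketch: upload a local file, then download it back under another name.
$ echo "hello" > /tmp/sample.txt
$ ./dfctl hdfs put /tmp/sample.txt /user/test-df/sample.txt
$ ./dfctl hdfs get /user/test-df/sample.txt /tmp/sample_copy.txt
```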

    rm

    Usage:

    $ ./dfctl hdfs rm [FLAGS] FILE
    

    Flags:

    • -R, --recursive: deletes directories and their contents recursively.
    • -f, --force: does not report an error when the file does not exist.

    Output:

    • Deletes the file.

    <example>

    [forest@2e11000d04cf ~][edge-df_beta]$ ./dfctl hdfs rm /user/test-df/a
    
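-R and -f can be combined to remove a whole directory tree without failing on missing entries, as in coreutils rm. A sketch with a hypothetical path:

```shell
# Sketch: delete a directory tree; -f suppresses errors for missing files.
$ ./dfctl hdfs rm -R -f /user/test-df/test
```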

    touch

    Usage:

    $ ./dfctl hdfs touch [FLAGS] FILE
    

    Flags:

    • -c, --no-create: updates the file's timestamp without creating the file if it does not exist.

    Output:

    • Creates an empty file, or updates the timestamp of an existing file.

    <example>

    [forest@2e11000d04cf ~][edge-df_beta]$ ./dfctl hdfs touch /user/test-df/a
    
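With -c, only the timestamp of an existing file is updated; a missing file is not created, matching the coreutils touch --no-create convention. A sketch with a hypothetical path:

```shell
# Sketch: update the timestamp only if the file already exists;
# nothing is created when the file is missing.
$ ./dfctl hdfs touch -c /user/test-df/maybe_missing
```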
