Manage tasks

    Available in Classic and VPC

    This guide describes how to create and manage tasks in the Tuning menu, how to check the learning status, how to use the Inference Test feature to evaluate the performance and accuracy of trained tasks, and how to create a test app.

    Create a new task

    By creating a new task, you can retrain part of a pre-trained model on your own dataset and test the result.

    Note

    To create a tuning task, you need a dataset that meets the upload requirements; the more data entries the dataset contains, the better the performance of the tuned model. For more details about datasets, see Dataset.

    The following describes how to create a new task.

    1. In the NAVER Cloud Platform console, click the Services > AI Services > CLOVA Studio JP menus, in that order.
    2. Click the My Product menu.
    3. Click the [Go to CLOVA Studio JP] button.
    4. Click the Tuning menu.
    5. Click the [Create] button for the type you want to work on.
      • Tuning provides a total of 7 task types; each card shows a description and application examples of the type.
    6. In the pop-up window, select a model engine and click the [Create] button.
      • You cannot change the model engine later.
      • For a more detailed description of the model engine, see Engine.
    7. Enter a task name.
    8. Click the file upload area, review the guide on personal and harmful information, and then upload the dataset.
      Caution

      Be sure to upload the dataset file in compliance with its specifications to ensure normal learning progress and performance. See Dataset for more details.

      • For .csv files, you can download the dataset format file by clicking Download format.
      • If you upload the dataset successfully, you can see the dataset file name and size in the file upload area.
      • Even if the dataset is uploaded successfully, improvements may be recommended for optimal performance; click Check cautions to view them.
      • If the dataset upload fails, the file name and the reason for failure appear in the file upload area; click View details to check the specific errors.
    9. Click the [Next] button.
    10. In the Token Calculation pop-up window, check the expected token usage and click the [Learn] button.
    11. Check the contents of the learning wait pop-up window, and then click the [OK] button.
    Note
    • The more tokens the user dataset contains, the longer tuning takes and the higher the cost may be.
    • Before training starts, it may take up to 6 hours to secure a GPU and pre-process the data.

    Dataset

    Check the specifications, precautions, and examples below to create dataset files correctly.

    Dataset file specifications and precautions

    Dataset file specifications and precautions are as follows.

    Common

    • Enter a file name between 2 and 30 characters.
    • Only .csv and .jsonl file extensions are supported.
    • Only the UTF-8 encoding format is supported.
    • You can only upload files with a file size of 50 MB or less.
    • Enter at least 1,000 valid data entries.
    • For document classification tasks, keep the number of data entries per category balanced, and we recommend at least 200 entries per category. In addition, we recommend using a single word without spaces or special characters as each classification label.
      • <example> 30% positive (300 cases), 30% negative (300 cases), 40% neutral (400 cases)
    • When you need to break a line, use '\n'.
    • You are responsible for any problems arising from uploading datasets containing personal information.
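
    The common rules above can be checked locally before uploading. The following is a minimal sketch in Python under the assumption that the rules are interpreted literally; it is not an official CLOVA Studio JP tool, and the file name used is hypothetical.

        # Minimal local pre-check of the common dataset rules above (illustrative sketch only).
        import os

        MAX_SIZE_BYTES = 50 * 1024 * 1024   # files must be 50 MB or less
        MIN_ENTRIES = 1000                  # at least 1,000 valid data entries

        def pre_check(path: str) -> None:
            name, ext = os.path.splitext(os.path.basename(path))
            # The file name length rule is assumed here to exclude the extension.
            assert 2 <= len(name) <= 30, "file name must be 2 to 30 characters"
            assert ext.lower() in (".csv", ".jsonl"), "only .csv and .jsonl are supported"
            assert os.path.getsize(path) <= MAX_SIZE_BYTES, "file exceeds 50 MB"

            # Reading with the utf-8 codec raises UnicodeDecodeError for other encodings.
            with open(path, encoding="utf-8") as f:
                lines = [line for line in f if line.strip()]

            # For .csv, the first line is the "Text","Completion" header, not a data entry.
            entries = len(lines) - 1 if ext.lower() == ".csv" else len(lines)
            assert entries >= MIN_ENTRIES, "fewer than 1,000 valid data entries"

        pre_check("my_dataset.jsonl")   # hypothetical file name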

    .csv file

    • The first row must contain exactly "Text" and "Completion", and the file must consist of only these two columns.
    • Be sure to delete blank rows and columns.
    • Each row (a pair of text and completion) must not exceed 1,000 characters, including spaces. If it does, only part of the dataset is uploaded.
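
    For reference, a .csv dataset that follows these rules would start as follows; the header row is required, and the data rows are purely illustrative.

        Text,Completion
        The delivery was faster than I expected.,positive
        The product arrived damaged and support never replied.,negative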

    .jsonl file

    • Each line must consist of {"Text": "Input value", "Completion": "Desired result"}, and "Input value" and "Desired result" must contain at least one character.
    • Enclose each key and value in double quotation marks (").
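
    For reference, a .jsonl dataset that follows these rules contains one JSON object per line; the content below is purely illustrative.

        {"Text": "When does the store open on weekends?", "Completion": "It opens at 10:00 AM on weekends."}
        {"Text": "Can I change my delivery address?", "Completion": "Yes, as long as the item has not shipped yet."}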

    Example of composing a dataset file

    Examples of composing a dataset file are as follows:

    • For dialog tasks, fill out the dataset as follows to ensure optimal performance (see the example after this list).
      • Enter 3 or more sentences in the Text column, and enter 1 sentence in the Completion column.
      • Unify the speaker in the Completion column to a single person.
      • We recommend composing the data so that the dialog continues naturally from the Text column to the Completion column.
      • Limit the number of speakers to 2.
      • Be sure to specify the speaker before each statement (<example> 'Customer:', 'Seller:').
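
    Putting these rules together, one entry of a dialog dataset in .jsonl form could look like the following. The content is illustrative and mirrors the Inference Test example later on this page; note the '\n' line breaks and the speaker labels.

        {"Text": "Customer: When will it be delivered?\nSeller: You mean what you ordered yesterday?\nCustomer: Yes, that's right.", "Completion": "Seller: It will be delivered tomorrow."}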

    Check learning status

    The following describes how to check the learning status.

    1. In the NAVER Cloud Platform console, click the Services > AI Services > CLOVA Studio JP menus, in that order.
    2. Click the My Product menu.
    3. Click the [Go to CLOVA Studio JP] button.
    4. After selecting [User Name] in the upper right corner, click [My Task] > Tuning.
      • Create a new task: Click to move to the [New Task] tab
      • Card view/list view icons: click to change the task list sorting
      • Waiting for learning: the task is waiting to be trained; click to view the learning wait pop-up window
      • Learning in progress: training is in progress; click to check the estimated time required
      • Learning completed: training is complete; click to check task information and run tests
      • Learning suspended: training has been stopped

    Check completed task information

    Click a completed task to view its information.


    • Creation date and time: Date and time of creation of a new task
    • Learning completion date: Date and time of completion of learning
    • Workflow ID: An ID for identifying a task in learning
    • Problem Type: Task type
    • Model Engine: Type of learned language model
    • Dataset: Dataset file name used for learning
    • Train Loss: A figure showing how well the model fits the dataset. The lower the figure, the smaller the error from the correct answer.
    • Tokens Used: Number of tokens actually used

    Suspend learning

    The following describes how to suspend learning.

    Caution
    • If you stop learning a task that is being trained, you may be charged usage fees for used tokens depending on the learning progress.
    • You cannot resume the suspended learning.
    Note

    You can suspend only tasks waiting for learning or learning in progress.

    1. In the NAVER Cloud Platform console, click the Services > AI Services > CLOVA Studio JP menus, in that order.
    2. Click the My Product menu.
    3. Click the [Go to CLOVA Studio JP] button.
    4. After selecting [User Name] in the upper right corner, click [My Task] > Tuning.
    5. Click the task whose learning you want to stop.
    6. Click the [Stop] button.
    7. Check the contents of the pop-up window of suspending learning, and then click the [Stop] button.
      • If you stop training that is waiting, you will lose all previously uploaded dataset files.
      • If you stop training in progress, you will lose the previously uploaded dataset files and the task, and the tokens estimated when the task was created will be consumed.

    Utilize trained tasks

    CLOVA Studio JP provides the Inference Test feature so you can evaluate the performance and accuracy of a trained task before creating a test app. After completing the test, you can create a test app and share your work via a share URL.

    Inference Test

    The following describes how to test with the Inference Test function.

    1. In the NAVER Cloud Platform console, click the Services > AI Services > CLOVA Studio JP menus, in that order.
    2. Click the My Product menu.
    3. Click the [Go to CLOVA Studio JP] button.
    4. After selecting [User Name] in the upper right corner, click [My Task] > Tuning.
    5. Among the trained tasks, click the desired task to test.
    6. After entering the input value in the Input area, click the [Run] button.


    • You can enter up to 2000 characters including spaces.
    • You can see the completion generated through learning in the Output area.
    • Please verify performance and accuracy through sufficient testing.
    Note
    • We recommend inputting values in a similar length and format to the "Text" of the dataset used for the task.

    • For a dialog task Inference test, enter utterances as follows.

      • Please enter the same number of utterances in the same pattern as the utterances in "Text" of the dataset. <example> If the number of utterances in "Text" is 3, enter 3 utterances in the input area

        Correct example
          Input: Customer: When will it be delivered? Seller: You mean what you ordered yesterday? Customer: Yes, that's right.
          Output: Seller: It will be delivered tomorrow.
        Wrong example
          Input: Customer: When will it be delivered? Seller: It will be delivered tomorrow.
          Output: Seller: Please wait a little longer.
      • Enter utterances in a format similar to the utterances in "Text" of the dataset, including the speaker label. <example>

        Correct example
          Input: Customer: When will it be delivered?
          Output: Seller: It will be delivered tomorrow.
        Wrong example
          Input: When will it be delivered?
          Output: It will be delivered tomorrow.
    • Some of the uploaded datasets are used to verify the performance of the tuned model, so the inference test results may not match the contents of the user dataset.

    Create test app

    The following describes how to create the test app.

    1. In the NAVER Cloud Platform console, click the Services > AI Services > CLOVA Studio JP menus, in that order.
    2. Click the My Product menu.
    3. Click the [Go to CLOVA Studio JP] button.
    4. After selecting [User Name] in the upper right corner, click [My Task] > Tuning.
    5. Click the task you want to create a test app for.
    6. Click the [Create test app] button.
    7. Enter a test app name, and then click the [Create] button.
      • When the test app is created, a test app pop-up window appears.

        clovastudio-playground_testapp_ko

        • You can check the API information of the test app, and you can set whether to use AI Filter. (See CLOVA Studio JP API Guide for details on the API.)
        • curl and Python code samples are provided.
        • You can click [Copy] to copy the API information to the clipboard.
        • You can click [Reissue] to reissue the API Gateway key.
        • You can click View guide to check the AI Filter guide.
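
    As a rough illustration of how the copied API information could be used from Python, the sketch below sends a request with the requests library. The endpoint URL, header names, and body fields here are placeholders rather than the actual CLOVA Studio JP API specification; use the exact values shown in your test app pop-up window and the CLOVA Studio JP API Guide.

        # Illustrative sketch only: replace the URL, headers, and body with the exact
        # values copied from the test app pop-up window (see the CLOVA Studio JP API Guide).
        import requests

        TEST_APP_URL = "https://<endpoint-copied-from-test-app>"   # placeholder
        HEADERS = {
            "Content-Type": "application/json",
            # The pop-up lists the authentication headers (API keys) to send;
            # the header name below is a placeholder, not the real one.
            "<auth-header-name>": "<api-key-copied-from-test-app>",
        }
        body = {"text": "Customer: When will it be delivered?"}    # placeholder field name

        response = requests.post(TEST_APP_URL, headers=HEADERS, json=body)
        print(response.status_code, response.json())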