Test

    Available in Classic and VPC

    Sufficient testing must be performed to ensure user satisfaction with a chatbot. Once model learning is completed, a validation step is required to check whether the chatbot operates as its producer intended and whether any items need to be improved. NAVER Cloud Platform's chatbot builder provides an advanced test tool, and you can test the learned conversations yourself after setting up a test environment.
    Conversation model learning must be completed before tests can be conducted, so build the chatbot first. Tests can't run properly without building the chatbot.

    Test type

    • Manual test: Test by entering questions manually.
    • Automatic test: Test automatically by uploading an Excel file. You can upload a list of questions in a specified format and check the answers extracted through the automatic test at once.
    • Quality test: This is an advanced feature of the automatic test that enables you to check the data quality by automatically conducting tests whenever the beta version changes.

    Manual test

    You can test the learned conversations after configuring the test environment yourself, for example by specifying the messenger to link and the date/time conditions.
    The following describes how to test a built chatbot manually.

    1. From the NAVER Cloud Platform console, click the Services > CLOVA Chatbot > Domain menus, in that order.
    2. Click the [Run builder] button of the domain for which you want to run the chatbot builder.
    3. From the chatbot builder, click the Manual test menu.
    4. Set up a test environment.
      • Test environment: You can select a beta or service environment to conduct the test.
      • Date/Time: You can conduct the test by setting a specific date/time.
      • Messenger: You can select a messenger to use for the test.
      • Context: You can set up a specific context and start the test. If the test needs to be done from the mid-flow of a chatbot service, then you can select a specific context from which the test can be started. However, the welcome message is not supported if you run the test after setting up a context.
      • User variable: You can set a specific value for a user variable and start the test.
      • Voice: You can set up a specific voice and start the test. It's recommended to match this to the voice settings of CLOVA AiCall before starting the test.
    5. Select a test start method.
      • Start with welcome: You can start the test from the welcome message.
      • Start right away: You can skip the welcome message and start testing right away.
    6. Enter questions when the test window appears.
      • For how to test, refer to Conduct test.
      • Click the [View settings] button to check the test environment settings during the test.
      • Click the [Reset] button to change settings and then run the test again.
    7. Check the test result.
    Note

    If you run a manual test in the chatbot builder while the domain type is AiCall, the built-in variables *cicRequest.session.callInfo.callee and *cicRequest.session.callInfo.caller are empty, which makes such scenarios difficult to test. However, you can test scenarios that use the callee and caller information by setting values for these variables when configuring the test.

    Conduct test

    Enter the test query in the test window manually to conduct the test.

    • If you're building an FAQ chatbot service, you should test whether a variety of user utterances that have not been registered as learning sets are matched to the appropriate conversations with the correct intentions. This can be considered a test of the chatbot service's intention analysis performance.
    • If you're building a goal-oriented chatbot service that performs specific tasks, it's recommended to conduct a scenario flow test in addition to the intention analysis performance test. This test checks whether the chatbot executes a defined scenario appropriately across the various conversational flows a user may initiate, and whether any part of the scenario flow has been omitted.
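    The two kinds of checks above can be sketched as a small test harness. Everything here is a hypothetical illustration, not CLOVA Chatbot's API: `send_to_chatbot` stands in for however you actually reach the built chatbot (e.g. a custom API gateway), and the conversation names are invented.

```python
# Minimal sketch of an intention-analysis / scenario-flow check.
# send_to_chatbot() is a hypothetical stand-in for a real call to the
# built chatbot; replace it with your own integration.

def send_to_chatbot(utterance):
    # Hypothetical matcher: returns the name of the matched conversation.
    canned = {
        "what are your opening hours": "FAQ_opening_hours",
        "i want to book a table": "Task_reservation_start",
        "for two people": "Task_reservation_party_size",
    }
    return canned.get(utterance.lower(), "NO_MATCH")

def run_flow_test(flow):
    """flow: list of (utterance, expected conversation name) pairs.
    Returns the turns that did not match as expected."""
    failures = []
    for utterance, expected in flow:
        matched = send_to_chatbot(utterance)
        if matched != expected:
            failures.append((utterance, expected, matched))
    return failures

# A goal-oriented scenario: each user turn should land on the right step.
reservation_flow = [
    ("I want to book a table", "Task_reservation_start"),
    ("for two people", "Task_reservation_party_size"),
]
print(run_flow_test(reservation_flow))  # [] means the flow ran as defined
```

    For an FAQ-style intention test, the same harness works with single-turn pairs of unregistered utterances and the conversations they should match.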

    (Image: chatbot-chatbot-4-4_01_en)

    View test result

    You can see the analyzed information, the response results, and the log information in the tabs on the right side of the test page. Use this information to check whether the chatbot works as its creator intended. In addition to the last turn's information, you can click the speech bubble of any turn in the test page on the left to view that turn's information.

    Analyzed information tab
    You can view the information analyzed during the turn.

    (Image: chatbot-chatbot-4-4_02_en)

    • Entity: You can view the analyzed entity information in the user's utterance.
    • Slot: If a task has started, then you can view the information as to which entity has filled the slot.
    • Context: You can view the context information and the counts deducted or added during the turn.
    • User variable: You can view the information of user variables updated in the turn.
    • Voice settings: If you've set up a voice, then you can click "View more" under the speech bubble to see it.

    Response result
    You can view the information of the answer/message the system has returned. If the query is matched to a specific conversation, you can check the match cause; otherwise, you can see that a failure message or no-response message was returned.
    (Image: chatbot-chatbot-4-4_03_en)

    • Exact match: This refers to a case where a general question identical to the entered test query is registered in the conversation, so an exact match was found.
    • Regular expression match: This refers to a case where the entered test query's pattern matched a regular expression question registered in the conversation.
    • Intent classifier match: This refers to a case where the intent analyzed by the intent classifier for the entered test query matched an intent question registered in the conversation.
    • Model match: This refers to a case where the conversation model analyzed and judged the entered test query to be similar to the matched conversation.
    • Conversation transfer: This refers to a case where the conversation was transferred using the Conversation transfer feature among follow-up actions.
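    As an illustration only, the first four match causes can be pictured as a cascade of checks. This sketch is NOT CLOVA Chatbot's actual logic or ordering; the registered data, the toy intent classifier, and the similarity threshold are all invented to show how the causes differ from one another.

```python
import re

# Illustrative-only sketch of the four match causes listed above.
# All data and thresholds are invented; this is not CLOVA Chatbot's
# real implementation.

EXACT_QUESTIONS = {"how do i reset my password": "Conv_password_reset"}
REGEX_QUESTIONS = [(re.compile(r"order\s+#?\d+"), "Conv_order_lookup")]
INTENT_QUESTIONS = {"greeting": "Conv_welcome"}

def classify_intent(query):
    # Hypothetical intent classifier.
    return "greeting" if "hello" in query else None

def model_similarity(query):
    # Hypothetical conversation-model score: (best conversation, score).
    return ("Conv_smalltalk", 0.42)

def match(query):
    q = query.lower().strip()
    if q in EXACT_QUESTIONS:                      # exact match
        return "exact match", EXACT_QUESTIONS[q]
    for pattern, conv in REGEX_QUESTIONS:         # regular expression match
        if pattern.search(q):
            return "regular expression match", conv
    intent = classify_intent(q)                   # intent classifier match
    if intent in INTENT_QUESTIONS:
        return "intent classifier match", INTENT_QUESTIONS[intent]
    conv, score = model_similarity(q)             # model match
    if score >= 0.4:
        return "model match", conv
    return "no match", None

print(match("How do I reset my password"))
print(match("where is order #1234"))
```

    Reading the match cause in the Response result tab tells you which of these routes produced the answer, which is useful when a query matches the right conversation but for an unexpected reason.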

    View log
    You can view logs. If a failure occurs, submit the raw log of the turn with your inquiry to the customer center to receive a more accurate answer.

    Automatic test

    Write the questions that need to be tested in an Excel file and upload it, and you'll be able to download the result in an Excel file.
    The following describes how you can conduct an automatic test.

    1. From the NAVER Cloud Platform console, click the Services > CLOVA Chatbot > Domain menus, in that order.
    2. Click the [Run builder] button of the domain for which you want to run the chatbot builder.
    3. From the chatbot builder, click the Automatic test menu > [Batch test] tab.
    4. Open the test settings area and set the date/time condition and messenger condition.
    5. Click the [Import] button to register the test file.
      • Only files in the XLS and XLSX format can be uploaded.
      • If you don't have a test file, then click the [Download test template] button to download the template and create the test file. Enter expected questions in column A of the template file.
    6. When the test message pop-up window appears, click the [Confirm] button.
    7. To view the job result, click the [Manage job] button.
    8. Click the Excel file link of the job details item from the job list.
      • You can view the result from the uploaded test file.

    Quality test

    You can check the chatbot quality of each version by performing a quality test. The quality test automatically runs a quality evaluation whenever the beta version changes. Register the important questions to be managed in the domain, along with the conversation names you want matched to those questions, as quality evaluation data. After the evaluation, you can check the pass rate and download a detailed report from the Manage job page.
    The following describes how you can conduct a quality test.

    1. From the NAVER Cloud Platform console, click the Services > CLOVA Chatbot > Domain menus, in that order.
    2. Click the [Run builder] button of the domain for which you want to run the chatbot builder.
    3. From the chatbot builder, click the Automatic test menu > [Quality test] tab.
    4. Click the [Change settings] button in Quality test settings.
    5. Change the quality test service area to ON.
    6. Click the [Upload] button in the quality test file area.
    7. Upload the quality test file in the Upload quality test file window, and click the [OK] button.
      • Only files in the XLS and XLSX format can be uploaded.
      • If you don't have a test file, then click the [Download quality test template] button to download the template and create the test file. (Enter the queries to test in column A and the conversation names to be matched as correct answers in column B.)
    8. Select the date/time condition and messenger to conduct the test.
      • Select "No selection" to remove the date/time condition settings.
    9. Click the [Save] button in Quality test settings.
    10. When a pop-up window appears, click the [Confirm] button.
      • Once the chatbot is built, the quality evaluation will be conducted automatically.
      • The pass rate will be displayed in the quality test window, and the tested files can be viewed from Job result.
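    A minimal sketch of how a pass rate like the one shown in the quality test window can be computed, assuming it is simply the share of queries whose matched conversation equals the expected one from column B (that definition, the row layout, and the conversation names are assumptions for illustration, not the documented evaluation logic):

```python
# Minimal sketch of computing a quality-test pass rate.
# Each row mirrors the template plus an evaluation result:
# (query from column A, expected conversation from column B, matched conversation).

def pass_rate(rows):
    """Return the percentage of rows whose matched conversation
    equals the expected conversation name."""
    if not rows:
        return 0.0
    passed = sum(1 for _, expected, matched in rows if expected == matched)
    return 100.0 * passed / len(rows)

results = [
    ("how do i cancel my order", "Conv_cancel_order", "Conv_cancel_order"),
    ("refund please", "Conv_refund", "Conv_refund"),
    ("talk to a human", "Conv_agent_handoff", "Conv_smalltalk"),
]
print(f"{pass_rate(results):.1f}%")  # 2 of 3 rows passed
```

    Tracking this number across beta builds makes regressions visible: a drop after a build means some registered question no longer matches its intended conversation, and the detailed report from Manage job shows which rows failed.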
