Cloud Log Analytics overview


Available in Classic and VPC

Cloud Log Analytics is a NAVER Cloud Platform service that lets you collect, analyze, and store the various logs created while using NAVER Cloud Platform services.

Cloud Log Analytics features

Cloud Log Analytics provides the following features:

  • Real-time log collection: you can collect all log files created by NAVER Cloud Platform services, including Server, as text in real time and view them on the dashboard.
  • Search logs: you can search all logs collected from multiple services by applying various options, such as search by time, keyword search, and Lucene query search.
  • Log data storage and download: two or more copies of the collected data are distributed across logically separated, independent repositories, and the data can be downloaded as an Excel file or exported to Object Storage.
  • Various log management and analysis features (coming soon): real-time keyword notifications, custom dashboards, a RESTful API, and integrated log collection from NAVER Cloud Platform services.

Cloud Log Analytics user guides

Cloud Log Analytics is available in the Korea, U.S., Singapore, Japan, and Germany Regions, and the same service is provided in each Region. This guide walks you through the information you need to start using Cloud Log Analytics.

Cloud Log Analytics related resources

In addition to the user guides, we offer various related resources to help you better understand Cloud Log Analytics. If you're a developer or marketer who needs detailed information while considering adopting Cloud Log Analytics for your company or establishing data-related policies, make good use of the following resources:

Cloud Log Analytics FAQs

You can get answers to common questions quickly by checking the FAQs before reading the user guides. If the FAQs don't resolve your question, see the user guides to find the information you need.

Q. What kind of data can I collect?
A. Cloud Log Analytics can collect any kind of log data file that is created in text format. Templates are provided for frequently used logs, and logs without templates can also be collected using the custom log feature. For more information on available log templates, see Cloud Log Analytics prerequisites.

Q. How is stored data managed?

  • Two or more copies of your data are distributed across logically separated, independent repositories.
  • The data is then deleted in the following cases:
    • Once the retention period has passed, data older than the retention period (up to the previous day) is deleted every day.
    • If the size of the data exceeds the storage capacity, up to 30% of the oldest data is deleted sequentially.
    • All data is deleted when you unsubscribe from Cloud Log Analytics.

Q. Can I download all saved logs?
A. You can download all of the data through Export to Object Storage. For more information, see Export to Object Storage or Export Log.
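Because NAVER Cloud Platform Object Storage provides an S3-compatible API, logs exported to a bucket can also be fetched in bulk with any S3-compatible client. The following is a minimal sketch using the AWS CLI: the bucket name and the endpoint are placeholders for your own bucket and the Object Storage endpoint of your Region, and the CLI is assumed to be configured with your API authentication key as the access key and secret key.
# Placeholder bucket name and Object Storage endpoint; replace both to match your environment
aws s3 sync s3://my-cla-export-bucket/ ./cla-logs/ --endpoint-url https://<object-storage-endpoint>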

Q. I can't unsubscribe.
A. If a NAVER Cloud Platform PaaS service such as Cloud DB for MySQL, Cloud DB for MSSQL, or Cloud DB for MongoDB is using the log management features of Cloud Log Analytics, you cannot unsubscribe from Cloud Log Analytics. Unsubscribe from or return the applicable service first, and then unsubscribe from Cloud Log Analytics.

Q. Can I try it as a test?
A. The Standard pricing plan of Cloud Log Analytics provides the log collection feature and an average monthly log storage capacity of 1 GB for free. For more information, see Usage fees.

Q. Can the log of the Auto Scaling server be automatically sent to Cloud Log Analytics?
A. If you create an Auto Scaling group using an image of a server that is already set up for collection in Cloud Log Analytics, the logs of the servers created by Auto Scaling are transferred to Cloud Log Analytics without any additional settings. For more information, see Collect logs.

Q. During automatic export, a "no permission of object storage" error occurs.
A. Check the following items in order:

  1. Check whether the bucket has been deleted
    Check whether the Object Storage bucket set as the automatic export target has been deleted.
  2. Check the API authentication key
    Check whether an API authentication key is registered in [My page > Account management > Authentication key management]. Without the authentication key, access to Object Storage is limited.
  3. Check the bucket access permission settings
    If access control is set on the Object Storage bucket, or if permissions have been removed, automatic export may fail. Check that the read/insert permissions are properly granted.
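To verify items 1 and 3 from a terminal, you can query the bucket through the S3-compatible API with the same API authentication key that automatic export uses. This is a minimal sketch with the AWS CLI; the bucket name and the endpoint are placeholders for your own bucket and the Object Storage endpoint of your Region.
# A deleted bucket typically returns a NoSuchBucket (404) error; missing permissions return AccessDenied (403)
aws s3api head-bucket --bucket my-cla-export-bucket --endpoint-url https://<object-storage-endpoint>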

If the problem persists even after checking the above items, contact Customer Center.

Q. During automatic export, a "no access key" issue occurs.
A. This issue occurs when there is no Access Key available to access Object Storage. The automatic export feature internally uses your API authentication key. Therefore, check the following items:

  • Check if the account is deleted
    If the account for which automatic export is configured is deleted or disabled, the issue may occur.
  • Check the validity of the API authentication key
    Check whether a valid API authentication key is registered for the account for which automatic export is configured.
    → Path to check: [My page > Account management > Authentication key management]
  • Check the account used for the automatic export settings
    Disable the automatic export feature in the main account or other sub accounts, and then configure automatic export again in an account that has an API authentication key. Log transfer will then be performed properly using that account's authentication key.
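To confirm that the registered API authentication key actually works against Object Storage, one simple check is to list the buckets through the S3-compatible API with that key configured in the AWS CLI. The endpoint is a placeholder for the Object Storage endpoint of your Region.
# Assumes the AWS CLI is configured with the account's API authentication key as access key/secret key
aws s3 ls --endpoint-url https://<object-storage-endpoint>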

Q. The automatic export feature has been improved. How does it differ from the previous version?
A. The enhanced automatic export feature has the following main differences from the previous method:

  • Log file storage method changed
    Previously, all logs were stored in a single file; in the enhanced feature, they are stored in separate files by date, log type, or time.
    → This segments the logs so they are easier to manage and analyze.
  • Storage path changed
    The enhanced feature changes the path structure in Object Storage where logs are stored. When you set up automatic export, check the access permissions and settings against the new path standard.

The enhanced automatic export feature is designed for more efficient log management and analysis. Set it up and use it with the new method.

Q. The Agent installation has been completed and the "Finish Installation" message is displayed, but logs are not transferred.
A. If logs are not transferred even though the installation completed properly, check the following items:

  • Check the firewall or ACL settings
    Log transfer may be blocked by firewall or Access Control List (ACL) settings on the device where the Agent is installed.
    Check whether the port to the log collection server is blocked.
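    A quick way to test this from the server is a TCP connectivity check against the collection server. The host and port below are placeholders; take the actual values from the hosts entry in your Agent configuration (checked further below). Requires netcat (nc) to be installed.
# Placeholders: use the collection server host and port from your filebeat.yml
nc -zv <collection-server-address> <port>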
  • Check if Filebeat is running
    To check if the Agent is properly running, enter the following command:
ps -ef | grep filebeat

If the process is not running, restart it manually.
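How to restart it depends on how the Agent was installed. On a typical Linux server where Filebeat is registered as a systemd service, which is an assumption here, the following commands restart it and show its status:
# Assumes Filebeat is registered as a systemd service; adjust the service name if your installation differs
sudo systemctl restart filebeat
sudo systemctl status filebeat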

  • Check the configuration file
    To check if the configuration file has errors, enter the following command:
cat /etc/filebeat/filebeat.yml

Check if the log collection path, output target, and format are set correctly.
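You can also have Filebeat itself validate the configuration and the connection to the configured output. Assuming the filebeat binary is on the PATH and the configuration file is at the path above, the following commands check both:
# Validate the configuration file syntax
filebeat test config -c /etc/filebeat/filebeat.yml
# Test the connection to the configured output (the log collection server)
filebeat test output -c /etc/filebeat/filebeat.yml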

Q. While installing the Agent on Windows Server 2019, the "missing configuration file" error occurs. How can I resolve this?
A. This error may occur if the Filebeat or Winlogbeat configuration file was not downloaded properly.

  • First, check whether the following configuration file exists.
  • Configuration file path
C:\Program Files\cla_windows\winlogbeat-<version>-windows-x86_64\winlogbeat.yml

The configuration file is downloaded automatically during installation. If the download fails, log transfer will not work properly.

  • Solutions
  1. Stop the services and delete the existing installation files.
    • Run the following commands in PowerShell:
Stop-Service filebeat
Stop-Service winlogbeat
cd "C:\Program Files"
del cla_windows
  2. Rerun the installation script.
    • Rerun the Agent installation script, and then check that the following two configuration files are properly created:
filebeat.yml
winlogbeat.yml
  3. Check the configuration file content.
    • The configuration file must include the following collection server address (hosts):
hosts: ["collect.vcla.gov-ntruss.com:5043"]

※ If the configuration file is empty or does not contain the above content, log collection may fail.

  4. Restart the services.
    • When the settings are complete, restart the services with the following commands:
Start-Service winlogbeat
Start-Service filebeat
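After restarting, the following PowerShell commands are one way to confirm that the configuration file exists, that it contains the collection server address, and that both services are running. The path assumes the default installation directory shown above; replace <version> with the installed version.
# Check that the configuration file exists and contains the collection server address
Test-Path "C:\Program Files\cla_windows\winlogbeat-<version>-windows-x86_64\winlogbeat.yml"
Select-String -Path "C:\Program Files\cla_windows\winlogbeat-<version>-windows-x86_64\winlogbeat.yml" -Pattern "collect.vcla"
# Confirm both Agent services are running
Get-Service winlogbeat, filebeat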

If needed, providing the error messages or log file paths to the Customer Center will help resolve your issue more quickly.