Linespider is a Web crawler that provides a wide range of search results for LINE services while complying with the Robots Exclusion Protocol.
About LINE's web data collection policy and robots.txt
1. Linespider is a Web crawler managed by LINE. The user agents are as follows:
⋅Mozilla/5.0 (compatible; Linespider/1.1; +https://lin.ee/4dwXkTH)
⋅Mozilla/5.0 AppleWebKit/537.36 (KHTML, like Gecko; compatible; Linespider/1.1; +https://lin.ee/4dwXkTH) Chrome/W.X.Y.Z Safari/537.36
In the second user agent string, "W.X.Y.Z" is a placeholder for the version number of the browser that Linespider uses for rendering.
2. Linespider adheres to the Robots Exclusion Protocol outlined in each site's "robots.txt" file. For more information about robots.txt, see the Robots Exclusion Protocol specification (RFC 9309).
By adding a "User-agent: Linespider" group to your robots.txt file, you can apply that group's data collection limitations specifically to Linespider.
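A minimal robots.txt group targeting Linespider might look like the following sketch; the paths are hypothetical examples, not recommendations:

```
# Hypothetical example: allow Linespider everywhere except /private/
User-agent: Linespider
Disallow: /private/

# All other crawlers follow this default group
User-agent: *
Disallow: /
```

Because Linespider matches the more specific "User-agent: Linespider" group, the broader "User-agent: *" rules do not apply to it.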
3. robots.txt is a standard that lets website owners place limitations on automated access to their sites and data. When a website has a robots.txt file installed, Linespider will crawl only the pages permitted by the rules set forth in that file.
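The crawl decision described above can be sketched with Python's standard urllib.robotparser module. The robots.txt content and URLs below are hypothetical examples, not LINE's actual policy:

```python
# Sketch: deciding whether Linespider may fetch a URL under a given
# robots.txt, using Python's standard urllib.robotparser.
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt: Linespider may crawl everything except
# /private/; all other crawlers are disallowed entirely.
ROBOTS_TXT = """\
User-agent: Linespider
Disallow: /private/

User-agent: *
Disallow: /
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# Linespider is allowed on public pages...
print(parser.can_fetch("Linespider", "https://example.com/page"))          # True
# ...but not under /private/...
print(parser.can_fetch("Linespider", "https://example.com/private/page"))  # False
# ...and other crawlers fall under the blanket "User-agent: *" disallow.
print(parser.can_fetch("OtherBot", "https://example.com/page"))            # False
```

A real crawler would fetch robots.txt from the site (for example with RobotFileParser's set_url and read methods) rather than parsing an inline string.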
4. Even if information is published on the internet, that does not mean the copyright holder or website owner has granted permission to publish, copy, or otherwise use information obtained by any retrieval method, including Web crawlers. Installing a robots.txt file makes clear which data collection limitations you impose on crawlers.
5. To protect information and user rights, LINE conforms to the robots.txt standard and follows its directives when crawling websites where the file is installed.