Chyan Yang, Hsien‐Jyh Liao and Chung‐Chen Chen
Abstract
Purpose
The purpose of this paper is to explain the Creative Commons license (CC license), a digital copyright license that clearly expresses the scope of copyright granted by the owner and therefore helps users, including crawlers and software robots, to comprehend the scope of authority and then collect digital content via the internet legally. However, both the complex format and the difficulty of embedding the digital codes in a binary file impede the spread of CC licenses. This paper seeks to propose a new protocol, CCFE, based on the CC license, to solve these problems.
Design/methodology/approach
Instead of embedding the CC licensing information in the body of a CC-licensed file, CCFE attaches the authentication information to the file extension. The syntax of CCFE and the procedure for verifying its validity are illustrated.
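A minimal sketch of the general idea follows, assuming a hypothetical naming scheme in which abbreviated CC license terms (e.g. "by-nc-sa") are appended to the ordinary file extension; the separator, abbreviations, and parsing logic below are illustrative assumptions, not the actual CCFE syntax defined in the paper.

    # Illustrative sketch only: a hypothetical scheme that appends CC license
    # terms to a file name so the grant travels with the file when it is
    # copied or transmitted. The ".cc-by-nc-sa" suffix and the parsing rules
    # are assumptions, not the CCFE syntax proposed in the paper.

    CC_TERMS = {"by", "nc", "nd", "sa"}

    def parse_cc_suffix(filename: str):
        """Return (base_name, license_terms) if the name carries a CC-style
        suffix such as 'photo.jpg.cc-by-nc-sa', otherwise (filename, None)."""
        parts = filename.rsplit(".", 1)
        if len(parts) == 2 and parts[1].startswith("cc-"):
            terms = parts[1][3:].split("-")
            if terms and all(t in CC_TERMS for t in terms):
                return parts[0], terms
        return filename, None

    if __name__ == "__main__":
        print(parse_cc_suffix("photo.jpg.cc-by-nc-sa"))  # ('photo.jpg', ['by', 'nc', 'sa'])
        print(parse_cc_suffix("report.pdf"))             # ('report.pdf', None)

Because such a suffix is part of the file name rather than the file body, it would survive duplication and transmission even for binary formats that cannot carry embedded license metadata.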
Findings
CCFE allows the authorization data to be embedded in the file name and consequently preserved through duplication and transmission, which greatly improves the portability of the authentication method. In addition, users can use general search engines, such as Google, to find CC-licensed documents.
Originality/value
The paper points out the disadvantages of the current CC license and explains a new protocol. Furthermore, it explains how this new paradigm can be used to construct an online digital library and how librarians can use software robots to collect digital content on the internet within copyright guidelines.
Abstract
Purpose
The Robots.txt file and Robots Meta tags constitute a set of instruments that can be used to instruct software robots. However, the current versions of Robots.txt and the Robots Meta tags are both too simple and too ambiguous for an internet world with many potential conflicts, especially in terms of copyright and trespass to chattels. This paper seeks to propose an amendment to Robots.txt and the Robots Meta tags to solve these problems.
Design/methodology/approach
Rather than relying on personal experience, this paper surveys several prominent court cases in an attempt to find general principles that can serve as guidelines for amending Robots.txt and the Robots Meta tags.
Findings
According to several court cases, Robots.txt and the Robots Meta tags can be used not only to simply allow or refuse software robots, but also to express the online copyright authorization policies of webmasters. Any robot that follows the given policies can prevent possible conflicts, and any robot that ignores them may be in breach of the law. To adapt to this new role, Robots.txt and the Robots Meta tags need some supplements and adaptations; as a result, webmasters can express their will more explicitly and avoid unnecessary disputes over both the scope of copyright authorization and trespass to chattels.
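As an illustration of what such a supplement might look like, the sketch below parses a hypothetical extended robots.txt in which a "License" field declares the webmaster's authorization policy alongside the usual Allow/Disallow rules; the field name and its values are assumptions made for illustration, not the amendment actually proposed in the paper.

    # Illustrative sketch only: parse a hypothetical robots.txt extended with a
    # "License" field stating the webmaster's copyright authorization policy.
    # The field name and its values are assumptions, not the paper's proposal.

    SAMPLE_ROBOTS_TXT = """\
    User-agent: *
    Disallow: /private/
    Allow: /articles/
    License: cc-by-nc  # hypothetical directive: reuse allowed, non-commercial only
    """

    def parse_extended_robots(text: str) -> dict:
        """Collect directive values per field name, ignoring blanks and comments."""
        policy = {}
        for line in text.splitlines():
            line = line.split("#", 1)[0].strip()   # drop trailing comments
            if not line or ":" not in line:
                continue
            field, value = (part.strip() for part in line.split(":", 1))
            policy.setdefault(field.lower(), []).append(value)
        return policy

    if __name__ == "__main__":
        rules = parse_extended_robots(SAMPLE_ROBOTS_TXT)
        # A compliant crawler would check the declared policy before reusing content.
        print(rules.get("license"))   # ['cc-by-nc']

A crawler that reads such a declaration before copying content would know not only which paths it may fetch, but also on what terms the fetched material may be reused.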
Originality/value
This paper reveals a new function of Robots.txt and the Robots Meta tags. Based on this new function, it points out the disadvantages of the current Robots.txt and Robots Meta tags and proposes a comprehensive amendment.