Tuan-Dat Trinh, Peter Wetz, Ba-Lam Do, Elmar Kiesling and A Min Tjoa
Abstract
Purpose
This paper aims to present a collaborative mashup platform for dynamic integration of heterogeneous data sources. The platform encourages sharing and connects data publishers, integrators, developers and end users.
Design/methodology/approach
The approach is based on a visual programming paradigm and follows three fundamental principles: openness, connectedness and reusability. The platform builds on Semantic Web technologies and the concept of linked widgets, i.e. semantic modules that allow users to access, integrate and visualize data in a creative and collaborative manner.
Findings
The platform effectively addresses data integration challenges: it allows users to explore relevant data sources for different contexts, tackles the data heterogeneity problem, facilitates automatic data integration, eases integration through simple operations and fosters reusability of data processing tasks.
Research limitations/implications
This research has focused exclusively on conceptual and technical aspects so far; a comprehensive user study and extensive performance and scalability testing are left for future work.
Originality/value
A key contribution of this paper is the concept of distributed mashups. These ad hoc data integration applications allow users to perform data processing tasks collaboratively and in a distributed manner, simultaneously on multiple devices. The approach does not require uploading data to a server infrastructure; rather, each user keeps control over their data and exposes only relevant subsets. Distributed mashups can run persistently in the background and are hence ideal for real-time data monitoring or data streaming use cases. Furthermore, we introduce automatic mashup composition as an innovative approach based on an explicit semantic widget model.
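To make composition over an explicit semantic widget model more concrete, the following Python sketch shows one plausible matching strategy: widgets declare the semantic concepts their inputs consume and their outputs produce, and a composer wires compatible ports automatically. All names here (Widget, compose, the concept URIs and the example widgets) are hypothetical illustrations, not the platform's actual API or vocabulary.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class Widget:
    """A hypothetical linked widget: its input and output ports are
    annotated with semantic concepts (e.g. vocabulary URIs)."""
    name: str
    inputs: List[str] = field(default_factory=list)   # concepts consumed
    outputs: List[str] = field(default_factory=list)  # concepts produced

def compose(widgets: List[Widget]) -> List[Tuple[str, str, str]]:
    """Automatically wire widgets by matching each output concept
    against the input concepts of the other widgets."""
    links = []
    for source in widgets:
        for target in widgets:
            if source is target:
                continue
            for concept in source.outputs:
                if concept in target.inputs:
                    links.append((source.name, concept, target.name))
    return links

# Hypothetical widgets: a data widget, a processing widget and a visualization widget.
station  = Widget("AirQualityStation",
                  outputs=["schema:GeoCoordinates", "ex:PM10Reading"])
filter_w = Widget("ThresholdFilter",
                  inputs=["ex:PM10Reading"], outputs=["ex:PM10Reading"])
map_view = Widget("MapVisualization",
                  inputs=["schema:GeoCoordinates"])

for link in compose([station, filter_w, map_view]):
    print(link)
# ('AirQualityStation', 'ex:PM10Reading', 'ThresholdFilter')
# ('AirQualityStation', 'schema:GeoCoordinates', 'MapVisualization')
```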
Abstract
Purpose
A previous paper by the present author described the pros and cons of using the three largest cited-reference-enhanced multidisciplinary databases, and discussed and illustrated in general how the theoretically sound idea of the h-index may become distorted depending on the software and content of the database(s) used, and on the searchers' skill and knowledge of the database features. The aim of this paper is to focus on Google Scholar (GS) from the perspective of calculating the h-index for individuals and journals.
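For context, the h-index itself is straightforward to compute: it is the largest h such that at least h of an author's (or journal's) publications have received at least h citations each. The Python sketch below (with invented citation counts, not data from the paper) illustrates the calculation and how a handful of stray or duplicate citation records of the kind discussed here can already shift the value.

```python
def h_index(citation_counts):
    """Return the largest h such that at least h publications
    have h or more citations each."""
    counts = sorted(citation_counts, reverse=True)
    h = 0
    for rank, citations in enumerate(counts, start=1):
        if citations >= rank:
            h = rank
        else:
            break
    return h

# Invented example: the same eight papers with verified counts versus
# counts inflated by duplicate, phantom or misattributed records.
verified = [24, 18, 12, 9, 7, 6, 3, 1]
inflated = [26, 20, 14, 11, 9, 8, 7, 3]

print(h_index(verified))  # 6
print(h_index(inflated))  # 7
```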
Design/methodology/approach
A desk‐based approach to data collection is used and critical commentary is added.
Findings
The paper shows that effective corroboration of the h‐index and its two component indicators can be done only on persons and journals with which a researcher is intimately familiar. Corroborative tests must be done in every database for important research.
Originality/value
The paper highlights the very time‐consuming process of corroborating data, tracing and counting valid citations and points out GS's unscholarly and irresponsible handling of data.
Ilse Valenzuela Matus, Jorge Lino Alves, Joaquim Góis, Paulo Vaz-Pires and Augusto Barata da Rocha
Abstract
Purpose
The purpose of this paper is to review cases of artificial reefs built through additive manufacturing (AM) technologies and analyse their ecological goals, fabrication process, materials, structural design features and implementation location to determine predominant parameters, environmental impacts, advantages, and limitations.
Design/methodology/approach
The review analysed 16 cases of artificial reefs from both temperate and tropical regions. These were categorised based on the AM process used, the mortar material used (crucial for biological applications), the structural design features and the location of implementation. These parameters are assessed to determine how effectively the designs meet the stipulated ecological goals, how AM technologies demonstrate their potential in comparison to conventional methods, and the preferred locations of these implementations.
Findings
The overview revealed that most artificial reef implementations are located in the Mediterranean and Atlantic Seas, each accounting for 24% of the cases studied. The remaining cases were in the Australian Sea (20%), the South Asia Sea (12%), the Persian Gulf and the Pacific Ocean (8% each) and the Indian Sea (4%). It was concluded that fused filament fabrication, binder jetting and material extrusion represent the main AM processes used to build artificial reefs. Cementitious materials, ceramics, polymers and geopolymer formulations were used, incorporating aggregates from mineral residues, biological wastes and pozzolanic materials, to reduce environmental impacts, promote the circular economy and benefit marine ecosystems. An evaluation ranking assessed how well each design and its materials align with the stated ecological goals: five cases were ranked as highly effective, ten as moderately effective and one as having low effectiveness.
Originality/value
AM represents an innovative method for marine restoration and management. It offers a rapid prototyping technique for design validation and enables the creation of highly complex shapes for habitat diversification, while incorporating a diverse range of materials that benefit the environment and marine species' habitats.