Search results
The purpose of this paper is to characterize library and information science (LIS) as a fragmenting discipline both historically and by applying Whitley’s (1984) theory about the…
Abstract
Purpose
The purpose of this paper is to characterize library and information science (LIS) as a fragmenting discipline, both historically and by applying Whitley’s (1984) theory about the organization of sciences and Fuchs’ (1993) theory about scientific change.
Design/methodology/approach
The study combines historical source analysis with conceptual and theoretical analysis to characterize LIS. An attempt is made to empirically validate the distinction between, on the one hand, LIS context, L&I services and information seeking as fragmented adhocracies and, on the other, information retrieval and scientific communication (scientometrics) as technologically integrated bureaucracies.
Findings
The origin of fragmentation in LIS can be traced to the 1960s and 1970s, when other disciplines contributed to solving the problems produced by the growth of scientific literature. Computer science and business established academic programs and started research relevant to the LIS community, focusing on information retrieval and bibliometrics. This has led to differing research interests between LIS and other disciplines concerning research topics and methods. LIS as a whole has been characterized as a fragmented adhocracy, but we distinguish between the research topics of LIS context, L&I services and information seeking as fragmented adhocracies and information retrieval and scientific communication (scientometrics) as technologically integrated bureaucracies.
Originality/value
The paper provides an elaborated historical perspective on the fragmentation of LIS under the pressure of other disciplines. It also characterizes LIS as a discipline in a fresh way by applying Whitley’s (1984) theory.
Rui M. Lima, Erik Teixeira Lopes, Derek Chaves Lopes, Bruno S. Gonçalves and Pedro G. Cunha
This work aims to integrate the concepts generated by a systematic literature review on patient flows in emergency departments (ED) to serve as a basis for developing a generic…
Abstract
Purpose
This work aims to integrate the concepts generated by a systematic literature review on patient flows in emergency departments (ED) to serve as a basis for developing a generic ED process model.
Design/methodology/approach
A systematic literature review was conducted following PRISMA guidelines, considering Lean Healthcare interventions describing ED patient flows. The initial search found 141 articles, 18 of which were included in the systematic analysis. The literature analysis served as the basis for developing a generic process model for EDs.
Findings
ED processes have been represented using different notations, such as value stream mapping and workflows. The main alternatives for starting events are arrival by ambulance or walk-in. The Manchester Triage System (MTS) was the most common triage protocol referred to in the literature. The most common end events are admission to a hospital, transfer to other facilities or admission to an ambulatory care system. The literature analysis allowed the development of a generic process model for emergency departments. Nevertheless, considering that several factors influence the process of an emergency department, such as pathologies, infrastructure, available teams and local regulations, modelling alternatives and challenges in each step of the process should be analysed according to the local context.
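As a rough illustration only (not the authors’ published BPMN model), the generic flow summarized above can be sketched as a small directed graph in Python; every step name below is an assumption inferred from this abstract.

# Illustrative sketch of the generic ED flow described in the abstract.
ED_PROCESS = {
    # start events
    "arrival_by_ambulance": ["triage"],
    "walk_in_arrival": ["registration"],
    "registration": ["triage"],
    # triage, e.g. following the Manchester Triage System
    "triage": ["assessment_and_treatment"],
    # end events reported in the reviewed literature
    "assessment_and_treatment": [
        "hospital_admission",
        "transfer_to_other_facility",
        "ambulatory_care_admission",
    ],
}

def end_events(process):
    """Return steps with no outgoing transitions, i.e. candidate end events."""
    targets = {step for successors in process.values() for step in successors}
    return sorted(step for step in targets if step not in process)

print(end_events(ED_PROCESS))
# -> ['ambulatory_care_admission', 'hospital_admission', 'transfer_to_other_facility']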
Originality/value
A generic business process model was developed in BPMN that practitioners and researchers can use to reduce the effort in the initial stages of design or improvement projects. Moreover, it is a first step toward the development of generalizable and replicable solutions for emergency departments.
Bruce Wallace, Lea Gozdzialski, Abdelhakim Qbaich, Azam Shafiul, Piotr Burek, Abby Hutchison, Taylor Teal, Rebecca Louw, Collin Kielty, Derek Robinson, Belaid Moa, Margaret-Anne Storey, Chris Gill and Dennis Hore
While there is increasing interest in implementing drug checking within overdose prevention, we must also consider how to scale up these responses so that they have significant…
Abstract
Purpose
While there is increasing interest in implementing drug checking within overdose prevention, we must also consider how to scale up these responses so that they have significant reach and impact for people navigating the unpredictable and increasingly complex drug supplies linked to overdose. The purpose of this paper is to present a distributed model of community drug checking that addresses multiple barriers to increasing the reach of drug checking as a response to the illicit drug overdose crisis.
Design/methodology/approach
A detailed description of the key components of a distributed model of community drug checking is provided. This includes an integrated software platform that links a multi-instrument, multi-site service design with online service options; a foundational database that provides storage and reporting functions; and a community of practice to facilitate engagement and capacity building.
Findings
The distributed model diminishes the need for technicians at multiple sites while still providing point-of-care results with local harm reduction engagement, as well as access to confirmatory testing online and through localized reporting. It also reduces the need for harm reduction workers to be trained in the technical components of drug checking (e.g. interpreting spectra). Moreover, its real-time reporting capability keeps communities informed about the crisis. Sites are additionally supported by a community of practice.
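As a hedged sketch only, the multi-site service design and reporting flow described above might be modelled roughly as follows in Python; every class, field and method name here is a hypothetical illustration, not the project’s actual platform, schema or API.

from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class SampleRecord:
    """A point-of-care result submitted from one drug checking site (hypothetical schema)."""
    site_id: str                       # one of many sites in the multi-site design
    instrument: str                    # multi-instrument design, e.g. "FTIR" or "Raman"
    expected_substance: str
    detected_components: list[str]
    sent_for_confirmatory_testing: bool = False
    collected_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

class FoundationalDatabase:
    """Central storage and reporting layer shared by all sites (hypothetical API)."""
    def __init__(self) -> None:
        self._records: list[SampleRecord] = []

    def submit(self, record: SampleRecord) -> None:
        """Store a result sent from a local site or an online service option."""
        self._records.append(record)

    def localized_report(self, site_id: str) -> dict[str, int]:
        """Count detected components for one site, supporting near-real-time community reporting."""
        counts: dict[str, int] = {}
        for record in self._records:
            if record.site_id == site_id:
                for component in record.detected_components:
                    counts[component] = counts.get(component, 0) + 1
        return counts

db = FoundationalDatabase()
db.submit(SampleRecord("site_a", "FTIR", "fentanyl", ["fentanyl", "caffeine"]))
print(db.localized_report("site_a"))  # -> {'fentanyl': 1, 'caffeine': 1}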
Originality/value
This paper presents innovations in drug checking technologies and service design that attempt to overcome current financial and technical barriers to scaling up services to a more equitable and impactful level and to effectively linking multiple urban and rural communities so that they can report concentration levels for the substances most linked to overdose.