Aniesha Alford, Joshua Adams, Joseph Shelton, Gerry Dozier, Kelvin Bryant and John Kelly
Abstract
Purpose
The aim of this paper is to explore the value preference space associated with the optimization and generalization performance of GEFeWSML.
Design/methodology/approach
In this paper, the authors modified the evaluation function utilized by GEFeWSML such that the weights assigned to each objective (i.e. error reduction and feature reduction) were varied. For each set of weights, GEFeWSML was used to evolve feature masks (FMs) for the face, periocular, and face + periocular templates. The best performing FMs on the training set (FMtss) and the best performing FMs on the validation set (FM*s) were then applied to the test set in order to evaluate how well they generalized to the unseen subjects.
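The weight sweep described above can be illustrated with a minimal sketch. This is not the authors' actual GEFeWSML evaluation function; the function name, the weight parameter `alpha`, and the linear weighted-sum form are assumptions made here for illustration only:

```python
def weighted_fitness(error_rate, n_selected, n_total, alpha):
    """Weighted two-objective score for a candidate feature mask (lower is better).

    alpha weights the error-reduction objective; (1 - alpha) weights the
    feature-reduction objective, expressed as the fraction of features kept.
    """
    if not 0.0 <= alpha <= 1.0:
        raise ValueError("alpha must lie in [0, 1]")
    return alpha * error_rate + (1 - alpha) * (n_selected / n_total)

# Sweeping alpha trades recognition error against feature count:
# alpha near 1 favors accuracy; alpha near 0 favors aggressive feature reduction.
```

Evolving FMs at each setting of `alpha` over a grid reproduces, in spirit, the varied-weights experiment described above.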
Findings
By varying the weights assigned to each of the objectives, the authors were able to suggest values that would result in the best optimization and generalization performances for facial, periocular, and face + periocular recognition. GEFeWSML using these suggested values outperformed the previously reported GEFeWSML results, using significantly fewer features while achieving statistically equivalent recognition accuracies.
Originality/value
In this paper, the authors investigate the relative weighting of each objective using a value preference structure and suggest the best weights to be used for each biometric modality tested.
Abstract
That ice‐creams prepared with dirty materials and under dirty conditions will themselves be dirty is a proposition which, to the merely ordinary mind, appears to be sufficiently obvious without the institution of a series of elaborate and highly “scientific” experiments to attempt to prove it. But, to the mind of the bacteriological medicine‐man, it is by microbic culture alone that anything that is dirty can be scientifically proved to be so. Not long ago, it having been observed that the itinerant vendor of ice‐creams was in the habit of rinsing his glasses, and, some say, of washing himself—although this is doubtful—in a pail of water attached to his barrow, samples of the liquor contained by such pails were duly obtained, and were solemnly submitted to a well‐known bacteriologist for bacteriological examination. After the interval necessary for the carrying out of the bacterial rites required, the eminent expert's report was published, and it may be admitted that after a cautious study of the same the conclusion seems justifiable that the pail waters were dirty, although it may well be doubted that an allegation to this effect, based on the report, would have stood the test of cross‐examination. It is true that our old and valued friend the Bacillus coli communis was reported as present, but his reputation as an awful example and as a producer of evil has been so much damaged that no one but a dangerous bacteriologist would think of hanging a dog—or even an ice‐cream vendor—on the evidence afforded by his presence. 
A further illustration of bacteriological trop de zèle is afforded by the recent prosecutions of some vendors of ice‐cream, whose commodities were reported to contain “millions of microbes,” including, of course, the inevitable and ubiquitous Bacillus coli very “communis.” To institute a prosecution under the Sale of Food and Drugs Act upon the evidence yielded by a bacteriological examination of ice‐cream is a proceeding which is foredoomed, and rightly foredoomed, to failure. The only conceivable ground upon which such a prosecution could be undertaken is the allegation that the “millions of microbes” make the ice‐cream injurious to health. Inasmuch as not one of these millions can be proved beyond the possibility of doubt to be injurious, in the present state of knowledge; and as millions of microbes exist in everything everywhere, the breakdown of such a case must be a foregone conclusion. Moreover, a glance at the Act will show that, under existing circumstances at any rate, samples cannot be submitted to public analysts for bacteriological examination—with which, in fact, the Act has nothing to do—even if such examinations yielded results upon which it would be possible to found action. In order to prevent the sale of foul and unwholesome or actual disease‐creating ice‐cream, the proper course is to control the premises where such articles are prepared; while, at the same time, the sale of such materials should also be checked by the methods employed under the Public Health Act in dealing with decomposed and polluted articles of food. In this, no doubt, the aid of the public analyst may sometimes be sought as one of the scientific advisers of the authority taking action, but not officially in his capacity as public analyst under the Adulteration Act.
And in those cases in which such advice is sought it may be hoped that it will be based, as indeed it can be based, upon something more practical, tangible and certain than the nebulous results of a bacteriological test.
Alice Shelton, Samuel Joseph Tromans, Sabyasachi Bhaumik and Reza Kiani
Abstract
Purpose
The purpose of this paper is to discuss the challenges of assessment and management of psychotic symptoms in a background of intellectual disability (ID) and treatment-resistant epilepsy caused by a genetic syndrome.
Design/methodology/approach
Ring chromosome 20 [r(20)] syndrome is characterised by the triad of severe refractory epilepsy, mild to severe ID and behavioural problems. This paper describes the presentation of r(20) syndrome in a young woman with moderate ID and treatment-resistant epilepsy, who experiences psychotic symptoms at times of improved seizure control.
Findings
There are several diagnostic possibilities for such a presentation, including psychotic symptoms due to adverse effects of anti-epileptic medications and forced normalisation (alternating psychosis).
Originality/value
This paper advocates judicious use of antipsychotic medication to manage psychotic symptoms, as well as involvement of both patient and close family members throughout all stages of care. It is essential to strike a balance between control of epileptic seizures and psychiatric symptoms, providing an optimal benefit to the patients’ quality of life by meeting their complex needs through a multidisciplinary and multi-agency team input.
Travis Edward Shelton, Dylan Joseph Stelzer, Carl R. Hartsfield, Gregory Richard Cobb, Ryan P. O'Hara and Christopher D. Tommila
Abstract
Purpose
For many applications, including space applications, the usability and performance of a component is dependent on the surface topology of the additively manufactured part. The purpose of this paper is to present an investigation into minimizing the residual surface roughness of direct metal laser sintering (DMLS) samples by manipulating the input process parameters.
Design/methodology/approach
First, the ability to manipulate surface roughness by modifying processing parameters was explored. Next, the surface topography was characterized to quantify roughness. Finally, microthruster nozzles were created both additively and conventionally for flow testing and comparison.
Findings
Surface roughness of DMLS samples was found to be highly dependent on the laser power and scan speed. Because unintended partially sintered particles were observed adhering to the surface, a localized laser fluence mechanism was explored. Experimental results show that surface roughness is influenced by the varied parameters but is not a completely fluence-driven process; therefore, a relationship between laser fluence and surface roughness can be incorporated but not completely assumed.
Originality/value
This paper serves as an aid in understanding the importance of surface roughness and the mechanisms associated with DMLS. Rather than exploring a more common global energy density, a localized laser fluence was initiated. Moreover, the methodology and conclusions can be used when optimizing parts via metal additive manufacturing.
Joseph Calandro and Robert Flynn
Abstract
Purpose
The purpose of this article is to introduce the “Financial Strategy Framework”.
Design/methodology/approach
During the course of the authors' ongoing financial management research, it was observed that leading academicians frequently integrate finance and strategy in interesting ways. Similarly, many top executives and firms tend to blend strategy and finance in highly productive ways. Nevertheless, the critical functions of finance and strategy are regularly practiced in isolation. This disconnect led the authors to design the framework presented here.
Findings
The key insight of the approach is that the efficient management of the interactions of strategy, resource allocation and performance measurement can generate more value over time than any of these disciplines taken in isolation. The approach presented is not merely the linkage of three critically important disciplines, but rather a way of capitalizing on the interactions of each within a comprehensive framework.
Practical implications
The practical implications of the approach presented are a strategy that clearly guides production, resources that are allocated to efficiently execute strategic initiatives and performance measurement that serves as a cohesive management and feedback system rather than simply a method of reward/punishment. The research implications could be significant in that they could lead to interdisciplinary case and statistical studies.
Originality/value
While this article builds on earlier theoretical and practical work, it is the first to outline the approach to financial strategy in the manner presented here.
Joseph A. Giordano and Lisa Victoravich
Abstract
Purpose
This paper aims to examine how introducing irrelevant information into a risk decision scenario leads to less skeptical internal auditor assessments.
Design/methodology/approach
The authors conducted an internet-based experiment with 157 internal auditors, manipulating information relevance. The experiment controlled for individual differences in trait skepticism, perceived information relevance and Chief Information Officer (CIO) warmth.
Findings
Internal auditors exhibit decreased skepticism when irrelevant information contradicts preconceived stereotypes of management, consistent with the dilution effect. When the CIO is described as gregarious, counter to common stereotypes, internal auditors assess risk as less severe compared to when the CIO is described as introverted or when no personality information is provided.
Originality/value
This paper provides insight as to when internal auditor judgment may be compromised.
Joseph Calandro and Scott Lane
Abstract
The property and casualty (P&C) insurance industry has historically focused on the underwriting or combined ratio as the primary measure of operating performance. Many dramatic changes have occurred in this industry and its operating environment over the past 30 years, which have reduced the importance of the underwriting ratio. An alternative performance measurement system, the insurance performance measure (IPM), is presented and illustrated. The IPM integrates all areas of P&C operating activity into a measure more comprehensive than the underwriting ratio.
Joseph Calandro and Scott Lane
Abstract
Purpose
The purpose of this paper is to introduce the concept of an Enterprise Risk Scorecard.
Design/methodology/approach
With the accelerating growth in global risk levels leading to an intense current demand for risk management solutions, an analysis was conducted on whether a scorecard framework could be applied to risk measurement. This analysis included a survey of Kaplan and Norton's voluminous and seminal writings on the Balanced Scorecard, in which, surprisingly, relatively little on the measurement of risk was found.
Findings
The findings suggest that a scorecard framework could be an effective risk measurement, management and communication tool. For both design and organizational reasons it is recommended that risk scorecards be separate from performance scorecards.
Research limitations/implications
Utilizing two scorecards – one for performance and a separate one for risk – could provide strategy‐focused organizations with a more comprehensive diagnostic control system. The research implications of this approach could be significant, as it essentially opens up a new field of research.
Originality/value
This is believed to be the first formal paper on risk and a scorecard framework. Previous work on integrating risk measurement frameworks is very different from the approach proposed here.
Joseph Calandro, Scott Lane and Ranganna Dasari
Abstract
Purpose
Risk management has grown increasingly popular in recent years due to the recognition that risk should be as actively managed as performance. A key objective of risk management is to evaluate performance in the context of the relative volatility in which business operations are undertaken. However, accomplishing this has generally proven difficult. This paper aims to present a practical approach for risk‐adjusting performance.
Design/methodology/approach
This paper presents a practical risk‐adjustment methodology that is based on a popular statistical measure. The utility of the approach is demonstrated in two practical examples: the first is an industry example and the second is an M&A example.
Findings
The results of the research suggest that the risk‐adjustment approach presented here could become an important part of both performance management and risk management programs.
Research implications/limitations
The approach detailed in this paper facilitates the practical risk‐adjustment of select performance measures and risk measures. As this is an introductory paper, further research could be conducted on the specifics of the risk‐adjustment process as well as the strategic context in which measures are risk‐adjusted.
Originality/value
This paper introduces a practical approach of risk‐adjusting performance that was inspired by a popular statistical measure, which is demonstrated in two practical examples.
Stephanie Anne Shelton and Shelly Melchior
Abstract
Purpose
This paper aims to examine how two White teachers, experienced and award-winning veteran educators, navigated issues of race, class and privilege in their instruction, and ways that their efforts and shortcomings shaped both teacher agency and classroom spaces.
Design/methodology/approach
This study’s methodology centers participants’ experiences and understandings over the course of two years of interviews, classroom observations and discussion groups. The study is conceptually informed by Sara Ahmed’s argument that social justice is often approached as something that education “can do,” which is problematic because it assumes that successful enactment is “intrinsic to the term.” Discussing and/or intending social justice replaces real change, and those leading the conversations believe that they have made meaningful differences. Instead, true shifts in thinking and action are “dependent on forms of institutional commitment […and] how it [diversity/social justice] gets taken up” (p. 241).
Findings
Using an in vivo coding approach – i.e. using direct quotations of participants’ words to name the new codes – the authors organized their findings into two discussions: “Damn – Every Time I’m with the Kids, I Just End Up Feeling Frozen”; and “Maybe I’m Just Not Giving These Kids a Fair Shake – Maybe I’m the Problem”.
Originality/value
The participants centered a participatory examination of intersectionality, rather than the previous teacher-mandated one. They “put into action” explorations of intersectionality that were predicated on students’ identities and experiences, thus making intersectionality a lived concept, rather than an intellectual one, and transforming students’ and their own engagement.