Corey Seemiller and Rich Whitney
Abstract
This study used the Delphi technique with 31 seasoned leadership educators who, over the course of two rounds, were tasked to categorize the level of complexity of 60 leadership competencies. What resulted was a five-tier taxonomy based on the level of complexity of each of the 60 competencies assessed. The taxonomy also includes four categorical clusters of similar competencies and three domain levels of instructional design. The Delphi technique, the results of the study, the taxonomy model and methods for employing the model are described.
Abstract
Purpose
To analyze the text complexity of Chinese and foreign academic English writing, an artificial neural network (ANN) under deep learning (DL) is applied to the study of text complexity.
Design/methodology/approach
First, the research status and existing problems of text complexity research are reviewed in the context of DL. Second, the text complexity of Chinese and foreign academic English writing is analyzed on the basis of the Back Propagation Neural Network (BPNN) algorithm, and a BPNN syntactic complexity evaluation system is established. Third, MATLAB 2013b is used for simulation analysis of the model; the proposed BPNN model is compared with other classical algorithms, and the weight of each index and the training performance of the model are further analyzed by statistical methods. Finally, the L2 Syntactic Complexity Analyzer (L2SCA) is used to calculate the syntactic complexity of the two corpora, and the Mann–Whitney U test is used to compare the syntactic complexity of Chinese English learners and native English speakers.
Findings
The experimental results show that, compared with a shallow neural network, the deep neural network has more hidden layers, richer features and better feature-extraction performance. The BPNN algorithm performs well during training, with actual output values very close to the expected values. Analysis of the test-sample error shows that the evaluation error of the BPNN algorithm is less than 1.8%, indicating high accuracy. However, there are significant differences in syntactic complexity among students of different English writing proficiency. Some measures cannot effectively reflect the types and characteristics of written language, or may be negatively related to writing quality. The research also finds that measures of syntactic complexity are more sensitive to the language ability of the writing.
Originality/value
The BPNN algorithm can effectively analyze the text complexity of academic English writing. The results provide a reference for improving the evaluation system for the text complexity of academic paper writing.
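The Mann–Whitney U comparison the abstract describes can be sketched in a few lines of Python. The syntactic-complexity scores below (mean length of T-unit) are hypothetical placeholders, not the study's data, and a real analysis would normally call a library routine such as `scipy.stats.mannwhitneyu`; this stdlib-only version uses the normal approximation and omits the tie correction:

```python
from statistics import NormalDist

def mann_whitney_u(a, b):
    """Two-sided Mann-Whitney U test via the normal approximation.

    Returns (U, p). Average ranks are used for ties, but the tie
    correction to the variance is omitted -- a sketch, not a
    replacement for a statistics library."""
    n1, n2 = len(a), len(b)
    combined = sorted([(v, 0) for v in a] + [(v, 1) for v in b])
    values = [v for v, _ in combined]
    ranks = [0.0] * (n1 + n2)
    i = 0
    while i < len(values):                      # assign average ranks to ties
        j = i
        while j + 1 < len(values) and values[j + 1] == values[i]:
            j += 1
        for k in range(i, j + 1):
            ranks[k] = (i + j) / 2 + 1
        i = j + 1
    r1 = sum(r for r, (_, g) in zip(ranks, combined) if g == 0)
    u1 = r1 - n1 * (n1 + 1) / 2
    u = min(u1, n1 * n2 - u1)
    mu = n1 * n2 / 2
    sigma = (n1 * n2 * (n1 + n2 + 1) / 12) ** 0.5
    p = 2 * NormalDist().cdf((u - mu) / sigma)  # u <= mu, so z <= 0
    return u, min(p, 1.0)

# Hypothetical mean-length-of-T-unit scores (illustrative only).
learners = [12.1, 13.4, 11.8, 12.9, 13.0]   # Chinese English learners
natives  = [14.2, 15.1, 13.8, 14.9, 15.5]   # native English speakers
u, p = mann_whitney_u(learners, natives)
```

With fully separated groups like these, U is 0 and the approximate p-value falls below conventional significance thresholds, mirroring the kind of learner-versus-native contrast the study reports.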
John H. Humphreys, Milorad M. Novicevic, Stephanie S. Pane Haden and Md. Kamrul Hasan
Abstract
Purpose
Uhl-Bien and Arena (2018) presented a persuasive argument for recognizing the concept of enabling leadership as a critical form of leadership for adaptive organizations. This study aims to narratively explore the concept of enabling leadership in the context of social complexity.
Design/methodology/approach
To explore how leaders enable adaptive processes, Uhl-Bien and Arena (2018) called for future research using in-depth case studies of social actors centered on emergence in complex environments. In this in-depth case study, the authors pursue theory elaboration by using a form of analytically structured history process to analyze primary and secondary sources.
Findings
During archival research of Whitney Young, Jr’s largely overlooked and misunderstood leadership in the historic social drama of the 1960s US civil rights movement, the authors discovered compelling evidence to support and extend the theoretical arguments advanced by Uhl-Bien and Arena (2018).
Research limitations/implications
The reflexivity associated with interpretive case approaches confronts the issue of subjectivism. The authors ask readers to judge the credibility of their arguments accordingly.
Originality/value
Using a relational leadership-as-practice lens, the authors interpret the dramaturgical performance Whitney Young, Jr directed to facilitate coherent emancipatory dialogue, affect the social construction of power relations and enable the adaptive space needed for social transformation to emerge.
Abstract
Chester Whitney Wright (1879–1966) received his A.B. in 1901, A.M. in 1902 and Ph.D. in 1906, all from Harvard University. After teaching at Cornell University during 1906–1907, he taught at the University of Chicago from 1907 to 1944. Wright was the author of Economic History of the United States (1941, 1949); editor of Economic Problems of War and Its Aftermath (1942), to which he contributed a chapter on economic lessons from previous wars and to which John U. Nef (war and the early industrial revolution) and Frank H. Knight (the war and the crisis of individualism) contributed other chapters; and co-editor of Materials for the Study of Elementary Economics (1913). Wright's Wool-Growing and the Tariff received the David Ames Wells Prize for 1907–1908 and was volume 5 in the Harvard Economic Studies. I am indebted to Holly Flynn for assistance in preparing Wright's biography and in tracking down incomplete references; to Marianne Johnson for preparing many tables and charts; and to F. Taylor Ostrander, as usual, for help in transcribing and proofreading.
Hicham Meghouar and Mohammed Ibrahimi
Abstract
Purpose
The purpose of this research is to highlight the financial characteristics of large French targets that were subject to takeovers during the period 2001–2007 and thereafter to deduce the implicit motivations of acquirers.
Design/methodology/approach
Using a global sample of 128 French listed companies (64 targets and 64 non-targets), the authors carried out Wilcoxon–Mann–Whitney testing and logistic regression in order to test nine hypotheses likely to discriminate between the two categories of companies (targets and non-targets).
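The second step of the authors' design, a logistic regression discriminating targets from non-targets, can be illustrated with a minimal sketch. The feature values below are hypothetical standardized scores invented for illustration (the study's actual accounting variables are not reproduced here), and the hand-rolled gradient-descent fit stands in for whatever statistical package the authors used:

```python
import math

def train_logistic(X, y, lr=0.1, epochs=2000):
    """Logistic regression fitted by plain stochastic gradient descent."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            z = b + sum(wj * xj for wj, xj in zip(w, xi))
            p = 1.0 / (1.0 + math.exp(-z))
            err = p - yi                  # gradient of the log-loss w.r.t. z
            w = [wj - lr * err * xj for wj, xj in zip(w, xi)]
            b -= lr * err
    return w, b

def predict(w, b, xi):
    """Predicted takeover probability for one firm's feature vector."""
    z = b + sum(wj * xj for wj, xj in zip(w, xi))
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical standardized features: [unused debt capacity, liquidity].
# Per the findings, targets (y=1) show high unused debt capacity and
# low liquidity relative to the control group.
X = [(1.0, -0.8), (0.8, -1.2), (1.2, -0.5),    # takeover targets
     (-0.9, 0.7), (-1.1, 1.0), (-0.6, 0.9)]    # non-targets
y = [1, 1, 1, 0, 0, 0]
w, b = train_logistic(X, y)
```

A firm with high unused debt capacity and low liquidity then receives a high predicted takeover probability, which is how such a model supports the portfolio strategy mentioned under practical implications.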
Findings
According to the results, target firms are more unbalanced in terms of growth resources and less rich in liquidity than their peers. They have unused debt capacity, offer greater opportunities for growth than firms in the control group and present low levels of value creation.
Research limitations/implications
The main limitation of this study concerns the sample size, which is restricted by the exclusive use of large firms (deals of over $100m). The scope of this research could be broadened in future by including medium-sized companies.
Practical implications
The authors believe that their results have two major implications. First, they enable market investors to achieve abnormal returns by investing in predicted targets through a portfolio of firms with high takeover probability. Second, CEOs of potentially targeted companies can assess their takeover likelihood in order to act on and manage such a situation for the benefit of their shareholders.
Originality/value
This research concerns the last wave of takeover prior to the subprime-mortgage financial crisis (2001–2007), a period that has not been sufficiently covered in empirical studies. This research contributes to the existing literature in two main respects. First, the results of this study improve our understanding of motivations for takeovers, particularly in the French context. Second, the introduction of new accounting and financial variables, not previously tested in the literature, enriches the available information concerning the profile of takeover targets.
Dana M. Griggs and Mindy Crain-Dorough
Abstract
Purpose
The purposes of this paper are to provide a description of Appreciative Inquiry (AI) and to document and compare two applications of AI, one in program evaluation and another in an applied research study.
Design/methodology/approach
Focus groups, interviews and observations were used to gather rich qualitative data, which were used to detail Appreciative Inquiry's value in evaluation and research.
Findings
AI aided the researcher in connecting with the participants and valuing what they shared. In both studies, the use of AI amassed information that answered the research questions, provided a rich description of the context and findings, and led to data saturation. The authors describe and compare their experiences with the two applications of AI, explore the efficacy of using AI in qualitative research and recommend its use for multiple purposes.
Research limitations/implications
Limitations occurred in the AI Design stage because of the exclusively positive viewpoint and because both the program and the partnership studied were new, with limited data available for designing a better future. The authors therefore recommend revisiting both studies through the same 4D Model.
Practical implications
This manuscript shows that AI is useful for evaluation and research. It amplifies the participants' voices through favorite stories and successes. AI has many undiscovered uses.
Social implications
Through the use of AI the authors can: improve theoretical perspectives; conduct research that yields more authentic data; enable participants to deeply reflect on their practice and feel empowered; and ultimately impact and improve the world.
Originality/value
AI is presented as an evaluation tool for a high-school program and as a research approach identifying strengths and perceptions of an educational partnership. In both studies, AI crumbled the walls that are often erected by interviewees when expecting to justify or defend decisions and actions. This paper contributes further understanding of the use of AI in public education institutions.
Michael Babula, Max Tookey, Glenn Muschert and Mark Neal
Abstract
Purpose
The purpose of this paper is to answer the question: "Can particular types of altruism influence people to make unethical decisions?" Answering this question helps in understanding those cases in personal, public and commercial life in which a decision-maker is influenced by what is widely perceived to be a positive thing – altruism – to make unethical choices.
Design/methodology/approach
An experiment was designed to test the influence of different categories of altruism on decision-making about whether to find another guilty for a regulatory transgression. This involved the establishment and running of a student panel at a UK university, which was given the task of determining the guilt or otherwise of two students accused of plagiarism – one from a poor background; one from a rich background. Through a survey of both the decision-makers and their judgments, and by analyzing the data using t-tests and Mann–Whitney tests, the associations between different categories of altruism and the decisions made could be ascertained.
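The t-test side of the authors' analysis can be sketched with Welch's unequal-variance form, which is the usual choice when group variances may differ. The altruism-scale scores below are hypothetical values invented for illustration, not the study's data, and only the statistic and degrees of freedom are computed because the standard library lacks a t-distribution CDF:

```python
from statistics import mean, variance

def welch_t(a, b):
    """Welch's two-sample t statistic and Welch-Satterthwaite degrees of
    freedom. The p-value would come from the t distribution, which the
    Python standard library does not provide, so only (t, df) are
    returned in this sketch."""
    m1, m2 = mean(a), mean(b)
    v1, v2 = variance(a), variance(b)      # sample variances (n - 1 divisor)
    n1, n2 = len(a), len(b)
    se2 = v1 / n1 + v2 / n2                # squared standard error of m1 - m2
    t = (m1 - m2) / se2 ** 0.5
    df = se2 ** 2 / ((v1 / n1) ** 2 / (n1 - 1) + (v2 / n2) ** 2 / (n2 - 1))
    return t, df

# Hypothetical altruism scores for the two voting groups (illustrative).
not_guilty_voters = [4.2, 3.9, 4.5, 4.1, 4.4]
guilty_voters     = [3.1, 2.8, 3.3, 3.0, 2.9]
t, df = welch_t(not_guilty_voters, guilty_voters)
```

A large positive t with these separated groups would indicate higher altruism scores among those voting "not guilty", the kind of association between altruism category and judgment that the survey analysis tested.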
Findings
A total of 70.7% of the participants voted "not guilty" for the poor student, whereas 68.3% voted "guilty" for the wealthy student. This indicated that self-interested (egoistic) altruism, complemented by the gratification of social and self-esteem needs, was significantly associated with violating foundational ethical principles.
Originality/value
This is the first study to attempt to evaluate the relationships between different categories of altruism and ethical decision-making. The findings challenge the practice of aggregating all forms of empathy together when exploring the antecedents of unethical behavior.
Anne Elrod Whitney and Suresh Canagarajah
Abstract
Purpose
This essay-conversation brings together two literacy scholars who have worked with religious literacies: Suresh Canagarajah and Anne Elrod Whitney. They discuss not only the importance of religious literacies research but also their own experiences conducting such research as people of faith themselves.
Design/methodology/approach
The essay is derived from a live interview conversation between the authors, which was later edited and supplemented with short introductory and closing material.
Findings
Their conversation addresses religious literacies in disciplinary contexts, in teaching and in the careers of scholars.
Originality/value
This essay offers researchers and practitioners in literacy education a perspective from two scholars whose recent work has treated their own faith explicitly.
Dawn M. Russell and Anne M. Hoag
Abstract
Understanding people and how they factor into complex information technology (IT) implementations is critical to reversing the growing trend of costly IT implementation failures. Accordingly, this article presents an approach to dissecting the social and organizational influences affecting people's acceptance of technology designed to improve business performance. The article applies the diffusion of innovation theoretical framework to understand and analyze IT innovation implementation challenges. The diffusion approach is applied to two recent implementations of IT supply chain innovations at two aerospace firms, both with complex, global, inter-firm supply chains. Results indicate that several social and organizational factors affect an implementation's success, including users' perceptions of the innovation, the firm's culture, the types of communication channels used to diffuse knowledge of the innovation and various leadership factors.
Abstract
This study was launched because practitioners of Appreciative Inquiry (AI) raised awareness of the need for AI outcome research. The goal of this research was therefore to identify the salient AI processes and levers and the rates of AI success and failure. The study was specific to U.S. municipalities because a researcher had found a probability of AI failure therein. In direct opposition, eight U.S. municipalities were identified from the literature as having utilized AI in 14 projects, and all were successful even when resistance was present in three applications. A survey revealed 15 AI initiatives identified as successful even when resistance was present in eight, resulting in validation. The study utilized a mixed methods exploratory case study design, sequential in the mix, consisting of a literature review and the application of two unique instruments to three populations.