Abstract
The behaviour of cracked finite elements is investigated. It is shown that spurious kinematic modes may emerge when softening-type constitutive laws are employed. These modes are not always suppressed by surrounding elements. This is exemplified for a double-notched concrete beam and for a Crack-Line-Wedge-Loaded Double-Cantilever-Beam (CLWL-DCB). The latter example has been analysed for a large variety of finite elements and integration schemes. To investigate the phenomenon in greater depth, an eigenvalue analysis has been carried out for some commonly used finite elements.
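The eigenvalue analysis mentioned above can be illustrated with a short, self-contained sketch. This is not the authors' code: it uses the textbook case of a four-node plane-stress element with one-point reduced integration, and the elastic constants are assumed for illustration. Counting the near-zero eigenvalues of the element stiffness matrix exposes zero-energy modes; any beyond the three rigid-body modes of a 2D element are spurious kinematic modes.

```python
# Hedged sketch: eigenvalue test for spurious kinematic modes in a
# four-node plane-stress element. A sound 2D element should show exactly
# 3 zero-energy modes (two rigid translations and one rotation); any
# extra near-zero eigenvalue signals a spurious (hourglass) mode.
import numpy as np

E, nu = 30e9, 0.2                        # assumed concrete-like elastic constants
D = E / (1 - nu**2) * np.array([[1, nu, 0],
                                [nu, 1, 0],
                                [0, 0, (1 - nu) / 2]])   # plane stress

# Unit-square element; node coordinates in the natural system
xi_n  = np.array([-1,  1, 1, -1])
eta_n = np.array([-1, -1, 1,  1])

def stiffness(gauss_points):
    """Assemble K = sum_g B^T D B detJ w_g for a unit-square Q4 element."""
    K = np.zeros((8, 8))
    for xi, eta, w in gauss_points:
        dN_dxi  = 0.25 * xi_n  * (1 + eta_n * eta)
        dN_deta = 0.25 * eta_n * (1 + xi_n * xi)
        # Jacobian of the map [-1,1]^2 -> unit square is diag(0.5, 0.5)
        dN_dx, dN_dy = 2 * dN_dxi, 2 * dN_deta
        B = np.zeros((3, 8))
        B[0, 0::2], B[1, 1::2] = dN_dx, dN_dy
        B[2, 0::2], B[2, 1::2] = dN_dy, dN_dx
        K += B.T @ D @ B * 0.25 * w      # detJ = 0.25
    return K

g = 1 / np.sqrt(3)
full    = [(s1, s2, 1.0) for s1 in (-g, g) for s2 in (-g, g)]
reduced = [(0.0, 0.0, 4.0)]

for name, gp in [("2x2 integration", full), ("1-point integration", reduced)]:
    eig = np.linalg.eigvalsh(stiffness(gp))
    n_zero = int(np.sum(eig < 1e-6 * eig.max()))
    print(f"{name}: {n_zero} zero-energy modes")
```

Under 2x2 integration the element shows exactly three zero-energy modes; under one-point integration it shows five, the extra two being the spurious hourglass modes that this kind of eigenvalue test is designed to expose.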
Max A.N. Hendriks and Jan G. Rots
Abstract
Purpose
The purpose of this paper is to review recent advances and current issues in the realm of sequentially linear analysis.
Design/methodology/approach
Sequentially linear analysis is an alternative to nonlinear finite element analysis of structures when bifurcation, snap-back or divergence problems arise. The incremental-iterative procedure, adopted in nonlinear finite element analysis, is replaced by a sequence of scaled linear finite element analyses with decreasing secant stiffness, corresponding to local damage increments. The focus is on reinforced concrete structures, where multiple cracks initiate and compete to survive.
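To make the procedure concrete, here is a hedged toy sketch of the saw-tooth idea, invented for illustration rather than taken from the authors' implementation: a bundle of parallel bars is loaded in tension, each cycle runs one linear analysis under a unit load, the load is scaled to the first bar that reaches its current strength, and that bar's secant stiffness and strength are then reduced by one saw-tooth before the next linear analysis.

```python
# Hedged sketch of the sequentially linear ("saw-tooth") procedure on a
# toy system: n parallel bars between two rigid plates under tension.
# The saw-tooth law and all numbers below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
n = 5
E  = np.full(n, 200e9)               # current secant stiffness per bar (Pa)
A  = 1e-4                            # cross-section (m^2); bar length L = 1 m
ft = rng.uniform(2.5e6, 3.5e6, n)    # current tensile strength per bar (Pa)
reduction = 0.5                      # saw-tooth: halve E and ft per damage event
n_teeth = np.zeros(n, dtype=int)     # damage increments applied so far
max_teeth = 6                        # after this, the bar is fully softened

history = []
for step in range(40):
    active = n_teeth < max_teeth
    if not active.any():
        break
    k = E * A                        # bar stiffnesses for L = 1
    # Linear analysis under a unit load: rigid plates give equal elongation
    # u = P / sum(k), and bar stress = E * u (strain = u / L).
    u_unit = 1.0 / k[active].sum()
    stress_unit = E * u_unit
    # Critical event: the active bar that reaches its strength at the lowest load.
    ratio = np.where(active, ft / stress_unit, np.inf)
    crit = int(np.argmin(ratio))
    load = ratio[crit]               # scale the unit load to this event
    history.append((step, crit, load))
    # Apply one local damage increment (one saw-tooth) to the critical bar.
    E[crit] *= reduction
    ft[crit] *= reduction
    n_teeth[crit] += 1

for step, bar, load in history:
    print(f"event {step:2d}: bar {bar} damaged at load {load / 1e3:8.2f} kN")
```

Because every event is a scaled linear solve, the recorded event loads may decrease between cycles; this is how the method traces snap-back behaviour without an iterative solver.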
Findings
Compared to nonlinear smeared crack models in incremental-iterative settings, the sequentially linear model is shown to be robust and effective in predicting localizations, crack spacing and crack width, as well as brittle shear behavior. To date, sequentially linear analysis lacks a proper crack-closing algorithm. In addition, and of utmost importance for many practical applications, the algorithm requires improvement to deal with non-proportional loadings.
Originality/value
This article gives an up‐to‐date research overview on the applicability of sequentially linear analysis. For the issue of non‐proportional loading, it indicates solution directions.
Abstract
Gives introductory remarks about chapter 1 of this group of 31 papers, from the ISEF 1999 Proceedings, on methodologies for field analysis in the electromagnetic community. Observes that implementing theory in computer packages contributes to its clarification. Discusses the areas covered by some of the papers, such as artificial intelligence using fuzzy logic. Includes applications such as permanent magnets and looks at eddy-current problems. States that the finite element method is currently the most popular method used for field computation. Closes by pointing out the amalgam of topics covered.
Abstract
Examines recent studies of maximum expected disc life and the implications for archive storage on CD-ROM. Discusses the manufacture and structure of compact discs, the phenomenon of disc rot, how discs can be ruined in use, and some actions which can reduce the risk of damage to compact discs. Concludes that when CD-ROM becomes a widespread archival medium, discs will have to be treated as carefully as other media, although the problem of disc rot is beyond the purchaser's control.
Abstract
Man has been seeking an ideal existence for a very long time. In this existence, justice, love, and peace are no longer words, but actual experiences. However, with the American preemptive invasion and occupation of Afghanistan and Iraq and the subsequent prisoner abuse, such an existence seems to be farther and farther away from reality. The purpose of this work is to stop this dangerous trend by promoting justice, love, and peace through a change of the paradigm that is inconsistent with justice, love, and peace. The strong paradigm that created a strong nation like the U.S. and a strong man like George W. Bush has been the culprit, rather than the contributor to, the above three universal ideals. Thus, rather than justice, love, and peace, the strong paradigm resulted in injustice, hatred, and violence. In order to remove these three and related evils, what the world needs at the beginning of the third millennium is the weak paradigm. Through the acceptance of the latter paradigm, the golden mean or middle paradigm can be formulated, which is a synergy of the weak and the strong paradigms. In order to understand properly the meaning of these paradigms, however, some digression appears necessary.
Lieven Vandevelde, Johan J.C. Gyselinck, Francis Bokose and Jan A.A. Melkebeek
Abstract
Vibrations and acoustic noise are some of the fundamental problems in the design and exploitation of switched reluctance motors (SRMs). Adequate experimental and analysis methods may help to resolve these problems. This paper presents a theoretical analysis of the magnetic force distribution in SRMs and a procedure for calculating the magnetic forces and the resulting vibrations, based on the 2D finite element method. Magnetic field and force computations and a structural analysis of the stator have been carried out in order to compute the frequency spectrum of the generalized forces and displacements of the most relevant vibration modes. It is shown that for these vibration modes, the frequency spectrum can be predicted analytically. The theoretical and the numerical analyses have been applied to a 6/4 SRM and an experimental validation is presented.
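To illustrate the kind of computation described, the sketch below (invented waveform, speed and amplitudes, not the paper's procedure or motor parameters) decomposes a toy radial force wave on the stator bore into spatial harmonics, which act as generalized forces on the low-order ovalization modes, and then inspects their time spectrum.

```python
# Hedged sketch, not the paper's code: spatial-harmonic decomposition of a
# toy radial force wave on the stator bore of a 6/4 SRM, and the time
# spectrum of the resulting generalized modal forces.
import numpy as np

n_theta, n_t = 360, 1024
theta = np.linspace(0, 2 * np.pi, n_theta, endpoint=False)
f_rot = 50.0                                   # rotor speed (rev/s), assumed
t = np.arange(n_t) * 4 / (n_t * f_rot)         # four mechanical revolutions

# Radial force concentrated under the two diametrically opposite poles of
# one excited phase, gated at 4 strokes per revolution (4 rotor poles).
pole_pattern = np.cos(theta) ** 8
gate = (np.sin(2 * np.pi * 4 * f_rot * t) > 0).astype(float)
force = np.outer(gate, pole_pattern)           # shape (n_t, n_theta)

# Generalized force for spatial mode m (cos(m*theta) mode shapes):
# F_m(t) = integral of f(theta, t) * cos(m*theta) dtheta
for m in (0, 2, 4):                            # breathing and ovalization modes
    F_m = force @ np.cos(m * theta) * (2 * np.pi / n_theta)
    spec = np.abs(np.fft.rfft(F_m)) / n_t
    freqs = np.fft.rfftfreq(n_t, d=t[1] - t[0])
    peak = freqs[1:][np.argmax(spec[1:])]      # dominant non-DC line
    print(f"mode m={m}: dominant force line near {peak:6.1f} Hz")
```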
Abstract
Purpose
The purpose of this paper is to review the historic development of the requirements for sub-floor (also known as “basementless space” or “crawl space”) moisture management in the USA, UK and New Zealand (NZ) from the 1600s to 1969.
Design/methodology/approach
The review of 171 documents, including legislation, research papers, books and magazines, identified three time periods in which the focus differed: up to 1849, the removal of impure air; 1850–1929, the use of ground cover and thorough ventilation; and 1930–1969, the development of standards.
Findings
Published moisture management guidance has been found from 1683, but until the 1920s it was based on the provision of “adequate” ventilation and, in the UK, the use of impermeable ground cover. Specific ventilation area calculations have been available from 1898 in the UK, 1922 in the USA and 1924 in NZ. These are based on the area of ventilation per unit floor area, the area of ventilation per unit length of perimeter wall, or a combination of both. However, it was not until 1937 in the USA, 1944 in NZ and after the period covered by this paper in the UK, that numerical values were enforced in codes. Vent requirements started at 1 square inch of vent per square foot of floor area (0.7 per cent, but first published in the USA with a misplaced decimal point as 7 per cent). The average vent area was 0.69 per cent in the USA for 19 cases, 0.54 per cent in NZ for 7 cases and 0.13 per cent in the UK for 3 cases. The lower UK vent area requirements were probably due to the use of ground covers such as asphalt or concrete from 1854, compared with 1908 in NZ and 1947 in the USA. The use of roll ground cover (e.g. plastic film) was first promoted in 1949 in the USA and in 1960 in NZ.
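The quoted vent ratio is easy to verify; the snippet below reproduces the 0.7 per cent figure and shows how a shifted decimal point yields the erroneous 7 per cent.

```python
# Arithmetic behind the vent-ratio figures quoted above: 1 square inch of
# vent per square foot of floor area, expressed as a percentage.
vent_in2 = 1
floor_in2 = 12 * 12                  # 1 square foot = 144 square inches
ratio = vent_in2 / floor_in2
print(f"{ratio:.2%}")                # 0.69% (rounds to the quoted 0.7%)
print(f"{ratio * 10:.0%}")           # 7% -- the misplaced-decimal value
                                     # in the first US publication
```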
Practical implications
Common themes found in the evolution of sub-floor moisture management include a lack of documented research until the 1940s, a lack of climate or site-based requirements and different paths to code requirements in the three countries. Unlike many building code requirements, a lack of sub-floor moisture management seldom leads to catastrophic failure and consequent political pressure for immediate change. From the first published use of performance-based “adequate” ventilation to the first numerical or “deemed to satisfy” solutions, it took 240 years. The lessons from this process may provide guidance on improving modern building codes.
Originality/value
This is the first time such an evaluation has been undertaken for the three countries.
Abstract
Gives an in-depth view of the strategies pursued by the world’s leading chief executive officers, in an attempt to provide guidance to the new chief executives of today. Considers the marketing strategies employed, together with the organizational structures used, and looks at the universal concepts that can be applied to any product. Uses anecdotal evidence to formulate a number of theories which can be used to compare your company with the best in the world. Presents initial survival strategies and then looks at ways companies can broaden their boundaries through manipulation and choice. Covers a huge variety of case studies and examples, together with a substantial question-and-answer section.
Abstract
Purpose
This chapter reviews factors responsible for climate change, the impacts of the change on animal health and zoonotic diseases, and their linkage with the One-Health program.
Design/methodology/approach
This chapter is based on the available literature relating climate change to animal health and production from different perspectives. The causes and forcers of climate change, the direct and indirect effects of the change on animal health management, host–pathogen–vector interactions, and zoonotic diseases are included. The inter-linkage between climate change and the One-Health program is also assessed.
Findings
Besides natural causes of climatic change, greenhouse gases are increasing due to human activities, causing global climate changes which have direct and indirect impacts on animal health and production performance. The direct impacts are increased ambient temperature, floods, and droughts, while the indirect impacts are reduced availability of water and food. These changes also promote the spread of diseases and increase the survival and availability of pathogens and their intermediate vector hosts, which are responsible for the distribution and prevalence of numerous zoonotic, infectious, and vector-borne diseases. The adverse effects on biodiversity, the distribution of animals and microflora, and the genetic makeup of microbes, which may lead to emerging and re-emerging diseases and their outbreaks, make a strong linkage between climate change and One-Health.
Practical implications
Global climate change is receiving increasing international attention, and international organizations are increasingly focusing on tackling its health impacts. Thus, there is a need for parallel mitigation of climate change and animal diseases on a global scale.
Originality/value
Most research on climate change is limited to environmental protection; this chapter, however, provides a nexus between climate change, animal health, livestock production, and the One-Health program for better livelihoods.
Abstract
Application of the numerical method to the art of Medicine was regarded not as a “trivial ingenuity” but as “an important stage in its development”; thus proclaimed Professor Bradford Hill, accepted as the father of medical statistics, a study still largely unintelligible to the mass of medical practitioners. The need for statistics lies in the elucidation of the effects of multiple causes; this represents the essence of the statistical method and is most commendable. Conclusions reached empirically have their mistakes and fallacies exposed under statistical scrutiny. Numerical methods of analysis, the mathematical approach, reveal data relating to factors in an investigation which might be missed in empirical observation, and by means of a figure state their significance in the whole. A simplified example is the numerical analysis of food poisoning, which alone determines the commonest causative organisms, the commonest food vehicles and the organisms which affect different foods, as well as changes in the pattern, e.g. the rising incidence of S. agona and the increase of turkey (and the occasions on which it is served, such as Christmas parties) as a food-poisoning vehicle. These data enable preventive measures to be taken. The ever-widening fields of Medicine literally teem with such situations, where complexities are unravelled and the true significance of the many factors is established. Almost every sphere of human activity can be similarly measured. Apart from errors of sampling, problems seem fewer and controversy less with technical methods of analysis than with the presentation and interpretation of figures, or, as Bradford Hill states, “on the application of common sense and on elementary rules of logic”.
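As a hedged illustration of the tabulation the passage describes (the outbreak records below are invented purely to show the method), ranking causative organisms, food vehicles, and organism-by-food pairs takes only a few lines:

```python
# Ranking causative organisms and food vehicles from outbreak records,
# in the spirit of the food-poisoning example above. Records are invented.
from collections import Counter

outbreaks = [
    ("S. agona", "turkey"), ("S. typhimurium", "poultry"),
    ("S. agona", "turkey"), ("C. perfringens", "reheated meat"),
    ("S. agona", "pork"),   ("S. typhimurium", "turkey"),
]

organisms = Counter(org for org, _ in outbreaks)
vehicles  = Counter(food for _, food in outbreaks)
pairs     = Counter(outbreaks)

print("commonest organisms:", organisms.most_common(2))
print("commonest vehicles: ", vehicles.most_common(2))
print("organism-by-food:   ", pairs.most_common(3))
```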