Book chapter

Capturing bias in structural equation modeling.

Pp. 3–43.

Equivalence studies are coming of age. Thirty years ago there were few conceptual models and statistical techniques to address sources of systematic measurement error in cross-cultural studies (for early examples, see Cleary & Hilton, 1968; Lord, 1977, 1980; Poortinga, 1971). This picture has changed; in recent decades, conceptual models and statistical techniques have been developed and refined, and many empirical examples have been published. There is a growing awareness of the field's importance for the advancement of cross-cultural theorizing. An increasing number of journals require authors submitting manuscripts of cross-cultural studies to present evidence supporting the equivalence of the study measures. Yet the burgeoning of the field has not led to a convergence in conceptualizations, methods, and analyses. For example, educational testing focuses on the analysis of items as sources of problems in cross-cultural comparisons, often using item response theory (e.g., Emenogu & Childs, 2005). In personality psychology, exploratory factor analysis is commonly applied as a tool to examine the similarity of factors underlying a questionnaire (e.g., McCrae, 2002). In survey research and marketing, structural equation modeling (SEM) is most frequently employed (e.g., Steenkamp & Baumgartner, 1998). From a theoretical perspective, these models are related; for example, the relationship between item response theory and confirmatory factor analysis (both derived from a general latent variable model) has been described by Brown (2006). From a practical perspective, however, the models can be seen as relatively independent paradigms; there are no recent studies in which various bias models are compared (an older study comparing procedures that are no longer in use was described by Shepard, Camilli, & Averill, 1981).
In addition to this diversity of mathematical developments, conceptual frameworks for dealing with cross-cultural studies have been developed in cross-cultural psychology, which, again, have a slightly different focus. It is fair to say that the field of equivalence is still expanding in both conceptual and statistical directions, and that a rapprochement of the approaches, and best practices broadly accepted across fields, are not just around the corner. The present chapter relates the conceptual framework of measurement problems developed in cross-cultural psychology (with input from various other sciences studying cultures and cultural differences) to statistical developments and current practices in SEM vis-à-vis multigroup testing. More specifically, I address the strengths and weaknesses of SEM from a conceptual bias and equivalence framework. There are few publications in which more conceptually based approaches to bias, mainly derived from substantive studies, are linked to more statistically based approaches such as those developed in SEM. This chapter adds to the literature by linking two research traditions that have worked largely independently in the past, despite the overlap in the bias issues they address. The chapter asks to what extent the study of equivalence, as implemented in SEM, can address all the relevant measurement issues of cross-cultural studies. The first part of the chapter describes a theoretical framework of bias and equivalence. The second part describes various procedures and examples to identify bias and address equivalence. The third part discusses the identification of all the distinguished bias types using SEM. The fourth part presents a SWOT analysis (strengths, weaknesses, opportunities, and threats) of SEM in dealing with bias sources in cross-cultural studies. Conclusions are drawn in the final part.
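The core idea behind the multigroup equivalence testing discussed above can be illustrated outside of full SEM machinery. The following minimal sketch (not from the chapter; all loadings and group labels are hypothetical) simulates a one-factor measurement model in two groups and recovers the item loadings from the covariance matrix: when the loadings match across groups, metric equivalence holds; when one item loads very differently in one group, that item is biased. A real analysis would fit a multigroup confirmatory factor model with equality constraints rather than use this eigen-decomposition shortcut.

```python
# Illustrative sketch (hypothetical numbers): a toy one-factor model in two
# groups, showing what comparable vs. biased item loadings look like.
import numpy as np

rng = np.random.default_rng(0)

def simulate_group(loadings, n=5000):
    """Item scores from a one-factor model: x_ij = loading_j * f_i + noise_ij."""
    f = rng.standard_normal(n)                        # latent factor scores
    noise = rng.standard_normal((n, len(loadings)))
    return f[:, None] * np.asarray(loadings) + noise

def estimate_loadings(x):
    """Recover loadings from the top eigenpair of the covariance matrix.

    Under the model cov = L L' + I, the leading eigenvector is L/||L||
    with eigenvalue ||L||^2 + 1, so rescaling recovers L.
    """
    vals, vecs = np.linalg.eigh(np.cov(x, rowvar=False))
    v = vecs[:, -1] * np.sqrt(max(vals[-1] - 1.0, 0.0))
    return v if v.sum() > 0 else -v                   # eigenvector sign is arbitrary

# Group B's third item is biased: it taps the construct much more weakly there.
group_a = simulate_group([0.8, 0.8, 0.8, 0.8])
group_b = simulate_group([0.8, 0.8, 0.2, 0.8])

la = estimate_loadings(group_a)
lb = estimate_loadings(group_b)
print("group A loadings:", np.round(la, 2))
print("group B loadings:", np.round(lb, 2))
```

In this toy setup, three of the four items show near-identical loadings across groups, while the third item's loading collapses in group B; in the chapter's terminology, comparing such patterns across groups is the starting point for detecting item-level bias.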

In book

Capturing bias in structural equation modeling.
Edited by: E. Davidov, P. Schmidt, J. Billiet et al. London: Routledge, 2018.