
Content analysis

Mean

Chi-square

Correlation

Regression and ANOVA

Statistical table

Graph

Abstract

Conclusion

Chapter 7: Conclusion

This study focused on a critical evaluation of the role and reliability of personality questionnaires in conducting various human resource activities, including employee recruitment and appraisal. The research also aimed to identify the pros and cons of screening applicants through a personality test. Primary and secondary resources were used in the study. For the primary data, the researcher opted to conduct a survey using randomly selected HR personnel as participants. A questionnaire, structured in Likert format, was used for data gathering. The answers of the respondents were then processed by computing their corresponding weighted means. The results of the computation were then used as the basis for the data analysis. Secondary resources derived from various publications, including books and journals, were integrated to support the findings.

Based on the results of the survey, personality questionnaires play an important role in the recruitment and appraisal of employees. The respondents agree that this HR tool is capable of identifying essential personal attributes of applicants, which promotes effective hiring and promotion. In addition, personality tests have other advantages. One of these is their ability to establish good relations among employees through the resolution or prevention of workplace conflicts. Personality tests also support the establishment of culture within the organization, which in turn helps enhance the performance and output level of the company. The integration of computer technology in administering personality tests has also made this tool a cost-effective means for recruiting new employees. The use of personality tests also enables companies to save valuable resources, as it reduces the rate of employee turnover.

Despite these benefits, personality questionnaires also have certain drawbacks. For instance, the validity and accuracy of the results obtained from these questionnaires are continuously questioned. Considering that applicants can easily fake their personality scores, the results would naturally be affected. Moreover, while this tool is relatively inexpensive, it still requires highly skilled and trained HR staff to ensure correct analysis and interpretation of results. The literature, however, notes that despite the training of personnel, misinterpretation is still very likely; hence, it is difficult to employ a strategy whose outcome is not guaranteed. Other critics also point out that the use of personality questionnaires is inappropriate, as it can exhibit discrimination and violate one's privacy.

In general, the personality questionnaire is not a foolproof tool for recruitment, appraisal, and other HR procedures. While there may be flaws, the literature notes that the downsides of personality tests can be addressed. For instance, their relation to legal and discrimination issues can be resolved by ensuring that the questionnaire has been validated. The company must also ensure that the questions in the test are all related to what the company really needs to find out from its applicants. Questions that infringe on a person's privacy or suggest discrimination should not be included (Frieswick, 2004). In the article written by Bates (2002), HR professionals also suggest that companies should not rely fully on personality tests alone when hiring or appraising employees. For instance, personality tests should be combined with cognitive tests to assess the intelligence of applicants. Personality tests should not be used as substitutes for tools that measure an individual's knowledge or capabilities. Hence, it is essential that HR professionals make use of various relevant predictors to improve hiring and promotion outcomes. In conclusion, all HR tools have their own pros and cons; HR staff should therefore be skilled enough to optimize their benefits and address their flaws.
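Since the analysis rests on weighted means of Likert responses, a minimal sketch of that computation may help; the scale values and response counts below are invented for illustration and are not the study's data:

```python
# Weighted mean of Likert responses: a hypothetical illustration,
# not data from the study described above.

# Scale values (1 = strongly disagree ... 5 = strongly agree) and
# hypothetical counts of respondents choosing each option.
scale = [1, 2, 3, 4, 5]
counts = [2, 5, 8, 20, 15]  # assumed numbers, for illustration only

total = sum(counts)
weighted_mean = sum(v * n for v, n in zip(scale, counts)) / total
print(f"Weighted mean: {weighted_mean:.2f} (n = {total})")
# A weighted mean near 4 would be read as overall agreement with the item.
```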

Recommendation

By: Engr. Mary Rose Florence S. Cobar, Doctor of Philosophy in Education
Thesis title: Development of a Source Material in Food Dehydration Craft Technology for the Secondary Schools

Recommendations

After a thorough analysis of data, the following recommendations are hereby made.

This research study suggests that education managers study diffusion theory for three reasons. First, education managers and instructional technologists do not know why most instructional innovations are or are not adopted. Some blame teachers and a resistance to change, while others blame bureaucracies and lack of funding. In the Philippine context, it is more a case of lack of funding and political interference, but by and large, schools are commonly viewed as resistant to change. By studying diffusion theory, education managers may be able to explain, predict, and account for factors that influence or impede the adoption and diffusion of innovations in teaching methods. Second, education managers may come to understand that the best way to present innovations for possible adoption of a method is through communication channels. Third, education managers may be able to develop a systematic model for innovative methods in teaching, not only in the basic courses but also in the Makabayan learning area, which is one of the study areas of this body of research; in simple terms: INNOVATIVENESS = RESOURCEFULNESS + ADAPTABILITY.

Given that food dehydration in some aspects is a technological innovation, it is useful to apply the tenets of diffusion theory to understand food dehydration's diffusion in the social system. Diffusion theory provides a framework that helps explain, predict, and account for the factors that increase or impede the adoption and diffusion of food dehydration as an innovation. Diffusion theory helps teachers in the education community identify the qualities, i.e., relative advantage, compatibility, trialability, and observability, that matter to potential adopters. The diffusion framework also provides a closer look at the communication channels used to spread the word about food dehydration, the time span, and the characteristics of the adopters.

To provide a compelling argument as to the reasons behind the actions of individuals as adopters of an innovation, this study recommends further research into the actuations of the adopters through the use of the actor-network theory (ANT) perspective. The diffusion theory approach deals more with the cause and effect of innovation, while actor-network theory traces the maneuvers, compromises, twists, and turns of a negotiation as it is translated during the process of adoption. The scope of an actor-network theory (ANT) analysis is to yield a broader understanding relative to the professional development of the teachers concerned or attributed to in this study. In context, diffusion theory posits that an innovation (food dehydration) ought to be adopted to be able to be diffused through a system (secondary education), while an actor-network theory approach will be primarily concerned with tracing the complex and contingent factors involved in the overall innovation process and their contributory influence on the education sector.

For the source material, the study recommends the inclusion of a section on setting up a small home-based enterprise of the family-size unit, together with information on its system operation and management. This entrepreneurial segment runs parallel to what the Department of Education and the government have implemented or plan to implement starting school year 2006-2007 in key pilot areas, that is, business management for students at the secondary school level to prepare them for life after graduation and beyond. In the food dehydration craft technology segment, the teachers apply the study of science and technology to that of business management and economics, which can be diffused to the students by their teachers as a learning paradigm to give them options after secondary school.

In the food dehydration craft technology area of this research study, the recommendations for the new design conceived are the following:
a) a built-in thermometer, hygrometer, and psychrometer should be installed to monitor the conditions inside the dehydrator;
b) an additional circuit system should be designed to control the voltage input to the heating element for a stable hot air supply (a control-loop sketch follows Table 16);
c) the material of construction should be stainless steel so as not to oxidize the food being dried, because the prototype unit made use of aluminum, which is not recommended for foods such as fruits with a high acid content;
d) the blower fans should be regulated as low, medium, and high for better regulation of the relative humidity inside the drying chamber;
e) once the prototype dehydrator has been built, experiments should be done on a variety of foods to test its efficacy in delivering the desired output.

From the design simulation, the following materials of construction are needed: Table 16. Table of Specifications
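Recommendation (b) above calls for a circuit to stabilize the hot air supply. As a software analogue only, here is a minimal on/off (thermostat-style) control-loop sketch; read_temperature() and set_heater() are hypothetical placeholders, not components of the study's design:

```python
# Bang-bang (on/off) temperature control sketch for a drying chamber.
# read_temperature() and set_heater() are hypothetical hardware hooks.
import random
import time

SETPOINT_C = 60.0   # assumed target drying temperature
HYSTERESIS_C = 2.0  # dead band to avoid rapid relay switching

def read_temperature() -> float:
    # Placeholder: a real build would read the dehydrator's thermometer.
    return 60.0 + random.uniform(-5.0, 5.0)

def set_heater(on: bool) -> None:
    # Placeholder: a real build would switch the heating element circuit.
    print("heater", "ON" if on else "OFF")

heater_on = False
for _ in range(10):  # a short demonstration loop
    t = read_temperature()
    if t < SETPOINT_C - HYSTERESIS_C:
        heater_on = True
    elif t > SETPOINT_C + HYSTERESIS_C:
        heater_on = False
    set_heater(heater_on)
    time.sleep(0.1)  # a real controller would poll on a longer interval
```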

System Design

A system involving a small-scale food dehydration enterprise requires minimal capital investment and technical and management skills. But given changes due to market trends, and to keep the business viable, managerial and technical skills are extremely important. In any field where income generation is of primary importance, management knowledge is a must, and that includes the teachers to whom this research study is attributed. In systems management, emphasis must be on integrating entrepreneurial technology, finance, and marketing strategies, instead of the transfer of technique only, along with the most ignored factor: a gut feel for the economic factors to be considered.

Title page

EXAMPLES OF A RESEARCH DESIGN

Descriptive Research

CALL IN THE YEAR 2000: STILL IN SEARCH OF RESEARCH PARADIGMS?
Carol Chapelle, Iowa State University

Abstract: Advancements in the design and use of computer-assisted language learning (CALL) activities require that key questions about CALL be identified and effective research methods be used to answer them. In this paper, I suggest looking to research on other types of second language (L2) classroom learning activities for guidance in framing CALL research questions and in discovering relevant research methods. I begin with examples from the CALL literature demonstrating the diverse perspectives (e.g., cognitive psychology, constructivism, and psycholinguistics) which have been suggested as ways of approaching CALL research. I then summarize the research questions and methods of L2 classroom research with emphasis on the "interactionist" approach and discourse analysis. Using three examples (computer-mediated communication, a microworld, and vocabulary in reading), I will illustrate how similar discourse analysis methods can address essential descriptive and evaluative questions about CALL activities. Finally, I will outline some implications of this perspective for design and investigation of CALL activities.

Causal Research

Use of causal models in marketing research: A review
John Hulland, Yiu Ho Chow and Shunyin Lam, Western Business School, The University of Western Ontario, London, Ont. N6A 3K7, Canada
Received 15 January 1995; accepted 5 January 1996; available online 26 February 1999.

Abstract: Use of causal models in marketing has grown significantly. By combining data and theory, these models provide researchers with more powerful opportunities to advance scientific knowledge. However, such advances can only be achieved if researchers make proper usage of causal modeling techniques. We review causal models published during 1980-1994. Many aspects of the models investigated meet traditional standards. However, several unsettling findings emerge. In particular, we were able to replicate the results reported for only one in five of the models studied. These shortcomings lead us to suggest a number of improvements for future causal modeling practice.

Exploratory Research

Exploratory research: citizen participation, local government and sustainable development in Australia
Michael Cuthill

Abstract: Exploratory research reported in this paper was undertaken in Adelaide, Australia during 1998/99. The purpose of the research is to explore local development practice as evidenced through the experiences and actions of local citizens, community based groups and local government (Neuman, 1994). Results from this first stage research suggest that sustainability initiatives in Australia might best be implemented through a collaborative approach at the local community level involving local citizens working in partnership with local government.

Correlational Research

Social Science, Rehabilitation and Counseling, Exercise & Occupational Therapy and Vocational Rehabilitation
Shawn M. Fitzgerald, Phillip D. Rumrill, Jr., Jason D. Schenker

Abstract: The article describes correlational research designs as a method for testing relationships between or among variables of interest in the lives of people with disabilities. The authors describe conceptual aspects of correlational research, discuss the methods by which researchers select variables for this type of inquiry, explain the primary purposes of correlational studies, and overview data analytic strategies. These discussions are illustrated with examples from the contemporary vocational rehabilitation literature.
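As a concrete illustration of the data analytic strategies such designs rely on, here is a minimal sketch of Pearson's r computed from first principles; the scores below are invented and are not from the article:

```python
# Pearson correlation coefficient from first principles, with toy data.
import math

x = [2.0, 4.0, 5.0, 7.0, 9.0]   # hypothetical predictor scores
y = [1.5, 3.0, 4.5, 6.5, 8.0]   # hypothetical outcome scores

n = len(x)
mx, my = sum(x) / n, sum(y) / n
cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
sx = math.sqrt(sum((a - mx) ** 2 for a in x))
sy = math.sqrt(sum((b - my) ** 2 for b in y))
r = cov / (sx * sy)
print(f"r = {r:.3f}")  # values near +1 or -1 indicate a strong linear relationship
```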

Comparative Research

Comparative Research Methodology: Cross-Cultural Studies
Richard W. Brislin

Abstract: Cross-cultural research can make contributions to theory development by identifying groups of people who seem not to behave according to established theories and by increasing the range of independent variables available for study in any one culture. A major methodological orientation to such studies, developed over the last 10 years, is the emic-etic distinction. An emic analysis documents valid principles that describe behavior in any one culture, taking into account what the people themselves value as meaningful and important. The goal of an etic analysis is to make generalizations across cultures that take into account all human behavior. Examples of these approaches are given from studies on ingroup-outgroup relations in Greece and the United States, and studies on the need for achievement and its relation to the need for affiliation. A specific method to document emic and etic principles is presented which involves the development of core items to measure concerns in all cultures under study, and culture-specific items which are designed to measure concerns in one culture that may not be appropriate for all cultures under study. The techniques of back-translation and decentering are related to the emic-etic approach, as are the techniques developed by Triandis which involve the development of research instruments within each culture and the use of factor analysis. The most general approach, applicable to all comparative studies, is the plausible rival hypothesis analysis, which forces the researcher to examine each and every potential explanation for any data set. The suggestion is made that the future of cross-cultural research will depend on its contribution to theory in general psychology, and methods (such as those presented here) will only be a means to the major goal of discovering important, central facts about human behavior.

Evaluative Research

Approaches to Evaluative Research: A Review
Francis G. Caro, Institute of Behavioral Science, University of Colorado

Abstract: Social science writings on approaches to the evaluation of action programs are numerous but highly scattered. The present paper is an attempt to identify the major themes in recent social science literature with direct implications for evaluation research. The treatment is focused on situations in which action programs are conducted by formal organizations, and evaluative researchers are directly linked to program administrators. The material can be organized according to four major categories: (1) basic issues in evaluative research; (2) methodology; (3) administration of evaluative research; and (4) implementation of research findings.

Experimental Research

A Meta-Analysis of Experimental Research on Teacher Questioning Behavior
Doris L. Redfield, Western Kentucky University; Elaine Waldman Rousseau, University of Arizona

Abstract: The meta-analytic technique was used to synthesize experimental research findings on the relationship between level of teacher questioning and student achievement. Twenty studies on teachers' use of higher and lower cognitive questions were reviewed. Higher cognitive questions require pupils to manipulate information to create and support a response; lower cognitive questions call for verbatim recall or recognition of factual information. Effect sizes were computed to investigate the impact of program monitoring, experimental validity, and level of teacher questioning. Results show that gains in achievement can be expected when higher cognitive questions assume a predominant role during classroom instruction.
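The effect sizes referred to above are typically standardized mean differences; a minimal sketch of Cohen's d and a simple unweighted mean across studies follows (the abstract does not specify the metric used, and every summary statistic below is invented):

```python
# Cohen's d per study, then an unweighted mean effect size.
# All summary statistics below are invented for illustration.
import math

def cohens_d(m_treat, m_ctrl, sd_treat, sd_ctrl, n_treat, n_ctrl):
    # Pooled standard deviation, then standardized mean difference.
    pooled = math.sqrt(((n_treat - 1) * sd_treat**2 +
                        (n_ctrl - 1) * sd_ctrl**2) /
                       (n_treat + n_ctrl - 2))
    return (m_treat - m_ctrl) / pooled

studies = [  # (mean_t, mean_c, sd_t, sd_c, n_t, n_c), hypothetical
    (78.0, 72.0, 10.0, 11.0, 30, 30),
    (65.0, 61.0, 9.0, 9.5, 25, 28),
    (82.0, 80.0, 12.0, 12.5, 40, 38),
]
ds = [cohens_d(*s) for s in studies]
print([round(d, 2) for d in ds], "mean d =", round(sum(ds) / len(ds), 2))
```

A production meta-analysis would weight each d by its inverse variance rather than averaging uniformly.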

Action Research

Improving Learning and Teaching Through Action Learning and Action Research
Ortrun Zuber-Skerritt

Abstract: The purpose of this paper is to present a theoretical framework for action learning and action research for better understanding and improving university learning and teaching. Action research is conceived as a philosophy, a theory of learning, a methodology and a technique. The philosophy includes theories of action, critical theory and personal construct theory. The learning theory encompasses adult learning, experiential learning and double-loop learning. The methodology is based in the dialectical epistemology and the non-positivist paradigm. Examples of action research as a technique are the nominal group technique, the repertory grid technique and other tools aiding reflection and group discussion. It is concluded that action research not only advances knowledge, but also improves practice in higher education by developing people as professionals and personal scientists, and organisations as learning organisations.

PROBABILITY

Random Sampling

The Prevalence of Elder Abuse: A Random Sample Survey
Karl Pillemer, PhD and David Finkelhor, PhD

Abstract: In this first large-scale random sample survey of elder abuse and neglect, interviews were conducted with 2,020 community-dwelling elderly persons in the Boston metropolitan area regarding their experience of physical violence, verbal aggression, and neglect. The prevalence rate of overall maltreatment was 32 elderly persons per 1,000. Spouses were found to be the most likely abusers and roughly equal numbers of men and women were victims, although women suffered more serious abuse. Implications for public policy are discussed.
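As a rough illustration of what a random-sample prevalence estimate implies, the figures quoted in the abstract (32 per 1,000 among 2,020 respondents) can be turned into a confidence interval under simple random sampling assumptions; the study's own interval may differ with its design:

```python
# Prevalence estimate and 95% CI under simple random sampling.
# Figures taken from the abstract above: 32 per 1,000 among n = 2,020.
import math

n = 2020
p = 32 / 1000            # prevalence as a proportion
se = math.sqrt(p * (1 - p) / n)
lo, hi = p - 1.96 * se, p + 1.96 * se
print(f"prevalence = {p:.3f}, 95% CI = ({lo:.3f}, {hi:.3f})")
# Roughly 24 to 40 maltreated elders per 1,000 under SRS assumptions.
```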

Stratified Sampling

Cranial capacity related to sex, rank, and race in a stratified random sample of 6,325 U.S. military personnel
J. Philippe Rushton, University of Western Ontario, Canada
Available online 12 August 2002.

Abstract: The issue of whether human populations differ in brain size remains controversial. Cranial capacities were calculated from external head measurements reported for a stratified random sample of 6,325 U.S. Army personnel measured in 1988. After adjusting for the effects of stature and weight, and then sex, rank, or race, the cranial capacity of men averaged 1,442 and women 1,332 cm³; that of officers averaged 1,393 and enlisted personnel 1,375 cm³; and that of Mongoloids averaged 1,416, Caucasoids 1,380, and Negroids 1,359 cm³.

Cluster Sampling

Estimating standard errors of accuracy assessment statistics under cluster sampling
Stephen V. Stehman, College of Environmental Science and Forestry, State University of New York, Syracuse, USA
Received 26 December 1995; revised 5 August 1996; available online 8 June 1998.

Abstract: Cluster sampling is a viable sampling design for collecting reference data for the purpose of conducting an accuracy assessment of land-cover classifications obtained from remotely sensed data. The formulas for estimating various accuracy parameters such as the overall proportion of pixels correctly classified, the kappa coefficient of agreement, and user's and producer's accuracy are the same under cluster sampling and simple random sampling, but the formulas for estimating standard errors differ between the two designs. If standard error formulas appropriate for cluster sampling are not employed in an accuracy assessment based on this design, the reported variability of map accuracy statistics is likely to be grossly misleading. The proper standard error formulas for common map accuracy statistics are derived for one-stage cluster sampling. The validity of these standard error formulas is verified by a small simulation study, and the standard errors computed according to the usual simple random sampling formulas are shown to underestimate the true cluster sampling standard errors by 20-70% if the intracluster correlation is moderate.
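Stehman's point is that the standard error formula must reflect the cluster design; a minimal sketch contrasting a cluster-based standard error with the naive simple-random-sampling formula, using invented per-cluster accuracy counts and equal cluster sizes:

```python
# SE of overall accuracy under one-stage cluster sampling vs. the
# naive SRS formula. Cluster data below are invented for illustration.
import math

# Each cluster: (pixels correctly classified, pixels in cluster)
clusters = [(18, 20), (12, 20), (19, 20), (15, 20), (20, 20), (10, 20)]

n_clusters = len(clusters)
correct = sum(c for c, m in clusters)
total = sum(m for c, m in clusters)
p_hat = correct / total  # overall proportion correct

# Cluster-sampling SE: based on variability of cluster-level proportions.
props = [c / m for c, m in clusters]
var_between = sum((pi - p_hat) ** 2 for pi in props) / (n_clusters - 1)
se_cluster = math.sqrt(var_between / n_clusters)

# Naive SRS SE, which ignores the intracluster correlation.
se_srs = math.sqrt(p_hat * (1 - p_hat) / total)

print(f"p = {p_hat:.3f}, SE(cluster) = {se_cluster:.4f}, SE(SRS) = {se_srs:.4f}")
# With correlated clusters, SE(SRS) understates the true uncertainty.
```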

2-Stage Random Sampling

An introduction to the logic, assumptions, and basic analytic procedures of two-stage least squares
James, Lawrence R.; Singh, B. Krishna. Psychological Bulletin, Vol 85(5), Sep 1978, 1104-1122. doi: 10.1037/0033-2909.85.5.1104

Abstract: Reviews a statistical procedure, 2-stage least squares (2SLS), that potentially could be employed to address a number of salient problems in psychology. The treatment is nontechnical, with emphasis placed on the logic, assumptions, and basic methodological principles of 2SLS. As is shown by both statistical and substantive examples, 2SLS may be employed to (a) examine reciprocal causation, (b) eliminate bias created by random measurement error, and (c) assess the causal effects of reciprocally related dependent variables measured at different points in time. A discussion of the role of structural equations in causal analysis is included, with a recommendation that many psychological research endeavors might benefit from thinking in terms of causal systems and nonexperimental inference.

Systematic Sampling

Systematic Sampling, Temporal Aggregation, and the Study of Political Relationships
John R. Freeman

Abstract: Systematic sampling and temporal aggregation are the practices of sampling a time series at regular intervals and of summing or averaging time series observations over a time interval, respectively. Both practices are a source of statistical error and faulty inference. The problems that systematic sampling and temporal aggregation create for the construction of strongly specified and weakly specified models are discussed. The seriousness of these problems then is illustrated with respect to the debate about superpower rivalry. The debate is shown to derive, in part, from the fact that some researchers employ highly temporally aggregated measures of U.S. and Soviet foreign policy behavior. The larger methodological lessons are that we need to devote more time to determining the natural time unit of our theories and to conducting robustness checks across levels of temporal aggregation.

Area Sampling

Revised estimates of the annual net flux of carbon to the atmosphere from changes in land use and land management 1850-2000
R. A. Houghton

Abstract: Recent analyses of land-use change in the US and China, together with the latest estimates of tropical deforestation and afforestation from the FAO, were used to calculate a portion of the annual flux of carbon between terrestrial ecosystems and the atmosphere. The calculated flux includes only that portion of the flux resulting from direct human activity. In most regions, activities included the conversion of natural ecosystems to cultivated lands and pastures, including shifting cultivation, harvest of wood (for timber and fuel) and the establishment of tree plantations. In the US, woody encroachment and woodland thickening as a result of fire suppression were also included. The calculated flux of carbon does not include increases or decreases in carbon storage as a result of environmental changes (e.g., increasing concentrations of CO2, N deposition, climatic change or pollution). Globally, the long-term (1850-2000) flux of carbon from changes in land use and management released 156 PgC to the atmosphere, about 60% of it from the tropics. Average annual fluxes during the 1980s and 1990s were 2.0 and 2.2 PgC yr⁻¹, respectively, dominated by releases of carbon from the tropics. Outside the tropics, the average net flux of carbon attributable to land-use change and management decreased from a source of 0.06 PgC yr⁻¹ during the 1980s to a sink of 0.02 PgC yr⁻¹ during the 1990s. According to the analyses summarized here, changes in land use were responsible for sinks in North America and Europe and for small sources in other non-tropical regions. The revisions were as large as 0.3 PgC yr⁻¹ in individual regions but were largely offsetting, so that the global estimate for the 1980s was changed little from an earlier estimate. Uncertainties and recent improvements in the data used to calculate the flux of carbon from land-use change are reviewed, and the results are compared to other estimates of flux to evaluate the extent to which processes other than land-use change and management are important in explaining changes in terrestrial carbon storage.

Double Sampling

DOUBLE SAMPLING TO ESTIMATE DENSITY AND POPULATION TRENDS IN BIRDS
Jonathan Bart and Susan Earnst, U.S. Geological Survey Forest and Rangeland Ecosystem Science Center, Snake River Field Station, 970 Lusk Street, Boise, Idaho 83706, USA (jbart@eagle.boisestate.edu)

Abstract: We present a method for estimating density of nesting birds based on double sampling. The approach involves surveying a large sample of plots using a rapid method such as uncorrected point counts, variable circular plot counts, or the recently suggested double-observer method. A subsample of those plots is also surveyed using intensive methods to determine actual density. The ratio of the mean count on those plots (using the rapid method) to the mean actual density (as determined by the intensive searches) is used to adjust results from the rapid method. The approach works well when results from the rapid method are highly correlated with actual density. We illustrate the method with three years of shorebird surveys from the tundra in northern Alaska. In the rapid method, surveyors covered 10 ha h⁻¹ and surveyed each plot a single time. The intensive surveys involved three thorough searches, required 3 h ha⁻¹, and took 20% of the study effort. Surveyors using the rapid method detected an average of 79% of birds present. That detection ratio was used to convert the index obtained in the rapid method into an essentially unbiased estimate of density. Trends estimated from several years of data would also be essentially unbiased. Other advantages of double sampling are that (1) the rapid method can be changed as new methods become available, (2) domains can be compared even if detection rates differ, (3) total population size can be estimated, and (4) valuable ancillary information (e.g. nest success) can be obtained on intensive plots with little additional effort. We suggest that double sampling be used to test the assumption that rapid methods, such as variable circular plot and double-observer methods, yield density estimates that are essentially unbiased. The feasibility of implementing double sampling in a range of habitats needs to be evaluated.

Multi-stage Sampling

Multi-stage stochastic linear programs for portfolio optimization
George B. Dantzig and Gerd Infanger

Abstract: The paper demonstrates how multi-period portfolio optimization problems can be efficiently solved as multi-stage stochastic linear programs. A scheme based on a blending of classical Benders decomposition techniques and a special technique, called importance sampling, is used to solve this general class of multi-stage stochastic linear programs. We discuss the case where stochastic parameters are dependent within a period as well as between periods. Initial computational results are presented.
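The double-sampling density estimator summarized above (Bart and Earnst) reduces to a simple ratio adjustment; a minimal sketch with hypothetical plot counts (the 79% detection ratio reported in the abstract is not used; every figure below is invented):

```python
# Double-sampling density estimation: adjust rapid counts by the
# detection ratio measured on an intensively searched subsample.
# All plot counts below are hypothetical.

rapid_counts = [6, 4, 7, 5, 3, 8, 5, 6]  # birds per plot, rapid method
# Subsample of the same plots also surveyed intensively:
rapid_sub = [6, 4, 7]        # rapid counts on the subsampled plots
intensive_sub = [8, 5, 9]    # actual densities found on those plots

detection_ratio = sum(rapid_sub) / sum(intensive_sub)  # ~0.77 here
mean_rapid = sum(rapid_counts) / len(rapid_counts)
density_estimate = mean_rapid / detection_ratio

print(f"detection ratio = {detection_ratio:.2f}, "
      f"estimated density = {density_estimate:.2f} birds per plot")
```

The estimator is essentially unbiased only insofar as rapid counts stay proportional to true density, which is exactly the assumption the authors propose double sampling to test.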

NON-PROBABILISTIC

Convenience Sampling

Targeted Sampling: Options for the Study of Hidden Populations
John K. Watters and Patrick Biernacki

Abstract: This paper describes some of the efforts of an interdisciplinary research team investigating the transmission of human immunodeficiency virus (HIV), the causative pathogen associated with the acquired immunodeficiency syndrome (AIDS) and related conditions. The risk groups studied were injecting drug users and their sexual partners. Due to the clandestine nature of illicit drug use, we were faced with two interrelated problems: developing a scientific method to monitor the spread of the HIV infection among these drug users and their sexual partners, groups generally thought to be especially difficult to reach; and creating a health education intervention that would help stop the epidemic from spreading among this population and through them to other members of the community. The method we developed to sample injecting drug users is called targeted sampling. Although it incorporates some aspects of other well established sampling strategies, it is sufficiently different to be treated as a separate research method. Further, targeted sampling provides a cohesive set of research methods that can help researchers study health or social problems that exist among populations that are difficult to reach because of their attributed social stigma, legal status, and consequent lack of visibility.

Purposive Sampling

Ignorable and informative designs in survey sampling inference
R. A. Sugden, Department of Mathematics, University of London, Goldsmiths' College, London, U.K., and T. M. F. Smith, Faculty of Mathematical Studies, University of Southampton, Southampton, U.K.
Received February 1, 1984; revision received May 1, 1984.

Abstract: The role of the sample selection mechanism in a model-based approach to finite population inference is examined. When the data analyst has only partial information on the sample design, then a design which is ignorable when known fully may become informative. Conditions under which partially known designs can be ignored are established and examined for some standard designs. The results are illustrated by an example used by Scott (1977).

Quality Sampling

Water quality sampling: Some statistical considerations
A. M. Liebetrau, Department of Mathematical Sciences, Johns Hopkins University, Baltimore, Maryland 21218

Abstract: Typically, a state, regional, or national water quality (WQ) monitoring system has many objectives. Examples include (1) providing a system-wide synopsis of WQ; (2) determining whether selected WQ parameters show gradual changes over time; (3a) detecting actual or potential WQ problems, (3b) determining the specific causes of actual problems, and (3c) assessing the effect of any corrective action; and (4) enforcing the law. While these purposes have different data requirements, they are interrelated, and data collected for any one can have value for others; for example, data collected for the listed reasons could be used for future long-range planning. Each objective makes certain demands of a WQ sampling network, and these in turn have implications concerning sampling design and statistical analysis of resulting data. In a sampling scheme flexible enough to encompass multiple objectives, data sets collected for answering questions arising under points 1-4 need not all have similar characteristics; consequently, a variety of statistical methods are needed. A good sampling design is important for achieving the first two objectives. A working definition of water quality is given initially, and this is followed by the definition of a sampling population. The set of all segments of the drainage network serves as the population for several sampling plans. In particular, the use of probability sampling to design a network of synoptic sampling stations is discussed in detail. Finally, sequential statistical procedures are applied to a specific problem of type 3c. In terms of the amount of data required, sequential methods are quite efficient for detecting changes of a specified magnitude in some WQ parameter.

Quota Sampling

Judgement Sampling

Estimation of Variance Using Judgment Ordered Ranked Set Samples
S. L. Stokes

Abstract: Ranked set sampling has been shown by Dell and Clutter (1972, Biometrics 28, 545-553) to be a useful technique for improving estimates of the mean when actual measurement of the observations is difficult but ranking of the elements in a sample is relatively easy. The technique is extended here to show an estimator of variance, which is asymptotically unbiased regardless of the presence of errors in ranking. Furthermore, the asymptotic efficiency of these estimators, relative to those based on the same number of quantified observations from a random sample, is greater than unity for any underlying distribution, even if ranking errors occur.

Snowball Sampling

Snowball Sampling: Problems and Techniques of Chain Referral Sampling
Patrick Biernacki, San Francisco State University, and Dan Waldorf, Pacific Institute for Research and Evaluation

Abstract: In spite of the fact that chain referral sampling has been widely used in qualitative sociological research, especially in the study of deviant behavior, the problems and techniques involved in its use have not been adequately explained. The procedures of chain referral sampling are not self-evident or obvious. This article attempts to rectify this methodological neglect. The article provides a description and analysis of some of the problems that were encountered and resolved in the course of using the method in a relatively large exploratory study of ex-opiate addicts.
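Chain-referral recruitment of the kind Biernacki and Waldorf describe can be illustrated with a small simulation; the referral network, names, and parameters below are entirely invented:

```python
# Snowball (chain-referral) sampling over a hypothetical referral network.
import random

random.seed(1)

# Who each person is willing to refer (entirely invented).
referrals = {
    "A": ["B", "C"], "B": ["D"], "C": ["D", "E"],
    "D": ["F"], "E": ["F", "G"], "F": [], "G": ["H"], "H": [],
}

def snowball(seeds, waves, k=2):
    """Recruit up to k new referrals per participant for a number of waves."""
    sample, frontier = set(seeds), list(seeds)
    for _ in range(waves):
        next_frontier = []
        for person in frontier:
            candidates = [r for r in referrals[person] if r not in sample]
            for recruit in random.sample(candidates, min(k, len(candidates))):
                sample.add(recruit)
                next_frontier.append(recruit)
        frontier = next_frontier
    return sample

print(sorted(snowball(seeds=["A"], waves=3)))
# Coverage depends heavily on the initial seeds, which is the selection
# bias that chain-referral studies must confront.
```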
