SW 522: Basic Social Work Research Terms

admin / February 14, 2018
Question Answer
Evidence Based Practice Model (EBP Model) 3 Es: Evidence, Expertise & Expectations.
An adaptable combination of clinical expertise, research evidence and patient values and beliefs.
Sources of Evidence Journals, Specialized & Commercial organizations, Government publications, Bibliographic databases
Steps of EBP 1) Identify the information you would like to find and implement into practice (assessment, intervention, prevention, treatment…)
2) Locate current best research evidence
3) Critically appraise information
4) Integrate the expertise and client preferences
Critical Appraisal Appraise the evidence for its
validity and applicability and identify instances of bias (if any).
Sources of evidence Case studies, anecdotes, personal experience, expert opinion, standards of practice, community standards, patient preference, clinical reasoning, judgment and decision making.
Research An activity designed to answer a question, test a hypothesis, and permit conclusions to be drawn. Contributes to general body of knowledge
Practice Interventions designed to enhance individual well being of client or client system which have a reasonable expectation of success.
Hypothesis Statement of a causal explanation or proposition that has yet to be empirically tested. Must have at least 1 independent and 1 dependent variable
Independent Variable Variable that you manipulate or change over the course of the experiment
Dependent Variable Variable that you measure
Descriptive: When a study is designed primarily to describe what is going on or what exists. Public opinion polls that seek only to
describe the proportion of people who hold various opinions are primarily descriptive in nature.
Correlational Designed to look at the relationships between two or more variables. A public opinion poll that compares what proportion of males and females would vote for a Dem. or a Rep. studies the relationship between gender and voting preference. Can be +/-
Causal When a study is designed to determine whether one or more variables (e.g., a program or treatment variable) causes or affects one or more outcome variables.
Steps in Research Process • Choose a problem
• Focus the question/hypothesis
• Design the study
• Collect Data
• Analyze Data
• Interpret Data
• Inform others
Theory A set of inter-related statements
with some lawlike generalizations, which
are empirically testable
Scientific Norms • Universalism
• Organized skepticism
• Disinterestedness
• Communalism
• Honesty
Universalism Research is to be judged only on the basis of its scientific merit
Organized skepticism Scientists should question all evidence and subject each study to intense scrutiny to ensure the methods used can stand up to careful examination
Disinterestedness Scientists should accept, and even look for, high-quality evidence that runs against their own position
Communalism The knowledge must be shared, because it belongs to us all.
Honesty Cultural norm among researchers, must engage in honest research, cheating is taboo!
Quantitative Research Style • Measure objective facts
• Focus on variables
• Reliability is key
• Value free
• Independent of context
• Many cases/subjects
• Involves statistical analysis
• Detached researcher
Qualitative Research Style •Construct social reality, cultural meaning
•Focus on interactive processes, events
•Authenticity is key
•Values are present and explicit
•Situationally constrained
•Few cases, subjects
•Thematic analysis
•Researcher is involved
Exploratory Research Explores new topic. Generate new ideas,
conjectures, or hypotheses; Develop techniques for measuring and locating future data; Eg: Exploring Mental Health Service Use Among Black Adolescents at Risk for Suicidal Behavior
Descriptive Research Describes social phenomena. Presents picture of specific details of social setting; subject more well-defined than in exploratory research; locates new data that contradicts past research; documents causal relationships; reports on situation background/context
Explanatory Research Explains why something occurs. Builds on exploratory and descriptive research; Tests theory's principles & elaborates on its explanation; Determines which of several explanations is best
Basic Research (Pros) EXPLANATORY research most common; advances knowledge of social world; focuses on refuting/supporting social theories; source of new scientific ideas for applied researchers; seeks answers that affect long term goals
Basic Research (Cons) Considered to be waste of time & $$$ b/c it lacks direct & practical short term application; doesn't help resolve immediate problems
Applied Research (Pros) Try to solve specific policy problems or help practitioners accomplish a task; Rigor and standard vary by use of results; Practical implications are quicker and clearer; mostly DESCRIPTIVE research
Applied Research (Cons) Applied researchers may trade strict scientific standards for quick usable results; theory is less central to the research purpose; Use of the results beyond researchers control ie results may be used unwisely by decision makers and practitioners
Action Research Applied research that abolishes the line btw research and social action.
-Subjects participate in the research (community-based participatory research); Equalizes power btw the researcher and subjects; Opposes researchers having more control than those they study
Social Impact Assessment Estimates the likely impact of a planned
change (used for planning, making
choices among alternative policies)
Eg: determine impact of new highway on housing within the area
Evaluation Research "Did it work?"; Measures program efficacy; Frequently DESCRIPTIVE but can be explanatory and exploratory. Greatly expanded in 1960s with many new federal programs.
Eg: Does a law enforcement program of mandatory arrest reduce spouse abuse?
Tools for Applied Research Needs assessment & Cost-benefit analysis
Needs assessment R. collects data to determine major needs and their severity; precursor to the implementation of a social intervention.
Issues: R. has to decide who to observe; ppl have multiple needs; ppl's needs are hard to translate into policy and interventions
Types of Evaluation Research Summative Evaluation: looks at final program outcome
Formative Evaluation: built-in monitoring/continuous feedback on program and its management
Cost benefit Analysis Benefit of program should outweigh or balance the cost of implementation.
Contingency Evaluation: asks public to assign value to the consequences (+/-) of several proposed actions
Actual Cost evaluation: monetary estimation of consequences
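The actual-cost style of evaluation above boils down to comparing monetized benefits against implementation costs. A minimal Python sketch, using entirely hypothetical dollar figures:

```python
def net_benefit(benefits, costs):
    """Actual-cost style evaluation: sum of monetized benefits minus sum of costs."""
    return sum(benefits) - sum(costs)

# Hypothetical program figures (made up for illustration)
benefits = [120_000, 30_000]   # e.g., reduced service use, productivity gains
costs = [90_000, 25_000]       # e.g., staffing, materials

print(net_benefit(benefits, costs))  # 35000: benefits outweigh costs
```

A positive result means the program's estimated benefits outweigh or balance its implementation cost, per the definition above.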
Empowerment Research Constructivist approach where the goal is to produce research that is multidimensional and addresses the needs of marginalized populations. Believes there can be no completely objective, disinterested assessment, ie it focuses on nonmainstream pops
Time Dimension in Research Cross Sectional, Longitudinal, & Case Studies
Types of Longitudinal Studies Time series research; cohort analysis; panel study. More $$$$ than cross-sectional but more powerful when R. answers questions on social change. Descriptive and Explanatory R. use this method.
Time Series Research The same information is collected on a group of people or other unit across multiple periods of time. Eg: researching the effects of strict legislation on gun usage and crime rates.
Panel Study a powerful type of longitudinal study that is often more difficult to conduct—R. observes EXACTLY THE SAME ppl, group, or organization across time periods. Very $$$, but info very valuable; hard to keep track of subjects (death, relocation etc)
Cohort Analysis Similar to a panel study, but a CATEGORY OF PPL who share a similar life experience in a specified time period is observed. Not specific individuals, eg ppl born in the same yr (birth cohort), grad in same yr, get job same yr…
Case Study • In depth analysis of few cases over a duration of time.
• Indv, groups, orgs, event, or geographic units can be used
• Detailed & extensive data
• Similar to qual R.; focuses on key issues and case context & examines how its parts are configured.
Data Collection Techniques • Experimental research (includes quasiexperimental)–> Explanatory R.
• Surveys–> Descrip & Explan R.
• Content Analysis–> Descrip & Explan R.
• Secondary Data Analysis–> Descrip & Explan R.
• Field Research
• Historical-Comparative research
Quantitative Data Collection Experimental Design, Surveys, Content Analysis, & Secondary Data Analysis
Qualitative Data Collection Field Research: based on observations, use obs to refocus general topic of interest; mainly used for descriptive & exploratory R.
Historical Comparative Research: examines aspects of social life in a past historical era or across diff cultures; Desc R.
Theory – A logical system of general concepts that underlies the organization and understanding of human behavior
– Welcomes empirical tests to see if true or false
– Guides further research; logically consistent; incomplete, recognizes uncertainty
Ideology Explanation of the world that lacks critical features of scientific theory
• Offers absolutes—all the answers
• Fixed closed belief and value systems that change very little
• Avoids tests, discrepant findings
• Has contradictions, inconsistencies
Characteristics of Good Theory Parsimonious, Falsifiable, Heuristic
Parsimonious Simple is better; succinct/concise explanation of broader ideas
Falsifiable Several parts of the theory can be empirically tested to foster the evolution of the concepts
Heuristic Building on existing knowledge by continuing to generate testable hypotheses that if confirmed by future research will lead to a much richer understanding of the phenomena under investigation
Empiricist position on Theory The world is made up of hard, settled, observable things that are uncontaminated by theories. What we see is what there is; theory must be tested against hard, real facts to solidify the ideas put forth by the theory
Relativist position on Theory Reality is what we think it is. Believe theory can't be empirically tested b/c facts are themselves influenced by ideas & theories.
Deductive Research Approach Uses abstract, logical relationships between concepts and moves towards concrete empirical data and evidence. Reinforces theory using data and observations via the creation of testable hypotheses; more quantitative
Inductive Research Approach Begins with detailed observations of the world and moves towards the construction of abstract generalizations and ideas. Builds theory from the ground up. More qualitative
Gilligan’s Theory Of Moral
Development (ETHICS OF CARE)
1) Orientation to Personal
Survival – self-centered approach to life
2) Goodness as Self-Sacrifice –
decision that self-sacrifice is the "good thing to do"
3)Non-Violent Responsibility – ability to reason out the repercussions of decisions.
Causality – Temporal order: cause comes before effect (the "if" precedes the "then")
– Association: Two or more variables occur together in a patterned way
– Spuriousness: Elimination of plausible alternative explanations
Attributes Values or the categories of a variable. For instance, the variable sex or gender has two attributes: male and female.
Research feasibility – How long will the research take to accomplish?
– Are there important ethical constraints that need consideration?
– Needs assessment: is it favorable?
– Cost-benefit analysis
Conceptualization Process of taking a construct and refining it using a conceptual definition (abstract or theoretical terms)
• Has clear, explicit meaning
• Creating such a definition requires thinking carefully, observing directly, reading literature…
Operationalization Links a conceptual definition to a specific set of measurement techniques or procedures
• Operational definition refers to the specific procedures of the researcher.
Can attach quantitative and qualitative meanings to the different values of the variable
Levels of Measurement Nominal, ordinal, interval, ratio
Nominal Variables Classify observations into mutually exclusive categories; membership in one precludes membership in another.
• Examples: Sex, ethnicity, religion, political parties.
Ordinal Variables Variable has an inherent order.
• Ordinal variables not only have mutually
exclusive categories, but the categories have a fixed order.
• Examples: socioeconomic status (low, middle, upper)
• The distances between the categories are not necessarily the same
Interval Variables Shares the characteristics of ordinal
– Mutually exclusive categories
– Inherent order
• Equal spacing between categories
• Allows each unit to have certain
unique value; no true zero
Ratio Variables Highest, precise measurement; variable attributes can be rank ordered; distance between attributes can be precisely measured; absolute zero exists
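As a rough illustration of the four levels, here is a hypothetical Python lookup of which summary statistics are conventionally meaningful at each level (the mapping follows the usual textbook treatment; the function name is made up for this sketch):

```python
# Conventional mapping of measurement level -> meaningful summary statistics
LEVELS = {
    "nominal": {"mode"},
    "ordinal": {"mode", "median"},
    "interval": {"mode", "median", "mean"},
    "ratio": {"mode", "median", "mean", "ratio comparison"},
}

def meaningful(level, statistic):
    """Return True if the statistic is conventionally valid at that level."""
    return statistic in LEVELS[level]

print(meaningful("nominal", "mean"))  # False: averaging religion codes is meaningless
print(meaningful("ratio", "mean"))    # True: e.g., average income
```

Each level inherits the valid operations of the levels below it, which is why the sets grow from nominal to ratio.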
Discrete variables Attributes can be measured only with a limited # of distinct, separate categories.
Eg: Sex, race, number of arrests, days present, household size.
Continuous variables Variables measured on a continuum in which an infinite # of finer gradations between variable attributes are possible
Eg: Age is continuous. It can be measured in weeks, years, days.
Validity How well an idea about reality fits in with actual reality; the ability to generalize findings outside a study; quality of measurement and the proper use of procedures
Reliability Dependability or consistency of the measure of a variable. Test-retest reliability suggests that repeats of the study under similar circumstances will produce the same results each and every time.
Eg: stepping on a scale and getting the same weight every time
Representative reliability refers to reliability across subpopulations or groups of people. Does the indicator give the same answer when applied to different groups?
Intercoder/rater reliability A measure is reliable if multiple coders/raters/observers agree with each other; it is tested by having several people code/measure the same thing.
Content Validity/Face Validity Assessing whether there is a logical
relationship between variables and the
proposed measure
Criterion Validity Uses a criterion to verify a construct's accuracy by comparison.
• Concurrent validity: indicators must be associated with a preexisting instrument
• Predictive validity: an instrument is used
to predict some future event
logically related to a construct
Construct validity Addresses the question: if the measure is valid, do the various indicators operate in a consistent manner? Measures need clearly specified conceptual boundaries
Convergent Validity Applies when multiple indicators of the same construct hang together or operate in similar ways
Divergent validity If two constructs A and B are different, then measures of A & B should not be associated.
Random Errors Errors that are neither consistent nor patterned. In the long term these errors tend to cancel themselves out.
Systematic errors Consistent over- or understatement of the value of a given variable. These errors tend to accumulate over time.
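The contrast between the two error types can be simulated. A minimal sketch with a hypothetical true value of 50 and an invented +1.5 systematic bias: random noise averages away, while the bias survives any number of measurements.

```python
import random

random.seed(42)  # fixed seed so the sketch is reproducible
true_value = 50.0
n = 10_000

# Random error: zero-mean noise; over many measurements it tends to cancel out
random_measures = [true_value + random.uniform(-2, 2) for _ in range(n)]
mean_random = sum(random_measures) / n

# Systematic error: a constant bias (hypothetical +1.5) never cancels out
biased_measures = [true_value + 1.5 + random.uniform(-2, 2) for _ in range(n)]
mean_biased = sum(biased_measures) / n

# The first mean lands near 50.0; the second accumulates the bias, near 51.5
print(round(mean_random, 1), round(mean_biased, 1))
```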
Survey Research Research that is statistical in nature.
• Appropriate for R. on self-reported beliefs or behaviors.
R. asks multiple questions in surveys, measures
many variables (often with multiple
indicators), and tests several hypotheses
Types of Survey Research Questionnaires and Interviews; Self reports, pen or pencil measures; Computer assisted Self-Administered Interview
Likert Scale Respondents express attitudes or other responses in terms of several ORDINAL-LEVEL categories (agree/disagree) ranked in a continuum
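Because Likert responses are ordinal-level categories on a continuum, they are typically coded to integers before analysis. A minimal sketch with a hypothetical 5-point scale and invented responses:

```python
# Hypothetical 5-point Likert coding (labels and responses invented for illustration)
SCALE = {
    "Strongly disagree": 1,
    "Disagree": 2,
    "Neutral": 3,
    "Agree": 4,
    "Strongly agree": 5,
}

responses = ["Agree", "Neutral", "Strongly agree", "Agree", "Disagree"]
codes = [SCALE[r] for r in responses]  # ordinal codes preserving the ranking

print(codes)                     # [4, 3, 5, 4, 2]
print(sum(codes) / len(codes))   # 3.6 (averaging ordinal codes is common but debated)
```

Averaging assumes equal spacing between categories, which strictly speaking is an interval-level property, hence the debate noted in the comment.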
Contingency Questions Respondent is guided to additional questions based on his/her previous response
Sleeper Questions Questions about nonexistent ppl or events to check if respondents are being truthful. Repeating the same question tests respondents' consistency
Open vs. Closed Ended Questions Open: respondent free to answer questions any way they wish
Closed: Respondent chooses from a fixed set of answers
Knowledge Questions Find out whether respondents know about certain topics and issues; can run the risk of making respondent feel/appear ignorant
Recency Effects Respondents are more likely to choose the last alternative offered.
This suggests it is best to present responses on a continuum, with neutral positions stated in the middle.
Total Response rate Total # of ppl who completed the survey out of the # of possible respondents
Active Response Rate Total # of ppl who completed the survey out of the # of located respondents
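The two rates differ only in the denominator. A minimal sketch with hypothetical counts (1,000 sampled, 800 located, 400 completed):

```python
def total_response_rate(completed, possible):
    """Completed surveys out of ALL possible respondents sampled."""
    return completed / possible

def active_response_rate(completed, located):
    """Completed surveys out of only the respondents actually located."""
    return completed / located

# Hypothetical survey counts (invented for illustration)
print(total_response_rate(400, 1000))   # 0.4
print(active_response_rate(400, 800))   # 0.5
```

The active rate is always at least as high as the total rate, since the located group can never be larger than the possible group.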
Population Specific pools of cases and groups of ppl you want to study
Sampling frame A specific list that closely approximates the elements in the population; eg: telephone, tax records
Population parameter Characteristics of the population estimated from the sample studied (e.g., avg height, % smokers)
Sampling Error How much the sample deviates from being representative of the population
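Sampling error can be seen directly by comparing a sample statistic to the population parameter it estimates. A minimal sketch with a hypothetical population of 10,000 ages:

```python
import random

random.seed(1)  # fixed seed so the sketch is reproducible

# Hypothetical population: 10,000 ages centered near 35 (invented for illustration)
population = [random.gauss(35, 10) for _ in range(10_000)]
pop_mean = sum(population) / len(population)  # the population parameter

sample = random.sample(population, 100)
sample_mean = sum(sample) / len(sample)       # the sample estimate

# Sampling error: the gap between the sample statistic and the parameter
print(round(abs(sample_mean - pop_mean), 2))
```

Larger samples shrink this gap on average, which is the statistical rationale for investing in bigger probability samples.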
Non Probability Sampling Qualitative R. using nonmathematical criteria to select cases to be part of the sampling population. Has no predetermined sample size and limited knowledge of the population being studied
Haphazard Getting cases based on convenience
Quota Get a preset # of cases in each of several predetermined categories that are representative of the population using haphazard methods
*Proportional quota sampling: cases need to have representative proportions to population
*Nonprop: less restrictive
Purposive Get all possible cases that fit particular criteria, using various methods
Snowball Get additional cases using referral from one or few initial cases and so on…
Extreme case sampling Cases that are substantially different from the dominant pattern (special type of purposive sampling)
Sequential Get cases until no additional information or characteristics can be found
Theoretical Get cases that help reveal features theoretically related to the particular topic of study
Simple random sampling Researcher develops a sampling frame and uses a pure random process to select cases, with each sampling element having an equal probability of being selected
Stratified Sampling population divided into subpopulations (strata), then the researcher draws samples from the strata.
Then R. controls the relative size of each mutually exclusive group, instead of letting the random process control it, to guarantee representativeness.
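The two probability designs can be sketched side by side. This is a minimal illustration with an invented population of 300 cases split into hypothetical "urban" and "rural" strata:

```python
import random

random.seed(0)  # fixed seed so the sketch is reproducible

# Hypothetical sampling frame: 300 cases tagged with an invented stratum
population = [{"id": i, "stratum": "urban" if i % 3 else "rural"}
              for i in range(300)]

# Simple random sampling: every element has an equal selection probability
srs = random.sample(population, 30)

def stratified_sample(pop, key, n_per_stratum):
    """Draw separately within each stratum so every group is represented."""
    strata = {}
    for case in pop:
        strata.setdefault(case[key], []).append(case)
    sample = []
    for group in strata.values():
        sample.extend(random.sample(group, n_per_stratum))
    return sample

strat = stratified_sample(population, "stratum", 15)
print(len(srs), len(strat))  # 30 30
```

Simple random sampling may by chance under-represent a small stratum; the stratified draw guarantees exactly 15 cases from each group.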
