
Use & Evaluation of Behavioural Science Methods: a UK Perspective

George Brander
UNCLASSIFIED British Crown Copyright 2009 / MOD.
Published with the permission of the Controller of Her Britannic Majesty's Stationery Office.

Theme of Presentation
A full spectrum of human and behavioural sciences is necessary (within Defence and specifically in support of Information Operations), in order to better understand those individuals and social groups whose attitudes and behaviours form part of the new battle-space. Problems with data availability and attribution of cause and effect mean that, while the methods used tend to resist traditional validation and evaluation approaches, best practice can still be progressed.

Outline
- Context of Use (frame)
- Using human & behavioural science methods in Defence
- The pyramid of analysis: key themes
- Problems with: Data, Theories, Methods & Tools, Processes, Outputs
- Summary

The Human Factor

"Fighting battles is not about territory, it is about people, attitudes and perceptions. The battleground is there."
General Sir Michael Jackson (Feb 2000)

Context of Use

Analysis & assessment considers the individual within successive frames: Individual → Team / Group → Information Environment → Culture → Wider Context.

Human Factors disciplines map onto these levels of interest (adapted from Sherman Kent's pyramid):

- Key Individuals: Psychology (Occupational, Social, Clinical)
- Teams / Groups, Social Groupings, Cultural contexts: Anthropology
- Information Environment, Attitudes & Opinions: Media & Marketing, Journalism, Market Research

Kent's pyramid rises from data, through Collation & Evaluation and Analysis, towards "Truth (with error bars), confidence levels & probabilities". Problems beset each layer: incomplete, missing & deceptive data at the base; assessment bias and flawed analysis & analytic processes (the "Handicapped Mind") above.

Our pyramid of analysis builds on the same idea:

- Assessments & Assessment Processes (apex)
- Methods & Tools
- Analysis
- Useable Theories
- Collation & Evaluation
- Data available: little work exists on the validity of data (some in the SNA area)
- Potentially biased Collection (base)

Identifying Bias and Gaps in data

Plotting each entity's importance (as assessed by a social network analysis metric such as relative authority or relative influence) against its volume of reporting reveals both problems:

- Potential bias: reporting (in year) appears biased towards three entities.
- Potential gaps: four new entities to bring to the attention of analysts, collators and collectors, and to prioritise.
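The bias/gap check above can be sketched with a toy network. The slide does not name the exact SNA metric (it cites relative authority and relative influence), so plain degree centrality stands in for it here; the entities, links, report counts and thresholds are all invented for illustration.

```python
from collections import defaultdict

def degree_centrality(edges):
    """Normalised degree centrality over an undirected edge list."""
    degree = defaultdict(int)
    for u, v in edges:
        degree[u] += 1
        degree[v] += 1
    n = len(degree)
    return {node: d / (n - 1) for node, d in degree.items()}

def flag_bias_and_gaps(centrality, reports, threshold=5):
    """Flag entities whose reporting volume is out of step with their
    network importance: heavy reporting suggests possible collection
    bias; thin reporting on a connected entity suggests a gap."""
    bias = {e for e, r in reports.items() if r >= threshold * 4}
    gaps = {e for e in centrality if reports.get(e, 0) < threshold}
    return bias, gaps

# Invented entity links and in-year report counts.
edges = [("A", "B"), ("A", "C"), ("A", "D"),
         ("B", "C"), ("C", "E"), ("D", "F"), ("E", "G")]
reports = {"A": 40, "B": 35, "C": 30, "D": 2, "E": 1, "F": 1, "G": 1}

bias, gaps = flag_bias_and_gaps(degree_centrality(edges), reports)
print("reporting concentrated on:", sorted(bias))    # three entities
print("bring to analysts' attention:", sorted(gaps)) # four new entities
```

In this toy data the reporting concentrates on three entities (A, B, C) while four connected entities (D, E, F, G) go almost unreported, mirroring the pattern the slide describes.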

Problems with data

- Data available is highly variable: incomplete, erroneous, deceived, biased, etc.
- Variability also exists within subjects; individuals vary according to:
  - Age
  - Access to resources
  - Educational background
  - Cultural norms
  - Life experiences
  - Health

How to consider the individual?

The aspects assessed (Motivation, Strengths / Weaknesses, Self Image, Decision Making, Confidence, Traits & Preferences, Stress & Coping) draw on three sources of evidence:

- BACKGROUND
- BEHAVIOURAL EVIDENCE (DIRECT): work to ensure enhanced direct evidence is available
- THIRD PARTY OBSERVATIONS: provide training & tools for observers

Alternative methods are explored and evaluated alongside these sources.

Theories (an eclectic mix)

In order to operate in the context of:

- Highly variable data
- Individual variability
- Differing cultural norms

and because self-report methods are inappropriate, we have borrowed from relevant theories and adapted methods & approaches to our domain.

Theories & approaches

Evolving methods and tools (questionnaires, frameworks, content analysis) map established theories onto the aspects assessed (Motivation, Self Image, Strengths / Weaknesses, Decision Making, Confidence, Traits & Preferences, Stress & Coping, and behavioural consistency over time):

- Motivational Style (McClelland & Winter)
- Personality Traits: 15FQ, NEO
- Leadership Style (Hermann)
- Personal Preferences: MBTI
- Content Analysis: Integrative Complexity (Suedfeld)
- Other theories: Life Stages (Erikson)

The pyramid of analysis again, now annotated: most validation questions tend to be here, at the Methods & Tools and Analysis layers.

Direct Assessment

- Assessed validity and improved reliability: validity based upon the success of the instrument in categorising individuals.
- Designed to screen or select individuals from a population, and to tune the performance of selected individuals.
- Controlled conditions; timed self-report test; established instrument.

Remote Assessment

- Assessment of the individual in context; designed to aid in engagement or negotiation.
- Inter-rater reliability & utility of assessment; use of triangulation & peer review processes.
- Employ facets of valid established instruments that use observables.
- How to evaluate assessment against real-world truth? Observed behaviour, third party assessment & case history analysis.
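Inter-rater reliability, cited above for remote assessment, is commonly quantified with Cohen's kappa, which corrects raw agreement for agreement expected by chance. The slides do not say which statistic is used in practice; this is a generic sketch with hypothetical ratings.

```python
def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: agreement between two raters, corrected for the
    agreement expected by chance alone."""
    n = len(rater_a)
    labels = set(rater_a) | set(rater_b)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    expected = sum((rater_a.count(l) / n) * (rater_b.count(l) / n)
                   for l in labels)
    return (observed - expected) / (1 - expected)

# Two hypothetical analysts' independent ratings of six subjects.
analyst_1 = ["high", "high", "low", "low", "high", "low"]
analyst_2 = ["high", "low",  "low", "low", "high", "low"]
kappa = cohens_kappa(analyst_1, analyst_2)
print(f"kappa = {kappa:.2f}")  # values above roughly 0.6 are usually read as substantial agreement
```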

The pyramid of analysis again: process considerations are seen as the key to effective analysis at the Assessments & Assessment Processes apex.

Process Issues

- Focused on the analysis & assessment process:
  - common & continuing training, challenge & discussion, peer review, formal review, logbooks & tables of use
  - share & compare approaches across the UK Government Behavioural Science Community
  - share & compare approaches with Allies: USA, 5 Eyes, some NATO partners
- Linked research studies:
  - Dstl study (MoD Research): alternative approaches to validation
  - exploring novel approaches (triangulation issues), e.g. body movement, personal space, behaviour in networks
  - engagement with academia
  - collaborative research with other countries, e.g. Australian (DSTO) & Canadian (DRDC) research

The pyramid of analysis again: the utility of the assessment lies in supporting decisions, and process considerations are seen as the key to valid analysis.

Validity, Reliability and Utility: what do Customers need?

- Awareness: Did I know I could ask?
- Plausibility: Do I agree?
- Credibility: Do others agree?
- Trustworthiness: Reputation of the analyst(s)?
- Insight: Implications for action? (passing the "So what?" test)

Dealing with Customers

- Avoid psychological jargon
- Manage expectations: aim to advise & forecast rather than predict
- Offer recommendations where possible
- Seek feedback (on accuracy and utility)
- Seek to assess the accuracy of outcomes; but, given the lack of Measures of Effect, attributing causality remains very difficult

In summary

Human Factors analysis & assessment spans the pyramid from qualitative to quantitative methods:

- Key Individuals (QUALITATIVE): beliefs, motivations, leadership styles, decision-making styles
- Teams / Groups: Social Network Analysis
- Social Groupings, Cultural contexts: cultural practices, social structures
- Information Environment, Attitudes & Opinions (QUANTITATIVE): information flow, information preference, gatekeepers / advisors & opinion formers, focus groups / polling
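On the quantitative end, focus-group and polling outputs carry the "error bars" and confidence levels mentioned at the apex of Kent's pyramid. A minimal sketch of a poll's margin of error (figures invented; the normal approximation assumes a simple random sample):

```python
import math

def poll_interval(in_favour, sample_size, z=1.96):
    """Normal-approximation 95% confidence interval for a poll proportion."""
    p = in_favour / sample_size
    margin = z * math.sqrt(p * (1 - p) / sample_size)
    return p - margin, p + margin

# Hypothetical poll: 540 of 1,000 respondents agree with a statement.
low, high = poll_interval(540, 1000)
print(f"support: 54.0% (95% CI {low:.1%} to {high:.1%})")
```

The roughly three-point margin on a 1,000-person sample is the kind of "truth with error bars" an assessment should report rather than a bare point estimate.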

HF analysis is largely QUALITATIVE

- Ethnography
- Phenomenology
- Field research
- Observation (direct & indirect)
- Case study approaches

Traditional (quantitative) approaches to validation may not be sufficient. HF employs multi-disciplinary, multi-methodologies that seek to provide insight, with a suitable degree of rigour, whilst evolving best practice.

"Half the money I spend on advertising is wasted, and the trouble is I don't know which half."
William Hesketh Lever, 1st Viscount Leverhulme, an English industrialist, philanthropist and colonialist

Questions?
