Academic Documents
Professional Documents
Culture Documents
Digital Village
Throughout eternity, all that is of like form will come around again; everything that is the same must always return in its own everlasting cycle.
[Diagram: Human Nature (good and evil) and Human Actions - Passion: reason, logic, love, fixation; Desire: need, want, altruism, heroism, curiosity, inquiry, ignorance, malice; Primal Instinct: anxiety, fear, delusion, habit, ritual, ceremony, repetition, anger, hate; spanning the Emotional to the Deterministic]
The Digital Enterprise
The Digital Enterprise is all about doing things better today in order to design and build a better tomorrow for all of our stakeholders. It is driven by the need for rapid response to changing conditions, so that we can create and maintain a brighter future for all of our stakeholders to enjoy. The Digital Enterprise evolves from analysis, research and development into long-term Forecasting, Strategy and Planning, ranging in scale from the formulation and shaping of Public-sector Political, Economic and Social Policies to Private-sector Business Programmes, Work-streams and Digital Projects for organisational change and business transformation, enabling us to envision and achieve our desired future outcomes, goals and objectives.
[Diagram: Digital Foresight Platform Lifecycle - PLAN (Review, Foresight, Strategy); PREPARE (Launch Foresight Platform, Early Adopters, Migrate Data over to new Platform); EXECUTE (Enhance Digital Foresight Platform, Foresight Consumers, Digital Platform Performance); REVIEW (Evaluate Foresight Platform Lifecycle); Benefits Realisation from Cash Cow Platform (Maturity) to Rising Star Platform (Growth)]
The Digital Enterprise Methodology
Foresight Planning Methodology:
Understand the business and technology environment -> Business Outcomes, Goals, Objectives and Needs
Understand business and technology challenges / opportunities -> Business Drivers and Requirements
Gather the evidence to quantify the impact of those opportunities -> Business Case
Quantify the business benefits of resolving the opportunities -> Benefits Realisation
Quantify the changes needed to resolve the opportunities -> Business Transformation
Understand Stakeholder Management issues -> Communication Strategy
Understand organisational constraints -> Organisational Impact Analysis
Understand technology constraints -> Technology Strategy and Architecture
1. Provide and Train the client Strategy and Planning Team with a comprehensive, consistent
and complete Strategic Foresight Framework which focuses on the capability to create and
maintain a useful and detailed Future Perspective and Forward View. This is supported by a
Digital Enterprise Architecture Method in order to design, deliver and support a Digital
Strategic Foresight Platform - which is illustrated and described by Architecture Models, and
documented and defined by a Reference Architecture (both Business and Technology),
Business Process Catalogue, Business Services Library and Technology Services Inventory.
3. Mentor, advise and support the Strategy and Planning Team to finalise and agree the
Business Transformation Programme and Project Plans and Digital Platform Solution
Architecture, in order to ensure that the future Strategic Foresight development tools and
Digital Platform software architecture framework delivers industry-leading business agility /
competitiveness and technology flexibility / effectiveness.
Advisory and Training Objectives - Prepare
6. Act as the Digital Architecture Design Authority in order to guide, influence and mentor the Digital Product Portfolio Team as they deliver the strategic architecture through agile development, improving maintenance capability and efficiency, and taking responsibility for the Digital Platform's cooperative resource information collection, analysis and transformation.
4. Train, advise and support the Strategy and Planning Team to design the Digital Architecture and Technology R&D Pilot Project / Proof-of-Concept (PoC) through all of the stages of prototype design, development, testing, verification and validation, and to plan the phases of implementation for the dominant architecture prototype, with the delivery of Golden Standard artefacts into the Digital Product Portfolio, ensuring that future Digital Development Tools / Digital Framework and Strategic Foresight Architectures deliver industry-leading business agility / competitiveness and technology flexibility / impact.
5. Mentor, advise and support the Strategy and Planning Team to build and test the Digital
Architecture and Technology R&D Pilot Project / Proof-of-Concept (PoC). Establish a
Lean and Agile Strategic Foresight Epics and Stories Catalogue - that is both flexible and
adaptive to radical technology change and platform replacement across all of the
Technology Domains along with a detailed and complete Technology Mapping to the
client evaluation stack / strategic Digital Technology Platform Components (Social Media
/User Content Analysis, Big Data Analytics, Mobile Platforms, Geospatial Data Science)
Advisory and Training Objectives - Prepare
7. Responsible for all Strategy and Planning Team group activities: team building, training, development, mentoring, cooperative resource information collection, analysis and transformation, through to planning and organising Executive Briefings, Technology Forums, Special Interest Groups, Workshops, Seminars and Conferences, including selecting the speakers / representatives / delegates to attend regional, national and international Strategic Foresight and Lean / Agile Digital Technology conferences.
8. Train the delivery team in Digital Technology Platform Architecture Model envisioning,
design, development and maintenance - from architecture vision to agile implementation
including CASE Tool architecture design and the Standard Digital Retail Reference Model.
9. Train and develop the Strategy and Planning Team in Digital Technology Platform Architecture and Components, so that they are able to design, develop and maintain them, from lean architecture vision to agile implementation, within a collaborative communication and benefits management strategy, in order to drive out / resolve Strategic Foresight, Digital Strategy, Architecture and Design problems, issues or threats, while leading team education and training, coaching, mentoring and development.
Advisory and Training Objectives - Execute
Many of the challenges encountered in managing Digital Enterprise Programmes result from attempts to integrate the multiple, divergent Future Narratives gathered from many different stakeholders in the Enterprise, all with different viewpoints, drivers, concerns, interests and needs. This may be overcome by developing a shared, collaborative, common Business and Architecture Vision describing the future state of the Digital Retail Enterprise, along with a Business and Architecture Roadmap to help plan and realise the achievement of that Vision.
11. Establish a Lean Retail 2.0 / Perfect Store Digital Business Architecture (BA) to achieve
Digital Transformation via end-to-end Retail 2.0 / Perfect Store Business Processes.
12. Drive out a Digital Strategic Foresight Business Operating Model (BOM) through the investigation, discovery, analysis and design of a Digital Retail Process and Business Services Portfolio, comprising an Architecture Model and Description made up of Strategic Foresight documents, data stores, scenarios and use cases.
13. Guide the Strategy and Planning Team to create the Digital Strategic Foresight
Solution Architecture (SA) Model designing a Lean / Agile Strategic Foresight
Software Architecture using Digital Strategic Foresight Epics and Stories from the
strategic architecture prototype - which adapts to radical technology change / platform
replacement across all the Digital Technology Domains - through all of the stages of
design, development, testing, verification and validation and iterative phases of
implementation and the delivery of artefacts into the Digital Portfolio.
Advisory and Training Objectives - Review
15. Review Digital Solution Model business performance: Functional Requirements met?
16. Review Digital Platform technical performance: Non-functional Requirements met by the Digital Technology Platform Components (e.g. Internet Social Media and User Content Analysis, Big Data Analytics, Mobile Platforms, 4D Geospatial Data Science)?
17. Review Digital Strategy outcomes, goals and objectives: Strategic Requirements met?
18. Plan / Scope the next iteration of the Digital Strategy / Architecture / Technology Platform.
[Diagram: Strategic Foresight as Knowledge Management - Big Data Consumers: Understand, Evaluate, Investigate and Research]
Horizon Scanning
Horizon Scanning is an important technique for establishing a sound knowledge
base for planning and decision-making. Anticipating and preparing for the future
uncertainty, threats, challenges, opportunities, patterns, trends and extrapolations is
an essential core component of any organisation's long-term sustainability strategy.
Horizon Scanning may explore novel and unexpected issues as well as persistent
problems or trends. The government's Chief Scientific Adviser is encouraging
Departments to undertake horizon scanning in a structured and auditable manner.
Horizon Scanning enables organisations to anticipate and prepare for new risks and
opportunities by looking at trends and information in the medium- to long-term future.
[Diagram: Horizon Scanning - Weak Signal Processing and Big Data Analytics applying Data Science to detect Socio-Demographic Culture Change, Disruptive Technology Shock Waves and Innovation Shock Waves]
Horizon Scanning is used as an overall term for discovering and analysing the future of the Human World (Politics, Economics, Sociology, Religion, Culture and War), considering how emerging trends and developments might potentially affect current policy and practice. This helps policy makers in government to take a longer-term strategic approach, and makes present policy more resilient to future uncertainty. In developing policy, Horizon Scanning can help policy makers to develop new insights and to think about 'outside of the box' solutions to human threats and opportunities.
In contingency planning and disaster response, Horizon Scanning helps to manage risk by discovering and planning ahead for the emergence of unlikely, but potentially high-impact Black Swan events. There is a range of Futures Studies philosophical paradigms and technological approaches, all supported by numerous methods, tools and techniques for developing and analysing possible, probable and alternative future scenarios.
Horizon Scanning, Tracking and Monitoring - Subjects
Economic Shock Waves: 1. Money Supply; 2. Commodity Price; 3. Sovereign Debt Default
Geopolitical Shock Waves: 1. Invasion / War; 2. Security / Civil Unrest; 3. Terrorism / Revolution
Environment Shocks: 1. Natural Disasters; 2. Global Catastrophes
Ecological Shocks: 1. Population Curve Growth / Collapse; 2. Extinction-level Events
(All tracked via Big Data Analytics.)
Horizon Scanning, Monitoring and Tracking Processes
Data Set Mashing and Big Data Global Content Analysis support Horizon Scanning, Monitoring and Tracking processes by taking numerous, apparently unrelated RSS and Data Feeds, along with other Information Streams, capturing unstructured Data and Information (Numeric Data, Text and Images), and loading this structured / unstructured data into Document and Content Database Management Systems and Very Large Scale (VLS) Dimension / Fact / Event Database Structures, to support both Historic and Future time-series Data Warehouses for interrogation using Real-time / Predictive Analytics.
These processes use Big Data to construct a Temporal View (4D Geospatial Timeline), including Predictive Analytics, Geospatial Analysis, Propensity Modelling and Future Management, that searches for and identifies Weak Signals: signs of possible hidden relationships in the data which reveal previously unknown Random Events (Wild Cards or Black Swans). Weak Signals are messages originating from these Random Events which may indicate global transformations unfolding as the future Temporal View (4D Geospatial Timeline) approaches, in turn predicating possible, probable and alternative Future Scenarios, Outcomes, Cycles, Patterns and Trends. Big Data Hadoop Clusters support Horizon Scanning, Monitoring and Tracking through the Hadoop Collect, Load, Stage, Map-Reduce and Publish process steps.
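The weak-signal test at the heart of this pipeline can be sketched in plain Python: flag any term whose frequency in the current scanning window spikes against its historic baseline. This is a naive illustration rather than part of any Hadoop toolchain; the function name and thresholds are invented for the sketch.

```python
from collections import Counter

def weak_signals(history, current, spike_ratio=3.0, min_count=3):
    """Flag terms whose share of the current scanning window spikes
    relative to their share of the historic baseline."""
    base, now = Counter(history), Counter(current)
    total_base = max(sum(base.values()), 1)
    total_now = max(sum(now.values()), 1)
    flagged = []
    for term, n in now.items():
        if n < min_count:
            continue  # too rare to trust even as a weak signal
        # terms unseen in the baseline get a floor of one occurrence
        base_rate = max(base[term], 1) / total_base
        if (n / total_now) / base_rate >= spike_ratio:
            flagged.append(term)
    return flagged

# Terms extracted from two scanning windows of an imaginary feed
baseline = ["oil"] * 50 + ["trade"] * 50
window = ["oil"] * 10 + ["drone"] * 5
print(weak_signals(baseline, window))  # ['drone']
```

A production system would replace the term counts with the document stores and Map-Reduce steps described above, but the ranking logic stays the same.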
Scenario Planning and Impact Analysis
[Diagram: Published and Discovered Scenarios and Use Cases are evaluated via Monte Carlo Simulation, Non-linear Models and Bayesian Analysis into Possible, Probable and Alternative Futures - Understand and Evaluate Scenarios through Impact Analysis and Profile Analysis]
Scenario Planning and Impact Analysis
Scenario Planning and Impact Analysis is the archetypical method for futures studies
because it embodies the central principles of the discipline:
The future is uncertain, so we must prepare for a wide range of possible, probable and alternative futures, not just the future that we desire (or hope) will happen.
It is vitally important that we think deeply and creatively about the future, or else we run the risk of being surprised by, unprepared for, or overcome by events (or all of these).
Scenarios contain the stories of these multiple futures, from the Utopian to the Dystopian, from the preferred to the expected, from the Wild Card to the Black Swan, in forms which are analytically coherent and imaginatively engaging. A good scenario grabs our attention and says, "Take a good look at this future. This could be your future. Are you prepared?"
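The quantitative side of Impact Analysis is often a Monte Carlo pass over the scenario set. A minimal sketch, assuming invented scenario probabilities and triangular impact ranges (neither drawn from any real study):

```python
import random

def expected_impact(scenarios, trials=20_000, seed=7):
    """Monte Carlo: each trial, sample which scenario unfolds from its
    probability, then draw its impact from a triangular (low, mode, high)
    distribution; return the mean impact across all sampled futures."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        r, cum = rng.random(), 0.0
        for prob, low, mode, high in scenarios:
            cum += prob
            if r < cum:
                total += rng.triangular(low, high, mode)
                break
    return total / trials

# (probability, low, most likely, high impact) - illustrative figures only
futures = [
    (0.7, 0, 5, 10),    # business-as-usual scenario
    (0.3, 20, 50, 80),  # disruptive Black Swan scenario
]
```

For these figures the analytic expectation is 0.7 x 5 + 0.3 x 50 = 18.5, and the simulation converges on it; the value of the Monte Carlo pass is the full impact distribution, not just the mean.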
As consultants and organizations have come to recognize the value of scenarios, they have also latched onto one scenario technique (a very good one, in fact) as the default for all their scenario work. That technique is the Royal Dutch Shell / Global Business Network (GBN) matrix approach, created by Pierre Wack in the 1970s and popularized by Schwartz (1991) in The Art of the Long View and Van der Heijden (1996) in Scenarios: The Art of Strategic Conversations. In fact, Millett (2003, p. 18) calls it "the gold standard of corporate scenario generation".
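The mechanics of the GBN matrix are simple enough to sketch: cross the two poles of the two most critical uncertainties to obtain four scenario skeletons. The axis names below are illustrative, not taken from Wack or Schwartz.

```python
from itertools import product

def scenario_matrix(axis_a, axis_b):
    """GBN-style 2x2: each axis is (name, (pole_1, pole_2)); crossing the
    poles yields the four scenario skeletons of the matrix approach."""
    (name_a, poles_a), (name_b, poles_b) = axis_a, axis_b
    return [{name_a: a, name_b: b} for a, b in product(poles_a, poles_b)]

quadrants = scenario_matrix(
    ("economic growth", ("strong", "stagnant")),
    ("technology regulation", ("open", "restrictive")),
)
# four skeletons, e.g. {'economic growth': 'strong', 'technology regulation': 'open'}
```

The hard work of the method is choosing which two uncertainties matter most and narrating each quadrant; the matrix itself is just this cross product.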
Strategic Foresight
[Diagram: Strategic Foresight methods wheel - 3. Futures Studies; 4. Narrative Methods; 5. Numerical Methods (Quantitative Techniques, Technical Analysis, Monte Carlo Simulation); 6. Horizon Scanning (Horizon Scanning, Tracking and Monitoring); 7. Weak Signals and Wild Cards (Waves, Cycles, Patterns, Trends); 8. Scenario Planning and Impact Analysis (Qualitative Techniques, Scenario Planning, Risk Management); 9. Economic Modelling (Economic Modelling and Econometric Analysis; Forecasting, Planning and Strategy Models); 10. Complex Systems and Chaos Theory; plus Strategic Disruption Analysis, Technology Convergence and Innovation, R&D + Strategy Discovery Workshops, Data Load and Model Trials, Tuning and History Matching, Study Inputs, Communicate and Evaluate]
Strategic Foresight - Methods
Digital Futures Studies Method: Creative Destruction (Technology Disruption)
Description: Creative Destruction (Technology Disruption) describes a "process of industrial mutation that constantly replaces the economic structure from within, incessantly destroying the old economy, incessantly creating a new economy in its place". 'Creative Destruction' (Disruption) is a term coined by Joseph Schumpeter in his Capital Theory work "Capitalism, Socialism and Democracy" (1942), in which he stated that "the process of creative destruction is the essence of capitalism". Creative Destruction occurs when the arrival and adoption of new methods of production effectively kills off older, established industries. An example of this is the introduction of personal computers in the 1980s: this new industry, founded by Microsoft and Intel, destroyed many mainframe computer companies. In doing so, technology entrepreneurs created one of the most important technologies of the last century. Microsoft and Nokia are today, in their turn, being destroyed as personal computers and laptops are replaced by smart phones and tablets from agile and innovative companies such as Apple and Samsung.
Pioneers and Leading Figures: Joseph Alois Schumpeter was an Austro-American economist and political scientist and a member of the Austrian (Real) School of Economics. His Capital Theory describes the flow of capital from older declining industries (cash cows) into new and emerging industries (rising stars). Schumpeter briefly served as Finance Minister of Austria during 1919; in 1932 he became a visiting professor at Harvard University, where he remained until the end of his career. In 1942 Schumpeter famously wrote in "Capitalism, Socialism and Democracy": "the harsh winds of creative destruction, which is the essence of all capitalism, are blown in on the gales of economic change".
Strategic Foresight - Methods
Digital Futures Studies Method: Disruptive Futurism
Description: Disruptive Futurism is an ongoing forward analysis of the impact of novel and emerging factors of Disruptive Change on the Environment, Politics, Economy, Society, Industry, Agronomy and Technology, and of how Business and Technology Innovation is driving Disruptive Change. It seeks to understand how current patterns, trends and extrapolations, along with emerging agents and catalysts of change, interact with chaos, disruption and uncertainty (Random Events) to create novel opportunities as well as posing clear and present dangers that threaten the very status quo of the world as we know it today. The corollary of this is to be found in the huge costs and lost opportunities of the innumerable abandoned technology innovation strategies, cancelled Research and Development programmes and failed product launches. Under-achievement by managers may be attributed to a lack of understanding of the dynamics, impact and effects of digital technology disruption.
Pioneers and Leading Figures: Disruptive Futurists: Prof Peter Cochrane, Iain Pearson, Jonathan Mitchner, David Brown, Ian Neild (BT Futures Laboratories); Sir Clive Sinclair; Sir Alan Sugar (Amstrad).
Strategic Foresight - Methods
Digital Futures Studies Method: Futures Studies
Description: Futures Studies, Foresight, or Futurology is the science, practice and art of postulating possible, probable, alternative and preferable futures. Futures Studies (colloquially called "Futures" by many of the field's practitioners) seeks to understand what is likely to continue, what is likely to change, and what is a novel and emerging pattern or trend. In some part, this discipline seeks a systematic pattern, cycle and trend analysis, a forward extrapolation-based understanding of both past and present events, in order to determine the probability and impact of unfolding future events, patterns and trends, and how they may be altered by chaos introducing disruption, randomness and uncertainty into future outcomes.
Pioneers and Leading Figures: Prof. Kees van der Heijden (Said Business School, University of Oxford, author of The Sixth Sense); Richard Slaughter, Pero Micic, Peter Bishop, Andy Hines, Wendy Schultz, John Smart, Jennifer Gidley, Marie Conway, Karen Marie Arvidsson.

Digital Futures Studies Method: Future Envisioning
Description: Future outcomes, goals and objectives are discovered via the Strategic Foresight analysis process, and determined by design, planning and management, so that the future becomes realistic and achievable. Possible futures may comply with our preferred options, and therefore our vision of an ideal future and desired outcomes could thus be fulfilled.
Pioneers and Leading Figures: Peter Bishop, Andy Hines, John Smart, Pero Micic, Wendy Schultz.

Digital Futures Studies Method: Horizon Scanning, Tracking and Monitoring for Future Events
Description: In order to anticipate a wide range of future business, economic, social and political Events, from micro-economic Market phenomena such as forecasting Market Sentiment and Price Curve movements, to large-scale macro-economic Fiscal phenomena, we can use Weak Signal processing to predict future Wild Card and Black Swan Events such as Commodity Price, Monetary System and Debt Default shocks.
Pioneers and Leading Figures: Weak Signals and Wildcards: Stephen Aguilar-Milan (1968), later popularised by Ansoff (1990).

Digital Futures Studies Method: Complexity Paradigm (related: Chaos Theory, Linear Systems, Complex Systems, Adaptive Systems, Simplexity Paradigm)
Description: Academic, scientific, social, economic, political and professional disciplines all have to address the problem of System Complexity in their fields: the behaviour of Complex Systems and Chaos Theory. The Complexity Paradigm is based on the science of turbulence, strange attractors, emergence and fractals, modelling complex behaviour using self-organisation and critical system complexity via non-linear equations with variable starting conditions, in the rich conceptual world of Complex Systems and Chaos Theory.
Pioneers and Leading Figures: Edward Lorenz, John Henry Holland, Edgar Morin, Jennifer Gidley, Karen Marie Arvidsson.
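The "variable starting conditions" point can be made concrete: in a chaotic non-linear system, two almost identical initial states diverge completely. The logistic map below is a standard Chaos Theory example, not specific to any foresight platform.

```python
def logistic_orbit(x0, r=4.0, steps=50):
    """Iterate the logistic map x -> r*x*(1-x); at r=4 the orbit is
    chaotic and critically sensitive to its starting condition x0."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

# Starting conditions differing by one part in a billion soon decorrelate.
a = logistic_orbit(0.2)
b = logistic_orbit(0.2 + 1e-9)
```

Within a few dozen iterations the two orbits bear no resemblance to each other, which is why long-range point forecasts of complex systems fail while ranges of scenarios remain useful.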
Strategic Foresight - Methods
Digital Futures Studies Method: Global Massive Change
Description: Global Massive Change is an evaluation of global capacities and limitations. It includes both utopian and dystopian views of the emerging world future state, in which climate, the environment and ecology are dominated by human population growth and manipulation of nature.
Pioneers and Leading Figures: Adam Smith, Thomas Malthus.
[Diagram: Strategic Foresight Digital Platform Lifecycle (PLAN, PREPARE, EXECUTE, REVIEW) with numbered method segments: Action Research Planning; Strategic Foresight (Data Load and Model Trials, Tuning and History Matching); Strategy Discovery (R&D + Strategy Discovery Workshops, Tech. Convergence and Innovation); Threat Analysis; Risk (Risk Management); Value Chain (Value Chain Analysis, Benefits Realisation); Forecast and Strategy (Forecasting and Strategy Models); Crystal Ball Report]
Thinking about the Future
Professors Peter Bishop and Andy Hines of the University of Houston-Clear Lake Futures Studies programme have developed a definitive Strategic Foresight Framework.
This important first step enables public and private sector organisations to define their Strategic Foresight Study and supporting SMACT/4D Digital Business Transformation purpose, focus, scope and boundaries across all of those Political, Legal, Economic, Cultural, Business and Technology problem / opportunity domains requiring resolution.
Taking time at the outset of a project, the Strategic Foresight Digital Transformation Team defines the Digital Study domain, discovers the principal strategy themes, outcomes, goals and objectives, and determines how we might best achieve them.
This may involve staging a wide range of Digital Strategy Programme launch and SMACT/4D Project kick-off initiatives: organising events for Strategy Discovery, Communications Channels, Target-setting and Stakeholder engagement planning, and establishing mechanisms for reporting actual achievement against targets, so that the Strategic Foresight Team engages a wide range of stakeholders, presents a future-oriented, customer-focussed approach and enables the efficient delivery of Digital Strategy Study artefacts and benefits in planned / managed work streams.
Once the Digital Strategic Foresight Team is clear about the Strategic Foresight engagement boundaries, purpose, problem / opportunity domains and scope of the SMACT/4D Digital Technology Study, they can begin to scan both internal and external sources for any relevant Disruptive Digital content: information describing Digital case studies, sources indicating Digital transformations, emerging and developing factors and global catalysts of Disruptive change, and extrapolations, patterns and trends, using Horizon Scanning, Tracking and Monitoring to search for, seek out and identify any Weak Signals of Disruptive Digital Technology indicating potential disruptive Wild Card / Black Swan events.
Here we begin to identify and extract useful information from the mass of the
Digital Research Content that we have searched for and collected. Critical
Success Factors, Strategy Themes and Value Propositions begin to emerge from
Data Set mashing, Data Mining and Analytics against the massed Research
Data which is all supplemented via the very human process of Cognitive
Filtering and Intuitive Assimilation of selected information - through Discovery
Workshops, Strategy Theme Forums, Digital Value Chain Seminars, SMACT/4D
Special Interest Group Events and one-to-one Key Stakeholder Interviews.
The challenge is to determine how much risk we are able to accept as we strive to grow
sustainable stakeholder value. Uncertainty presents both opportunity and risk with the
possibility of either erosion or enhancement of value. Strategic Foresight enables
stakeholders to deal effectively with uncertainty and associated risk and opportunity -
thus enhancing the capability of the Enterprise to build long-term economic value.
Risk Research -> Risk Identification -> Scenario Planning & Impact Analysis -> Risk Assessment -> Risk Prioritization -> Risk Management Strategies -> Risk Planning -> Risk Mitigation
For any given set of Risk Management Scenarios, a prioritization process ranks risks so that those with the greatest potential loss and the greatest probability of occurrence are handled first, while risks with a lower probability of occurrence and lower consequential losses are handled subsequently, in descending order of impact.
In practice this prioritization can be challenging: comparing and balancing the overall threat of risks with a high probability of occurrence but lower loss, versus risks with higher potential loss but lower probability of occurrence, can often be misleading.
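The ranking rule described here is expected loss: probability times consequence. A minimal sketch (the register entries and figures are invented) also makes the stated caveat visible, since the highest-loss Black Swan can rank below a routine risk:

```python
def prioritise_risks(register):
    """Rank risks by expected loss (probability x potential loss),
    greatest expected loss handled first."""
    return sorted(register, key=lambda r: r["probability"] * r["loss"], reverse=True)

register = [
    {"name": "Sovereign Debt Default", "probability": 0.05, "loss": 900},
    {"name": "Commodity Price Shock",  "probability": 0.40, "loss": 120},
    {"name": "Civil Unrest",           "probability": 0.15, "loss": 400},
]
ranked = prioritise_risks(register)
# Civil Unrest (60) > Commodity Price Shock (48) > Sovereign Debt Default (45):
# the highest-loss event ranks last, which is exactly why pure expected-loss
# prioritisation can mislead when Black Swans are on the register.
```

This is why the Black Swan discussion that follows argues for treating low-probability, high-impact events separately rather than leaving them to the ranking formula.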
Enterprise Risk Management
Scenario Planning and Impact Analysis: In any Opportunity / Threat Assessment Scenario, a prioritization process ranks those risks with the greatest potential loss and the greatest probability of occurring to be handled first; subsequent risks with lower probability of occurrence and lower consequential losses are then handled in descending order. As a foresight concept, Wild Card or Black Swan events refer to those events which have a low probability of occurrence, but an inordinately high impact when they do occur.
Risk Assessment and Horizon Scanning have become key tools in policy making
and strategic planning for many governments and global enterprises. We are now
moving into a period of time impacted by unprecedented and accelerating
transformation by rapidly evolving catalysts and agents of change in a world of
increasingly uncertain, complex and interwoven global events.
Scenario Planning and Impact Analysis have served us well as strategic planning tools for the last 15 years or so, but there are also limitations to this technique in this period of unprecedented complexity and change. In support of Scenario Planning and Impact Analysis, new approaches have to be explored and integrated into our risk management and strategic planning processes.
Back-casting and Back-sight: Wild Card or Black Swan events are ultra-extreme manifestations with a very low probability of occurrence, but an inordinately high impact when they do occur. In any post-apocalyptic Black Swan Event Scenario Analysis, we can use Causal Layered Analysis (CLA) techniques in order to analyse and review our Risk Management Strategies, with a view to identifying those Weak Signals which may have predicated subsequent appearances of unexpected Wild Card or Black Swan events.
Thinking about the Future
7. DIGITAL VALUE CHAIN MANAGEMENT
The prime activity in the Value Chain Management Process is, therefore, to challenge the status quo view and provoke the organisation into thinking seriously about the possibility that future conditions may not continue exactly as they have always unfolded before; in fact, future conditions very seldom remain unchanged.
The Strategic Foresight processes should therefore include searching for and identifying any potential Weak Signals predicating potential future Wild Card and Black Swan events, in doing so revealing previously hidden factors and catalysts of change, and thus exposing a much wider range of challenges, issues, problems, threats, opportunities and risks than may previously have been considered.
Scenarios are stories about how the future may unfold and how that future will
impact on the way that we work and do business with our staff, business partners,
customers and suppliers. The Digital Strategy Study considers a broad spectrum
of possible scenarios as the only sure-fire way to develop a Digital Technology
Platform that will securely position the Digital Transformation Programme with a
robust strategic response for every opportunity / threat scenario that may transpire.
The discovery of multiple scenarios and their associated opportunity / threat impact assessments, along with the probability of each one materialising, covers a wide range of possible and probable Opportunity / Threat situations, describing a broad spectrum and rich variety of POSSIBLE, PROBABLE and ALTERNATIVE FUTURES.
After Scenario Forecasting has laid out a range of potential Future Digital Business Scenarios, Strategy and Architecture envisioning comes into play, generating a pragmatic and useful Forward View of our preferred Future Business Environment and Digital Technology Platform, and thus starting to suggest a range of stretch goals for moving us forward towards our ideal Digital Forecasting and Strategy Models, utilising the Digital Strategic Principles and Policies to drive out the desired Vision, Missions, Outcomes, Goals and Objectives, all cross-referenced / mapped to the proposed Digital Business Architecture Scenarios and Use Cases and the Digital Technology Platform Epics and Stories, Solution Architecture and Component Catalogue.
Finally, the Digital Strategy and Business Transformation team migrates and transforms the desired Vision, Missions, Digital Strategy Themes, Outcomes, Goals and Objectives into the Strategic Digital Transformation Master Plan, Enterprise Landscape Models, Strategic Roadmaps and Transition Plans for organisational readiness, mobilisation and training, maintaining Strategic Foresight mechanisms (Digital Horizon Scanning, Monitoring and Tracking) to preserve our capacity and capability to respond quickly to fluctuations in internal and external environments.
This penultimate phase is about communicating results and developing action agendas for mobilising strategy delivery, through launching Business Programmes that will drive forwards towards the realisation of Strategic Master Plans and Future Business Models through Digital Business Transformation, Enterprise Portfolio Management, Technology Refreshment and Service Management, using Cultural Change, innovative multi-tier and collaborative Business Operating Models, Emerging Digital Technologies (IoT, Smart Devices, Smart Grid and Cloud Services), Business Process Re-engineering and Process Outsourcing (Onshore / Offshore).
In this final phase, we can now focus on Key Lessons Learned and on maintaining the flow of useful information from the Digital Strategic Foresight mechanisms and infrastructure into the Strategy and Planning Team (our new 'Digital Village'), in order to support an ongoing lean and agile capability to continually and successfully respond to the volatile and dynamic internal and external business and technology environment: continuing Disruptive Futures Studies, Digital Strategy Reviews, Horizon Scanning, Economic Modelling and Econometric Analysis, and long-range Forecasting and Business Planning.
We can now also prepare for the launch of the next iteration of the Digital Strategy Cycle, beginning again with re-launching Phase 1: Digital Strategy Study Framing and Scoping.
Strategy Review: -
Revised Digital Strategy Themes, Outcomes, Goals, Objectives and Requirements
Disruptive Business and Technology Futures Studies and Digital Strategy Reviews
Horizon Scanning, Monitoring and Tracking Systems - Reviewed Models
Economic Modelling and Econometric Analysis Systems - Reviewed Models
Business Planning and long-range Economic Forecasting - Reviewed Models
Reviewed Digital Business Models and Value Propositions, Products and Services
Reviewed Digital Technology Platform Solution / Component Architecture
Reviewed Digital Market Value Proposition Epics and Stories
Reviewed Digital Customer Experience and Journey Scenarios and Use Cases
The Crystal Ball Report is designed to become the shared vision reference point, where all
stakeholders can see how their needs and functions are addressed and how they add value
to the overall corporate plan - keeping everyone in the boat, rowing in the same direction.
Horizon Scanning
[Figure: Horizon Scanning (Human Activity) and Environment Scanning (Natural Phenomena) cycle - Scan and Identify, Discover, Evaluate, Investigate and Research, Understand, Communicate]
Horizon Scanning
Horizon Scanning is an important technique for establishing a sound knowledge
base for planning and decision-making. Anticipating and preparing for future
uncertainty - threats, challenges, opportunities, patterns, trends and extrapolations - is
an essential core component of any organisation's long-term sustainability strategy.
Horizon Scanning may explore novel and unexpected issues as well as persistent
problems or trends. The government's Chief Scientific Adviser is encouraging
Departments to undertake horizon scanning in a structured and auditable manner.
Horizon Scanning enables organisations to anticipate and prepare for new risks and
opportunities by looking at trends and information in the medium- to long-term future.
[Figure: Horizon Scanning inputs - Socio-Demographic Culture Change and Disruptive Technology Innovation shock waves feed Weak Signal Processing, Data Science and Big Data Analytics within the Horizon Scanning process]
Horizon Scanning is used as an overall term for discovering and analysing the future of
the human world (Politics, Economics, Sociology, Religion, Culture and War),
considering how emerging trends and developments might potentially affect current policy
and practice. This helps policy makers in government to take a longer-term strategic
approach, and makes present policy more resilient to future uncertainty. In developing
policy, Horizon Scanning can help policy makers to develop new insights and to think
about "outside of the box" solutions to human threats and opportunities.
In contingency planning and disaster response, Horizon Scanning helps to manage risk
by discovering and planning ahead for the emergence of unlikely, but potentially
high-impact, "Black Swan" events. There is a range of Futures Studies philosophical
paradigms and technological approaches, all supported by numerous
methods, tools and techniques for developing and analysing possible, probable and
alternative future scenarios.
Horizon Scanning, Tracking and Monitoring - Subjects
Economic Shock Waves: 1. Money Supply; 2. Commodity Price; 3. Sovereign Debt Default
Geopolitical Shock Waves: 1. Invasion / War; 2. Security / Civil Unrest; 3. Terrorism / Revolution
Environment Shocks: 1. Natural Disasters; 2. Global Catastrophes
Ecological Shocks: 1. Population Curve Growth / Collapse; 2. Extinction-level Events
(All subject feeds are processed through Big Data Analytics.)
Horizon Scanning, Tracking and
Monitoring Processes
HORIZON SCANNING, MONITORING and TRACKING
Data Set Mashing and Big Data Global Content Analysis support Horizon
Scanning, Monitoring and Tracking processes by taking numerous, apparently unrelated
RSS and data feeds, along with other information streams; capturing unstructured data
and information (numeric data, text and images); and loading this structured / unstructured
data into Document and Content Database Management Systems and Very Large Scale
(VLS) Dimension / Fact / Event database structures to support both historic and future
time-series Data Warehouses for interrogation using real-time / predictive analytics.
These processes use Big Data to construct a Temporal View (4D Geospatial Timeline),
including Predictive Analytics, Geospatial Analysis, Propensity Modelling and Future
Management, which search for and identify Weak Signals - signs of possible
hidden relationships in the data - to discover and interpret previously unknown Random
Events ("Wild Cards" or "Black Swans"). Weak Signals are messages originating from
these Random Events which may indicate global transformations unfolding as the future
Temporal View (4D Geospatial Timeline) approaches, in turn predicating possible,
probable and alternative Future Scenarios, Outcomes, Cycles, Patterns and Trends. Big
Data Hadoop clusters support Horizon Scanning, Monitoring and Tracking through the
Hadoop Collect, Load, Stage, MapReduce and Publish process steps.
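The Collect, Stage, Map, Reduce and Publish steps named above can be sketched in miniature. The following illustrative Python snippet - the feed items and terms are invented for the example, not drawn from any real data source - mimics the Map and Reduce phases over captured feed text to surface the most-mentioned topics:

```python
from collections import Counter
from itertools import chain

# Collect: hypothetical feed items standing in for captured RSS / data streams.
feed_items = [
    "commodity price shock in energy markets",
    "energy price volatility and supply shock",
    "civil unrest follows commodity price rises",
]

def map_phase(item):
    # Map: emit (term, 1) pairs for each captured text fragment.
    return [(term, 1) for term in item.lower().split()]

def reduce_phase(pairs):
    # Reduce: aggregate counts per term across all feed items.
    counts = Counter()
    for term, n in pairs:
        counts[term] += n
    return counts

staged = list(chain.from_iterable(map_phase(i) for i in feed_items))  # Stage
published = reduce_phase(staged)                                      # Publish
print(published.most_common(3))
```

A real Hadoop deployment distributes the Map and Reduce phases across a cluster, but the data flow is the same shape as this single-process sketch.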
Scenario Planning and Impact Analysis
[Figure: Scenario Planning and Impact Analysis - published and discovered possible / probable scenarios and use cases are evaluated with deterministic and stochastic techniques (Monte Carlo Simulation, Non-linear Models, Bayesian Analysis) and profiled through Impact Analysis to yield probable and alternative futures]
Scenario Planning and Impact Analysis
Scenario Planning and Impact Analysis is the archetypical method for futures studies
because it embodies the central principles of the discipline:
The future is uncertain - so we must prepare for a wide range of possible, probable
and alternative futures, not just the future that we desire (or hope) will happen.....
It is vitally important that we think deeply and creatively about the future, or else we run
the risk of being surprised by, unprepared for, or overcome by events - or all of these.....
Scenarios contain the stories of these multiple futures - from the Utopian to the Dystopian,
from the preferred to the expected, from the Wild Card to the Black Swan - in forms which
are analytically coherent and imaginatively engaging. A good scenario grabs our attention
and says, "Take a good look at this future. This could be your future - are you prepared?"
As consultants and organizations have come to recognize the value of scenarios, they
have also latched onto one scenario technique - a very good one, in fact - as the default
for all their scenario work. That technique is the Royal Dutch Shell / Global Business
Network (GBN) matrix approach, created by Pierre Wack in the 1970s and popularized by
Schwartz (1991) in The Art of the Long View and Van der Heijden (1996) in Scenarios: The
Art of Strategic Conversations. In fact, Millett (2003, p. 18) calls it "the gold standard of
corporate scenario generation".
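As an illustration of the GBN matrix approach - the two critical uncertainties below are hypothetical, chosen only to show the mechanics - crossing the polar outcomes of two uncertainties yields the classic four-scenario matrix:

```python
from itertools import product

# Two hypothetical critical uncertainties, each with two polar outcomes,
# spanning the Shell / GBN 2x2 scenario matrix.
uncertainties = {
    "energy price": ["low", "high"],
    "regulation": ["light-touch", "strict"],
}

def scenario_matrix(uncertainties):
    """Cross the poles of each uncertainty to yield four scenario skeletons."""
    axes = list(uncertainties.items())
    names = [name for name, _ in axes]
    return [dict(zip(names, poles)) for poles in product(*(p for _, p in axes))]

for s in scenario_matrix(uncertainties):
    print(s)
```

Each of the four skeletons then becomes the seed of a full narrative scenario; the matrix itself only guarantees that the set spans both axes of uncertainty.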
Weak Signals and Wild Cards
[Figure: Weak Signal / Wild Card signal processing cycle - Random Events are scanned and identified, discovered, tracked and monitored, evaluated, investigated and researched, understood, and the resulting Weak, Strong and Wild Card signals are published, communicated and socialised]
Weak Signals and Wild Cards
"Wild Card" or "Black Swan" manifestations are extreme and unexpected
events which have a very low probability of occurrence, but an inordinately
high impact when they do happen. Trend-making and trend-breaking agents
or catalysts of change may predicate, influence or cause Wild Card events
which are very hard - or even impossible - to anticipate, forecast or predict.
Outsights "21 Drivers for the 21st Century"
Scenarios are specially constructed stories about the future - each one portraying
a distinct, challenging and plausible world in which we might one day live and
work - and for which we need to anticipate, plan and prepare.
The Outsights Technique helps stakeholders stand back, take stock and seek fresh
points of view, translating what is learnt into action to achieve sustainable change
and risk management: -
Steps
Steps
1. Participants are given a scope, focus and time horizon for the exercise.
2. Horizon Scanning, Monitoring and Tracking and Monte Carlo Simulations provide
sources of information. These data sets can come from internal or external sources -
Data Scientists, Domain Experts and Researchers, Big Data Analysts, the project
team - or from prior studies and data collection exercises by the individual team
participants. These should cover a broad external analysis, such as STEEP.
3. Individuals review the sources and spot items that cause personal insights on the
focus given. These insights and their sources are captured in the form of abstracts.
4. Abstracts are discussed and themed to indicate wave-forms over the time horizon
concerned. Scenarios are stacked, racked and prioritised by impact and probability.
5. The participants agree on how to address the resulting Scenarios, Waves, Cycles,
Patterns and Trends with supporting information for further futures analysis.
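Step 4's prioritisation by impact and probability can be sketched as follows; the scenario names and the 1-5 scores are invented for illustration, not part of the Outsights material:

```python
# Hypothetical scored scenarios: impact and probability on 1-5 scales,
# as might be produced in step 4 of the Outsights exercise.
scenarios = [
    {"name": "Energy shock", "impact": 5, "probability": 2},
    {"name": "Middle-class boom", "impact": 3, "probability": 4},
    {"name": "Urban migration surge", "impact": 4, "probability": 4},
]

def prioritise(scenarios):
    # Rank by expected impact (impact x probability), highest first.
    return sorted(scenarios, key=lambda s: s["impact"] * s["probability"], reverse=True)

for s in prioritise(scenarios):
    print(s["name"], s["impact"] * s["probability"])
```

A simple product is only one possible ranking rule; a workshop may instead keep low-probability, high-impact Wild Cards in a separate watch list rather than letting the multiplication bury them.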
More information about tools and uses of horizon scanning in Central Government can be
found on the Foresight Horizon Scanning Centre website.
Seeing in Multiple Horizons: -
Connecting Strategy to the Future
THE THREE HORIZONS MODEL describes a Strategic Foresight method called "Seeing in Multiple
Horizons: Connecting Strategy to the Future". The current Three Horizons model differs
significantly from the original version first described in the management literature over a decade ago.
This model enables a range of Futures Studies techniques to be integrated with Strategy Analysis
methods in order to reveal powerful and compelling future insights, and may be deployed in various
combinations, whenever or wherever the Futures Studies techniques and Strategy Analysis methods
are deemed to support the futures domains, subjects, applications and data in the current study.
THE THREE HORIZONS MODEL method connects the Present Timeline with deterministic (desired or
proposed) futures, and also helps us to identify probabilistic (forecast or predicted) future scenarios
which may emerge as a result of interaction between embedded present-day factors and emerging
catalysts of change, thus presenting us with a range of divergent possible futures. The Three
Horizons method connects to models of change developed within the Social Shaping Strategy
Development Framework via the Action Link to Strategy Execution. Finally, it summarises a number of
futures applications where this evolving technique has been successfully deployed.
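As an illustrative sketch only - the curve shapes below are an assumption, not part of the published model - the three horizons can be pictured as weighting curves over a normalised timeline: Horizon 1 (the dominant present) declines, Horizon 3 (the emerging future) rises, and Horizon 2 captures the transition between them:

```python
import math

def horizons(t):
    """Illustrative weights for H1 (declining), H3 (rising) and H2
    (the transition between them) at normalised time t in [0, 1]."""
    h3 = 1.0 / (1.0 + math.exp(-10.0 * (t - 0.6)))  # emerging future rises
    h1 = 1.0 / (1.0 + math.exp( 10.0 * (t - 0.4)))  # dominant present declines
    h2 = 1.0 - h1 - h3                               # transitional activity
    return h1, h2, h3

for t in (0.0, 0.5, 1.0):
    h1, h2, h3 = horizons(t)
    print(f"t={t:.1f}  H1={h1:.2f}  H2={h2:.2f}  H3={h3:.2f}")
```

The steepness and crossover points of the logistic curves are arbitrary here; in practice the rates of change differ across parts of the system, as the text notes.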
The new approach to Seeing in Multiple Horizons: - Connecting Strategy to the Future has several
unique features. It can relate change drivers and trends-based futures analysis to emerging issues. It
enables policy or strategy implications of futures to be identified and links futures work to processes
of change. In doing so this enables Foresight to be connected to existing and proposed underlying
system domains and data structures, with different rates of change propagation impacting across
different parts of the system, and also to integrate seamlessly with tools and processes which facilitate
Strategic Analysis. This approach is especially helpful where there are complex transformations which
are likely to be radically disruptive in nature - rather than simple incremental transitions.
Seeing in Multiple Horizons: -
Connecting Strategy to the Future
The Three Horizons
All of this external data - found widely distributed across the internet as Global Content,
RSS news feeds and data streams, academic research papers and datasets - is
processed in order to detect and identify the possibility of unfolding random events and
clusters, "to systematically reduce the level of exposure to uncertainty, to reduce risk
and gain future insights in order to prepare for adverse future conditions or to exploit
novel and unexpected opportunities for innovation" (Lesca, 1994). As a management
support tool for strategic decision-making, horizon and environment scanning processes
have some very special challenges that need to be taken into account by environment /
horizon scanners, researchers, data scientists and analysts - as well as stakeholders.
Horizon and Environment Scanning,
Tracking and Monitoring Processes
Horizon Scanning (Human Activity Phenomena) and Environment Scanning (Natural
Phenomena) are the broad processes of capturing input data to drive futures projects and
programmes - but they also refer to specific futures studies tool sets, as described below.
Individuals use sources to draw insights and create abstracts of the source, then share
these with other participants. Horizon scanning lays a platform for further futures activities
such as scenarios or roadmaps. This builds strategic analysis capabilities and informs
strategy development priorities. Once uncovered, such insights can be themed as key
trends, assessed as drivers or used as contextual information within a scenario narrative.
The graphic image below illustrates how horizon scanning is useful in driving Strategy
Analysis and Development: -
Strategy versus Horizon Scanning
Horizon and Environment Scanning,
Tracking and Monitoring Processes
Horizon Scanning, Tracking and Monitoring is the major input for unstructured Big Data to
be introduced into the Scenario Planning and Impact Analysis process (along with Monte
Carlo Simulation and other probabilistic models providing structured data inputs). In this
regard, Scenario Planning and Impact Analysis helps to create a conducive team working
environment. It allows consideration of a broad spectrum of input data beyond the usual
timescales and sources, drawing information together in order to identify future challenges,
opportunities and trends. It looks for evidence at the margins of current thinking as well as in
more established trends. This allows the collective insights of the group to be integrated -
demonstrating the many differing ways in which diverse sources contribute to these insights.
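A minimal sketch of the kind of Monte Carlo Simulation mentioned above, assuming a single hypothetical driver (annual demand growth) with an invented mean and spread, compounded over ten years to produce a probable range of end states:

```python
import random

random.seed(42)  # fixed seed so the sketch is repeatable

# Hypothetical driver: annual demand growth, normally distributed
# (mean 2%, s.d. 3%), compounded over ten years.
def simulate_paths(trials=10_000, years=10, mu=0.02, sigma=0.03):
    outcomes = []
    for _ in range(trials):
        level = 1.0
        for _ in range(years):
            level *= 1.0 + random.gauss(mu, sigma)
        outcomes.append(level)
    return outcomes

outcomes = sorted(simulate_paths())
# Probable range: the middle 80% of simulated end states.
low, high = outcomes[len(outcomes) // 10], outcomes[-len(outcomes) // 10]
print(f"80% of paths end between {low:.2f}x and {high:.2f}x of today's level")
```

The resulting percentile band is one structured input a scenario team can set alongside the unstructured Weak Signal material when framing probable versus alternative futures.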
Horizon Scanning, Tracking and Monitoring is ideal as an initial activity for collecting Weak
Signal data input to kick off major futures studies projects and Future Management
programmes. Scenario Planning and Impact Analysis is also useful as a sense-making and
interaction tool for an integrated future-focused team. Horizon Scanning, Tracking and
Monitoring combined with Scenario Planning and Impact Analysis works best if people
external to the organisation are included in the team - and are encouraged to help bring
together new and incisive perspectives.
The graphic image below illustrates how horizon scanning is useful in spotting weak signals
that might be otherwise difficult to see and so risk being overlooked: -
Seeing in Multiple Horizons
Horizon Scanning, Tracking and
Monitoring Processes
Horizon Scanning, Tracking and Monitoring is a systematic search and examination
of global internet content - BIG DATA information - which is gathered, processed and
used to identify potential threats, risks, emerging issues and opportunities arising from
Human Activity, allowing for the incorporation of mitigation and exploitation themes into
the policy-making process, as well as improved preparation for business continuity,
contingency planning and disaster response, and enterprise risk management events.
Horizon Scanning is used as an overall term for discovering and analysing the unfolding
future of the human world (Politics, Economics, Sociology, Religion, Culture and War),
considering how emerging trends and developments might potentially affect current policy
and practice. This helps policy makers in government to take a longer-term strategic
approach, and makes present policy more resilient to future uncertainty. In developing
policy, Horizon Scanning can help policy makers to develop new insights and to think
about "outside of the box" solutions to human activity threats and opportunities.
In contingency planning and disaster response, Horizon Scanning helps to manage risk
by discovering and planning ahead for the emergence of unlikely, but potentially
high-impact, "Black Swan" events. There is a wide range of Futures Studies philosophical
paradigms and technology approaches, all supported by numerous methods,
tools and techniques for exploring possible, probable and alternative future scenarios.
Horizon and Environment Scanning,
Tracking and Monitoring Processes
Horizon and Environment Scanning event types refer to Weak Signals of any unforeseen,
sudden and extreme global-level transformations or changes - Future Events in the military,
political, social, economic or environmental landscape - having an inordinately low probability of
occurrence, coupled with an extraordinarily high impact when they do occur (Nassim Taleb).
Weak Signals are messages - subliminal temporal indicators of ideas, patterns, trends or
random events coming to meet us from the future, or signs of novel and emerging desires,
thoughts, ideas and influences - which may interact with both current and pre-existing patterns
and trends to predicate, impact or effect some change in our present or future environment.
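One simple, illustrative way to flag such a Weak Signal - the mention counts and the z-score threshold below are assumptions, not part of the source - is to test whether the latest observation in a scanned topic's time series deviates strongly from its own history:

```python
from statistics import mean, stdev

# Hypothetical weekly mention counts for one scanned topic; the final
# week spikes, which a simple z-score test flags as a weak signal.
mentions = [4, 5, 3, 6, 4, 5, 4, 3, 5, 14]

def weak_signal(series, threshold=3.0):
    """Flag the latest observation if it deviates strongly from history."""
    history, latest = series[:-1], series[-1]
    z = (latest - mean(history)) / stdev(history)
    return z, z > threshold

z, flagged = weak_signal(mentions)
print(f"z = {z:.1f}, weak signal: {flagged}")
```

Real scanning pipelines use far richer models than a single z-score, but the principle is the same: a signal is "weak" precisely because it barely rises above the noise floor of its history.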
Outsights "21 Drivers for the 21st Century"
1. War, terrorism and insecurity
2. Layers of power
3. Economic and financial stability
4. BRICs and emerging powers - Brazil, Russia, India, China
5. The Five Flows of Globalisation - Ideas, Goods, People, Capital, Services
6. Intellectual Property and Knowledge
7. Health, Wealth and Wellbeing
8. Demographics, Ethnographics and Social Anthropology - Transhumanism
9. Population Drift, Migration and Mobility
10. Trust and Reputation
11. Human Values and Beliefs
12. History, Culture and Human Identity
13. Consumerism and the rise of the Middle Classes
14. Networks and Social Connectivity
15. Space - the final frontier - the Cosmology Revolution
16. Science and Technology Futures - the Nano, Quantum, Information, Bio-Technology and Energy Revolutions (Oil Shale Kerogen, Tar Sands, Methane Hydrate, Nuclear Fusion)
17. Science and Society - the Social Impact of Technology
18. Natural Resources - availability, scarcity and control
19. Climate Change - Global Massive Change, the Climate Revolution
20. Environmental Degradation and Mass Extinction
21. Urbanisation
The ELTVILLE MODEL consists of a process model that explores and describes, in turn, six
different viewpoints or perspectives of the future (the six "futures lenses") in a sequence
of analytical steps for exploration and discovery in a workshop environment, together with a
futures outputs model, or framework, which captures the results generated as "thought objects".
The ELTVILLE MODEL helps us all to structure our future scenarios and thoughts
about future outcomes to formulate future strategy in a coherent way, without omitting
any important determining factors or neglecting any essential viewpoints.
The ELTVILLE MODEL helps us to obtain some clarity on the most important Future
Management outcomes, goals and objectives, and to communicate in a clear narrative
about the future of our market and our company's place in that market.
Using phenomenon-based scenario planning and impact analysis, the ELTVILLE FUTURE
MANAGEMENT MODEL has been proven in more than a thousand projects. The Future
Management Group has defined the essential meaning of Future Management terms and
their key applications, delivering a cognitive model and a cognitive map from them.
The ELTVILLE MODEL helps us all to apply the common Strategy Analysis and Strategic
Foresight tools much more effectively within a comprehensive Futures Framework. This
model also provides participants with a road map for thinking and communicating about
the future with your stakeholders and an integrated future-oriented structure for managing
strategy delivery projects.
The SIX futures lenses below make it easier to analyse and understand the future: -
1. BLUE lenses are for the PROBABILISTIC FUTURE - RATIONAL FUTURISTS
2. RED lenses are for FUTURE THREATS - DISRUPTIVE FUTURISTS
3. GREEN lenses are for FUTURE OPPORTUNITIES - EVOLUTIONARY FUTURISTS
4. GOLD lenses are for the DESIRED FUTURE VISION - GOAL ANALYSTS
5. INDIGO lenses are for the STEADY-STATE FUTURE - EXTRAPOLATION and PATTERN ANALYSTS
6. VIOLET lenses are for the DETERMINISTIC FUTURE - STRATEGIC POSITIVISTS
THE ELTVILLE MODEL by Pero Mićić
The Eltville Model of Future Management is used by companies and public institutions to
support thinking and communicating about future environmental changes, the early
recognition of future markets, the development of future strategies and the building up of
future competence with a sound system of terms. The Eltville Model provides a
comprehensive and integrated terminology. It links the requirements of scientific future
management with the necessities of a company's day-to-day business.
The ELTVILLE MODEL has been developed through futures research in more than a
thousand workshops and projects with governmental and non-profit organizations as well
as with major corporations around the world, including BOSCH, Microsoft, BAYER,
AstraZeneca, Roche, Ernst & Young, Ford, Vodafone, EADS and Nestlé.
The purpose of the Disruptive Futurist role is to provide future analysis and strategic
direction to support senior client stakeholders who are charged by their organisations with
thinking about the future. This involves enabling clients to anticipate, prepare for and
manage the future by helping them to understand how the future might unfold, thus
realising the Stakeholder Strategic Vision and the Communications / Benefits Realisation
Strategies. This is achieved by scoping, influencing and shaping client organisational
change and driving technology innovation to enable rapid business transformation.
4. The future will evolve from a series of actions and events which emerge, unfold and
develop and then plateau, decline and collapse. These actions and events are
essentially natural responses to human impact on ecological and environmental support
systems - creating massive global change through population growth, environmental
degradation and scarcity of natural resources. Over the long term, global stability and
sustainability of those systems will be preserved at the expense of world-wide human
population levels.
The shape of the future may thus be created by the powerful and influential - the good
and the great - and may be discovered via Goal Analysis and interpretation of the
policies, behaviours and actions of such individuals, along with those of the think-tanks,
policy groups and political institutions that they belong to, subscribe to and follow.
Throughout eternity, all that is of like form comes around again; everything that is the
same must return again in its own everlasting cycle.....
As the future-present develops and unfolds, it does so as a continuum of time past, time
present and time future - eternally perpetuating the unfolding, extension, replication and
preservation of those historic cycles, patterns and trends that have shaped and
influenced actions and events throughout time.
The future may develop and unfold so as to comply with our positive vision of an ideal future
and thus fulfil all of our desired outcomes, goals and objectives in order that the planned
future becomes attainable and our preferred future options may ultimately be realised.
The results of spatial data analysis are largely dependent upon the type,
quantity, distribution and data quality of the spatial objects under analysis.
The Minkowski Space-Time Continuum
During 1907, in an attempt to understand the earlier work of Lorentz and Einstein, the
German mathematician Hermann Minkowski designed a radical four-dimensional view of
the Universe - the space-time continuum. Classical (Newtonian) physics describes a
three-dimensional vector co-ordinate system defining Space (position); Space and the
flow of Time (history), the other universal dimension, were considered to exist
independently until Minkowski's synthesis of the space-time continuum.
Complex Systems and Chaos Theory
Complex Systems and Chaos Theory have been used extensively in the fields
of Futures Studies, Strategic Management, the Natural Sciences and Behavioural
Science. They are applied in these domains to understand how individuals within
populations, societies, economies and states act as a collection of loosely
coupled interacting systems which adapt to changing environmental factors
and to random events - whether bio-ecological, socio-economic or geo-political.
One of the problems in addressing complexity issues has always been distinguishing between
the large number of elements (components) and relationships (interactions) evident in chaotic
(unconstrained) systems - Chaos Theory - and the still large, but significantly smaller, number
of both elements and interactions found in ordered (constrained) Complex Systems.
Orderly System Frameworks tend to dramatically reduce the total number of elements and
interactions with fewer and smaller classes of more uniform elements and with reduced,
sparser regimes of more restricted relationships featuring more highly-ordered, better internally
correlated and constrained interactions as compared with Disorderly System Frameworks.
[Figure: complexity spectrum - Void; Unconstrained Complexity; Complex Adaptive Systems (CAS); Constrained Complexity; Non-linear Systems; Linear Systems]
The results of geospatial analytics are fully dependent on the type, location,
data sample size - and data quality of the geospatial objects being studied.
4D Geospatial Analytics
[Figure: the Futures Cone - from the Past through the Present, branching into the Possible Future, the Probable Future and the Preferred Future (Desired Outcomes, Goals and Objectives)]
4D Geospatial Analytics The Temporal Wave
The problems encountered in exploring, analysing and extracting insights from the vast
volumes of spatial-temporal information in today's data-rich landscape are becoming
increasingly difficult to manage effectively. Overcoming the problem of data volume
and scale in an integrated Time (history) and Space (location) context requires
not only the traditional location-space and attribute-space analysis common in GIS Mapping
and Spatial Analysis, but now also the additional dimension of Space-Time analysis. The
Temporal Wave supports a new method of Visual Exploration for Geospatial (location)
data within a Temporal (timeline) context. The Temporal Wave is a novel and innovative
method for Visual Modelling, Exploration and Analysis of the Space-Time dimension
fundamental to understanding Geospatial Big Data - simultaneously visualising
and displaying complex data within a Time (history) and Space (geographic) context.
If all future timelines were linear, then every event would unfold in an unerringly
predictable manner towards a known and certain conclusion. The future is,
however, both unknown and unknowable (the Hawking Paradox). Future outcomes
are uncertain; future timelines are non-linear (branched), with a multitude of
possible alternative futures. Chaos Theory suggests that even the most
subliminal inputs, originating from unknown forces so minute as to be
undetectable, might become amplified through numerous system cycles to grow
in influence and impact over time - deviating Space-Time trajectories far away
from their original predicted path and so fundamentally altering the outcome of
future events.
The Flow of Information through Time
Space-Time is a four-dimensional (4D) continuum consisting of the three Spatial
dimensions (x, y and z axes) plus Time (the fourth dimension, t). The arrow of
time governs the flow of Space-Time, which flows relentlessly in a single
direction - towards the future. Every item of Global Content in the Present
is somehow connected with both Past and Future temporal planes in a timeline
composed of a sequence of temporal planes stacked one on top of another.
Space-Time does not flow uniformly; the arrow of time may be warped or
deflected by various factors - gravitational fields, dark matter, dark energy, dark
flow, hidden dimensions or unknown Membranes in Hyperspace. There may
exist hidden external forces (unseen interactions) that create disturbance in the
temporal plane stack which marks the passage of time - with the potential to
create eddies, vortices and whirlpools along the trajectory of Time (chaos,
disorder and uncertainty), which in turn possess the capacity to generate ripples
and waves (randomness and disruption), thus changing the course of the
Space-Time continuum. Weak Signals are Ghosts in the Machine -
echoes of these subliminal temporal interactions that may contain
insights or clues about possible future Wild Card or Black Swan random
events.
The profiling and analysis of large aggregated datasets in order to determine a natural
structure of data relationships or groupings is an important starting point, forming the
basis of many mapping, statistical and analytic applications. Cluster analysis of implicit
similarities - such as time-series, demographic or geographic distribution - is a critical
technique where no prior assumptions are made concerning the number or type of
groups that may be found, or their relationships, hierarchies or internal data structures.
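The clustering idea above can be sketched in code. Below is a minimal k-means implementation - illustrative only, since k-means does assume a group count k, whereas fuller cluster analysis would also infer the number of groups from the data. The coordinates and cluster locations are invented for the example:

```python
import numpy as np

def kmeans(points, k, iters=50, seed=0):
    """Minimal k-means: partition points into k natural groupings."""
    rng = np.random.default_rng(seed)
    centroids = points[rng.choice(len(points), size=k, replace=False)]
    for _ in range(iters):
        # assign every point to its nearest centroid
        dists = np.linalg.norm(points[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # move each centroid to the mean of its assigned points
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = points[labels == j].mean(axis=0)
    return labels, centroids

# two synthetic, well-separated "geographic" clusters of (lon, lat) points
rng = np.random.default_rng(1)
group_a = rng.normal(loc=(-0.1, 51.5), scale=0.05, size=(100, 2))  # hypothetical cluster
group_b = rng.normal(loc=(-2.2, 53.5), scale=0.05, size=(100, 2))  # hypothetical cluster
points = np.vstack([group_a, group_b])
labels, centroids = kmeans(points, k=2)
```

With clusters this well separated, the two synthetic groups are recovered cleanly; on real geodemographic data the choice of distance measure and of k is itself part of the analysis.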
Geospatial and demographic techniques are frequently used in order to profile and
segment populations by natural groupings. Shared characteristics or common factors -
such as Behaviour / Propensity, or Epidemiology, Clinical, Morbidity and Actuarial
outcomes - allow us to discover and explore previously unknown, unrecognised or
concealed insights, patterns, trends or data relationships. "Big Data" sources include: -
Greater London covers 600 square miles. Up until the 17th century, however, the capital city
was crammed largely into a single square mile, which today is marked by the skyscrapers of
the City's financial district. Unlike other historic cities such as Athens or Rome, with an
obvious patchwork of districts from different periods, London's individual structures,
scheduled sites and listed buildings were in many cases constructed gradually, from parts
assembled during different periods. Researchers who have previously tried to locate and
document archaeological structures and research historic references will know that these
features, when plotted, appear scrambled - like pieces of different jigsaw puzzles all
scattered across the contemporary London cityscape.
This visualisation, originally created for the Almost Lost exhibition by the Bartlett Centre for
Advanced Spatial Analysis (CASA), explores the historic evolution of the city by plotting a
timeline of the development of the road network - along with documented buildings and other
features through 4D geospatial analysis of a vast number of diverse geographic,
archaeological and historic data sets.
4D Geospatial Analytics
Geo-spatial and geodemographic techniques are frequently used to profile, stream and
segment human populations using natural groupings - such as shared or common
behavioural traits, or Medical, Clinical Trial, Morbidity or Actuarial outcomes - along
with many other common factors and shared characteristics.....
Social Intelligence - Lifestyle Understanding
Multiple Pyramids can be created and cross-referenced using Social Intelligence and Brand
Interaction / Fan-base Profiling and Segmentation in order to deliver actionable insights for any
genre of Brand Loyalty and Lifestyle Understanding as well as for other Geo-demographic
Analytics purposes e.g. Digital Healthcare, Clinical Trials, Morbidity and Actuarial Outcomes: -
Social Intelligence drives Brand Loyalty Understanding - Fan-base Profiling, Streaming and Segmentation
expressed in the creation and maintenance of a detailed History and Balanced Scorecard for every individual in
the Pyramid, allowing summation by Stream / Segment: -
1. Fanatics - demonstrate total Commitment / Dedication / Loyalty to all aspects of the Brand / Product / Media
2. Supporters - show a strong need, desire and propensity to support Brand / Product / Media consumption
3. Enthusiasts - engage with the Brand, participate in Brand / Product / Media events and merchandising
4. Followers - follow the Brand, engage with social media and consume brand communications
5. Casuals - exhibit Brand awareness and interest
6. Disconnected - need to be re-engaged with the Brand
7. Indifferent - need to be educated about core Brand Values
8. Unconnected - need to have their attention drawn towards the Brand
Balanced Scorecard - a summary of all the data-points for an Individual / Stream / Segment
Propensity Score - in the statistical analysis of observational data, Propensity Score Matching (PSM) is a
statistical matching technique that attempts to estimate the effect of a Campaign / Offer / Promotion or other
intervention by calculating the impact of factors that predict the outcome of the Campaign / Offer / Promotion.
Propensity Model - the Bayesian probability of the outcome of an event in an Individual / Stream / Segment
Predictive Analytics - an area of data mining that deals with extracting information from data and using it to
predict trends and behaviour patterns. Often the unknown event of interest is in the future; however, Predictive
Analytics can be applied to any type of event with an unknown outcome - in the past, present or future.
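As a small, hedged illustration of the Propensity Score Matching idea defined above, the sketch below performs greedy 1:1 nearest-neighbour matching on pre-computed propensity scores and then takes the mean outcome difference across matched pairs. All scores and outcomes here are invented for the example:

```python
import numpy as np

def match_treated_to_controls(ps_treated, ps_control):
    """Greedy 1:1 nearest-neighbour matching (without replacement) on
    propensity scores; returns, for each treated unit, the index of its
    matched control."""
    available = list(range(len(ps_control)))
    matches = []
    for p in ps_treated:
        j = min(available, key=lambda i: abs(ps_control[i] - p))
        matches.append(j)
        available.remove(j)  # each control may be matched only once
    return matches

# hypothetical propensity scores (probability of receiving the offer)
ps_treated = [0.80, 0.30]
ps_control = [0.25, 0.50, 0.79, 0.90]
matches = match_treated_to_controls(ps_treated, ps_control)

# estimated effect of the Campaign / Offer / Promotion:
# mean outcome difference across matched pairs
y_treated = np.array([10.0, 4.0])
y_control = np.array([3.0, 6.0, 9.0, 11.0])
effect = float(np.mean(y_treated - y_control[matches]))
```

Production PSM would first estimate the propensity scores themselves (typically by logistic regression on the predictive factors) and would check covariate balance after matching; both steps are omitted here for brevity.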
Social Interaction Pyramid Rules
1. Promiscuous Open Networker - virtual Social Network across all categories; will connect with anybody
2. Networker - Social Network clustered around shared, common interests (Sport, Music, Fashion etc.)
3. Friends and Family - Social Network clustered around physical social contacts: Friends and Family
4. Workplace - Social Network clustered around Work and Colleagues (e.g. City Brokers, Traders)
5. Eternal Student - Social Network clustered around School / College / University Alumni
6. Home Boy - Social Network clustered around Home Location Postcodes (Gang Culture)
7. Lone Wolf - sparse / thin social network; may share negative information (Trolling)
8. Inactive - not engaged; low evidence / low affinity / low interest in Social Media
Number of Segments
With anonymous data (e.g. polls) the number of initial Segments is 4 (Matt Hart). With named individuals
we can discover much richer internal and external data sources (Social Media / User Content / Experian) - and
therefore segment the population with greater granularity.
When individuals qualify for multiple segments, we can either add these individuals to the Segment with
which they have the greatest affinity - or move them into an Outlying / Outcast / Miscellaneous Segment for
further processing or manual profiling / intervention.
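The assignment rule above - greatest affinity wins, otherwise an Outlying / Miscellaneous segment - can be sketched as follows. The `qualify` and `margin` thresholds are illustrative assumptions, not values from the source:

```python
def assign_segment(affinities, qualify=0.5, margin=0.1):
    """affinities: segment name -> affinity score in [0, 1].
    Assign the individual to the qualifying segment with the greatest
    affinity; individuals with no qualifying segment, or with two
    segments scored too closely to call, go to an 'Outlier' segment
    for further processing or manual profiling."""
    qualified = {s: a for s, a in affinities.items() if a >= qualify}
    if not qualified:
        return "Outlier"
    ranked = sorted(qualified.items(), key=lambda kv: kv[1], reverse=True)
    if len(ranked) > 1 and ranked[0][1] - ranked[1][1] < margin:
        return "Outlier"  # ambiguous multi-segment individual
    return ranked[0][0]

result_a = assign_segment({"Fanatics": 0.9, "Supporters": 0.4})  # clear winner
result_b = assign_segment({"Followers": 0.6, "Casuals": 0.55})   # too close to call
```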
Social Interaction
How consumers use social media (e.g., Facebook, Twitter) to address and/or engage with companies around social and environmental issues.
Observing, Understanding and
Predicting Human Actions
Economic Analysis Human Actions
The economist Ludwig von Mises explains that complex market phenomena are
simply "the outcomes of endless conscious, purposeful individual actions, by
countless individuals exercising personal choices and preferences - each of
whom is trying as best they can to optimise their circumstances in order to
achieve various needs and desires. Individuals, through economic activity,
strive to attain their preferred outcomes - whilst at the same time attempting
to avoid any unintended consequences leading to unforeseen outcomes."
Understanding Human Actions
Summary
In his foreword to Human Action: A Treatise on Economics, the great Austrian School
Economist, Ludwig von Mises, explains that complex market phenomena are simply "the
outcomes of endless conscious, purposeful individual actions, by countless individuals
exercising personal choices and preferences - each of whom is trying as best they can to
optimise their circumstances in order to achieve various needs and desires. Individuals,
through economic activity strive to attain their preferred outcomes - whilst at the same
time attempting to avoid any unintended consequences leading to unforeseen outcomes."
Thus von Mises lucidly presents the basis of economics as the science of observing,
analysing, understanding and predicting intimate human behaviour (human actions or
micro-economics) which when aggregated creates the flow of goods, services, people
and capital (market phenomena - or the macro-economy). Individual choices in response
to subjective personal value judgments ultimately determine all market phenomena -
patterns of supply and demand, production and consumption, costs and prices, and even
profits and losses. Although commodity prices may appear to be set by economic planners
in central banks under strict government control, it is, in fact, the actions of individual
consumers, living in communities and participating in their local economy, that actually
determine the Real Economic value of commodities. As a result of the individual choices
and collective actions exercised by producers and consumers - through competitive bidding
in markets for capital and labour, goods and materials, products and services throughout
all global markets - the global economy is ultimately both driven by, and the product of,
the sum of all individual human actions.
Joseph Schumpeter
Joseph Schumpeter studied under the great Austrian economist Böhm-Bawerk, but he was
far too independent in his thinking to be a part of any formal political movement or economic
school. The publication of his book The Theory of Economic Development was effectively
Schumpeter's declaration of independence from the formal Austrian School Real Economic
Theory of capital transfer and disruptive economic change.
In this book, Schumpeter introduces Business Cycle Theory as the driving force behind
Economic Development - a theory of Capital Transfer which shocked many of his more
orthodox and conventional colleagues. Economic development, Schumpeter argues, involves
transferring capital from old businesses (cash cows), with their established methods of goods
production, to emerging businesses (rising stars) using new, innovative methods of production.
Schumpeter's special insight comes in trying to explain how the transfer of capital from old
industries (cash cows) into new and emerging industries (rising stars) takes place. Schumpeter
argued that capital transfer takes place through credit expansion. Through the fractional reserve
system, banks are able to create credit (print money.....), quite literally out of thin air. This money
is lent to businesses pioneering new methods of production, who then bid up the price of
production goods and consumer products in their effort to pay for the production goods they
require. Thus a form of inflationary spoliation takes place at the expense of established
businesses and consumers. Although Schumpeter does not draw attention specifically to the
spoliation inference from his theory, it is nonetheless still there in the text for all to see.....
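The fractional-reserve credit expansion that Schumpeter's account relies on can be illustrated numerically. Each round of lending re-deposits the unreserved fraction of the previous deposit, so total deposits converge towards the initial deposit divided by the reserve ratio (the money multiplier). A minimal sketch, with an assumed 10% reserve requirement:

```python
def credit_expansion(initial_deposit, reserve_ratio, rounds=1000):
    """Iterative lending under fractional reserves: each bank keeps
    reserve_ratio of a deposit and lends out the rest, which is
    re-deposited and lent on again."""
    total_deposits, deposit = 0.0, initial_deposit
    for _ in range(rounds):
        total_deposits += deposit
        deposit *= (1.0 - reserve_ratio)  # the fraction lent on
    return total_deposits

# with a 10% reserve requirement, 100 units of base money supports
# total deposits approaching 100 / 0.10 = 1000 (the money multiplier)
total = credit_expansion(100.0, 0.10)
```

The geometric-series limit, initial_deposit / reserve_ratio, is the textbook upper bound; in practice leakages (cash holdings, excess reserves) keep actual credit creation below it.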
Econometrics
Value Creation vs. Value Consumption
We live in a natural world which, at the birth of civilisation, was once brimming with
innumerable and diverse natural resources. It is important to realise that Wealth was never
bestowed on us for free simply as a result of that abundant feedstock of natural resources.
Throughout History, Wealth has always been extracted or created through Human Actions - the
result of countless people executing primary Value Creation Processes throughout the last 10,000
years, such as Hunting and Gathering, Fishing and Forestry, Agriculture and Livestock, Mining
and Quarrying, Refining and Manufacturing. Secondary Added Value Processes - such as
Transport and Trading, Shipping and Mercantilism - serve only to Add Value to primary Wealth
originally created by the labour of others executing primary Value Chain Processes.
The Economic Wealth that we enjoy today as an advanced globalised society is not generated
magically through intellectual discovery and technology innovation, nor through market
phenomena created by the efforts of brokers and traders - or even by monetarist intervention
from economic planners or central bankers. Economic Wealth is the result of the effort of man:
Human Actions and primary Value Chain Processes generating Utility or Exchange Value.
Vast amounts of Wealth can also be created (and destroyed.....) via Market Phenomena - the
Boom and Bust Business Cycles of Economic Growth and Recession, which act to influence
the Demand / Supply Models and Price Curves of Commodities, Bonds, Stocks and Shares in
Global Markets. Market Phenomena are simply the sum of all Human Actions - the aggregated
activity of Traders and Brokers, Buyers and Sellers participating in that particular marketplace.
Value Creation in Business
As an introduction to this special topic of the Value Chain, we have defined value
creation in terms of Utility Value, which is contrasted with Exchange Value: -
1. Utility Value - skills, learning, know-how, intellectual property and acquired knowledge
2. Exchange Value - land, property, capital, goods, traded instruments, commodities and
accumulated wealth.
Some of the key issues related to the study of Value are discussed - including the
topics of value creation, capture and consumption. All Utility and Exchange Value is
derived from fundamental Human Actions. Although this definition of value creation is
common across multiple levels of activity and analysis, the process of value creation
will differ based on its origination or source - whether that economic value is created
by an individual, a community, an enterprise - or due to Market Phenomena.
We explore the concepts of Human Actions, competition for scarce resources and
market isolating mechanisms which drive Business Cycles and Market Phenomena in
the Global Economy - using Value Chain analysis in order to explain how value may
be created, exchanged and captured - or consumed, dissipated and lost - as a result
of different activities using different processes at various levels within the Value Chain.
Value Creation in Business
In order to develop a theory of value creation by enterprises, it is useful to first
characterise the value creation process. In the next two sections of this document
we develop a framework that builds upon Schumpeter's arguments to show: -
In other words, resource combination and exchange lie at the heart of the value
creation process, and in sections II and III we both describe how this process
functions and identify the conditions that facilitate and encourage, or slow
down and impede, each of these five elements of the Value Creation process.
Value Creation in Business
This framework establishes the theoretical infrastructure for the analysis of the roles firms
play in this value creation process and of how both firms and markets collectively influence
the process of economic development which is derived from Human Actions: -
1. Value Creation - primary Wealth Creation Processes
2. Value Capture - the Acquisition of Wealth by means other than Value Creation
3. Value Stockpiling - the Accumulation of Wealth
4. Value-added Services - Mercantilism, shipping, transport, sales, trading, bartering, exchange
5. Value Consumption - the depletion of Resources or the exhaustion of Wealth
As our analysis of the requirements for effective resource combination and exchange reveals,
global market phenomena alone are able to create only a very small fraction of the total
value that can be created out of the stock of resources available in economies. The very
different institutional nature and context of enterprises, operating in a state of creative
tension within global markets, substantially enhance the fraction of the total potential value
that can be obtained out of nature's resources. We describe this process of value creation by
firms and, in section V, we integrate the firm's role with that of markets to explain why both
firms and markets are needed to ensure that economies develop and progress in a way that
achieves what Douglass North (1990) has described as "adaptive efficiency".
Value Creation vs. Value Consumption
There are five major roles for people in society: those who create wealth - Primary Value
Creators (Agriculture and Manufacturing); those who Capture Value from others (through
Taxation, War, Plunder or Theft); those who stockpile Wealth (Savers); and those who
merely consume the wealth generated by others - Value Consumers. Somewhere in the
middle are the Added Value Providers - those who create secondary value by applying
value-added processes to commodities and goods created by primary Value Creators.
About half of society - Children, Students, the Invalid and Sick, the Unemployed and
Government Workers - consume much of the wealth generated by Primary and Secondary
Wealth Creators, offsetting only a little of their depletion of Resources or consumption of Wealth.
Wave-form Analytics in Econometrics
WAVE THEORY NATURAL CYCLES
Milankovitch Astronomic Cycles
Milankovitch Cycles are a Composite Harmonic Wave Series built up from individual wave-forms with
periodicity of 20-100 thousand years - exhibiting multiple wave harmonics, resonance and interference
patterns. Over very long periods of astronomic time Milankovitch Cycles and Sub-cycles have been
beating out precise periodic waves, acting in concert together, like a vast celestial metronome.
From the numerous geological examples found in Nature including ice-cores, marine sediments and
calcite deposits, we know that Composite Wave Models such as Milankovitch Cycles behave as a
Composite Wave Series with automatic, self-regulating control mechanisms - and demonstrate
Harmonic Resonance and Interference Patterns with extraordinary stability in periodicity
through many system cycles over durations measured in tens of millions of years.
Climatic Change and the fundamental astronomical and climatic cyclic variation frequencies are
coherent, strongly aligned and phase-locked with the predictable orbital variation of the 20-100 k.y.
Milankovitch Climatic Cycles, which have been modelled and measured over many iterations, over a
prolonged period of time, and across many temporal tiers - each tier hosting different types of
geological processes, which in turn influence different layers of Human Activity.
Milankovitch Cycles are precise astronomical cycles with periodicities of 22, 41, 100 and 400 k.y.: -
Precession (Polar Wandering) - 22,000-year cycle
Eccentricity (Orbital Ellipse) - 100,000- and 400,000-year cycles
Obliquity (Axial Tilt) - 41,000-year cycle
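A Composite Harmonic Wave Series of the kind described above can be sketched by superposing sinusoids at the Milankovitch periodicities; the equal unit amplitudes are an assumption made purely for illustration, since the real cycles differ in strength:

```python
import numpy as np

# approximate Milankovitch periodicities, in thousands of years (k.y.)
periods = [22.0, 41.0, 100.0, 400.0]  # precession, obliquity, eccentricity (x2)

t = np.linspace(0.0, 800.0, 8001)     # an 800 k.y. timeline
# composite harmonic wave series: superposition of the individual cycles;
# beating between nearby periodicities produces interference patterns
composite = sum(np.sin(2 * np.pi * t / p) for p in periods)
```

Plotting `composite` against `t` shows the slow amplitude modulation (beats) that arises when near-commensurate cycles reinforce and cancel one another.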
WAVE THEORY NATURAL CYCLES
Sub-Milankovitch Climatic Cycles
Sub-Milankovitch Climatic Cycles are less well understood - varying from Sun Cycles of 11 years
to Climatic Variation Trends at intervals of up to 1,470 years - and may also impact on Human Activity:
short-term Economic Patterns, Cycles and Innovation Trends through to long-term Technology Waves and
the rise and fall of Civilisations. A possible explanation might be found in the Resonance Harmonics of
Milankovitch Cycles (20-100 k.y. sub-cycle periodicity), resulting in Interference Phenomena as
periodic waves are reinforced and cancelled. Dansgaard-Oeschger (D/O) events, with precise
1,470-year intervals, occurred repeatedly throughout much of the late Quaternary Period.
Dansgaard-Oeschger (D/O) events were first reported in Greenland ice cores by the scientists Willi
Dansgaard and Hans Oeschger. Each of the 25 observed D/O events in the Quaternary Glaciation
Time Series consists of an abrupt warming to near-interglacial conditions, occurring in a matter of
decades, followed by a long period of gradual cooling over thousands of years.
Climate change is not uniform: some areas of the globe (the Arctic and Antarctica) have seen a
dramatic rise in average annual temperature, whilst other areas have seen lower temperature
gains. The original published temperature record for Climate Change is in red, while the updated
version is in blue. The black curve is the proposed harmonic component plus the proposed
corrected anthropogenic warming trend. The figure shows in yellow the harmonic component
alone, made up of the four cycles, which may be interpreted as a lower boundary limit for the
natural variability. The green area represents the range of the IPCC 2007 GCM projections.
The astronomical / harmonic model forecast since 2000 looks in good agreement with the data
gathered up to now, whilst the IPCC model projection is not in agreement with the steady
temperature observed since 2000. This may be due to other effects, such as cooling due to
increased water evaporation (humidity has increased by about 4% since measurements began in
the 18th century), or cloud seeded by jet-aircraft condensation trails, which reduce solar forcing
by reflecting energy back into space. Both short-term solar-lunar cycle climate forecasting and
long-term Milankovitch solar forcing cycles point towards a natural cyclic phase of gradual
cooling - which partially offsets those Climate Change factors (CO2 etc.) due to Human Actions.
Scafetta on his latest paper: Harmonic climate model versus the IPCC general circulation climate models
Wave-form Analytics in Econometrics
Wave-form Analytics - characterised as periodic sequences of regular, recurring high
and low activity resulting in cyclic phases of increased and reduced periodic trends -
supports an integrated study of complex, compound wave forms which can be
used to identify hidden Cycles, Patterns and Trends in Economic Big Data.
The challenge found everywhere in business cycle theory is how to interpret
interacting large-scale, long-period, compound wave-form (polyphonic) temporal data
sets which are variable (dynamic) in nature - the Schumpeter Economic Wave Series.
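One simple way to identify hidden cycles in a compound wave-form, in the spirit of the Wave-form Analytics described above, is spectral decomposition: peaks in the periodogram of a mean-removed series mark its dominant periodicities. A sketch on a synthetic series with assumed Juglar-like (10-year) and Kondratiev-like (40-year) components:

```python
import numpy as np

# synthetic "economic" time series: two hidden cycles plus noise
n_years = 200
t = np.arange(n_years)
rng = np.random.default_rng(0)
series = (np.sin(2 * np.pi * t / 10.0)          # assumed Juglar-like 10-year cycle
          + 0.5 * np.sin(2 * np.pi * t / 40.0)  # assumed Kondratiev-like 40-year cycle
          + 0.1 * rng.standard_normal(n_years)) # background noise

# periodogram of the mean-removed series: peaks reveal the hidden cycles
spectrum = np.abs(np.fft.rfft(series - series.mean()))
freqs = np.fft.rfftfreq(n_years, d=1.0)         # in cycles per year
peak_period = 1.0 / freqs[spectrum.argmax()]    # dominant periodicity, in years
```

Here the strongest peak recovers the 10-year component; real economic series are non-stationary, which is why the text favours wave-form (time-frequency) analysis over a single global spectrum.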
Wave-form Analytics in Econometrics
Biological, Sociological, Economic and Political systems all tend to demonstrate
Complex Adaptive System (CAS) behaviour - which appears to be more similar
in nature to biological behaviour in a living organism than to Disorderly, Chaotic,
Stochastic Systems (Random Systems). For example, the remarkable
adaptability, stability and resilience of market economies may be demonstrated by
the impact of Black Swan Events causing stock market crashes - such as the oil price
shocks (1970-72) and credit supply shocks (1927-1929 and 2008 onwards).
Unexpected and surprising Cycle Pattern changes have historically occurred
during regional and global conflicts being fuelled by technology innovation-driven
arms races - and also during US Republican administrations (Reagan and Bush -
why?). Just as advances in electron microscopy have revolutionised biology -
non-stationary time series wave-form analysis has opened up a new space for
Biological, Sociological, Economic and Political system studies and diagnostics.
[Figure: weak-signal detection cycle - Scan and Identify, Discover, Communicate - distinguishing an Individual Wave from Background Noise]
The Strauss-Howe model holds that the Kondratiev Infrastructure Investment Cycle
(K-cycle) has shifted from one-half to a full saeculum in length - as a result of global
industrialisation - and is now about 72 years long. The cause of this lengthening is
the emergence of government economic management, which is itself a direct effect
of industrialisation as mediated through the generational saeculum cycle. The
rise of the industrial economy did more than simply introduce the Kitchin cycle. It
also increased the intensity, in the Strauss-Howe model, of the Kitchin, Kuznets and
Kondratiev cycles - all of which had already been part of the pre-industrial economy.
Whilst the Kuznets-related Panic of 1819 was the first stock market panic to make it
into the history books, it was a pretty mild bear market. The Panic of 1837 was worse
and the one in 1857 worse still. The Panic of 1873 ushered in the second worst bear
market of all time. The depression following the Panic of 1893 was the worst up to
that time. This depression was the first to take place with a majority of the population
involved in non-agricultural occupations. Although hard times on the farm were a
frequent occurrence, depressions did not usually mean hunger. Yet for the large
numbers of urban workers thrown onto "the industrial scrap heap", the depression of
the 1890s produced a level of suffering unprecedented for a business fluctuation.
Wave-form Analytics in Econometrics
Figure 1. Economic Wave Series - Joseph Schumpeter Business Cycles

Cycle                                   | Pre-industrial (before 1860)   | Modern (post 1929)
Kitchin Inventory Cycle (KI-cycle)      | Stock-turn Cycle (3-5 years)   | One KI-cycle = 5 years
Juglar Fixed Investment Cycle (J-cycle) | Business Cycle (7-11 years)    | One J-cycle = 10 years
Kuznets Infrastructure Cycle (KU-cycle) | Property Cycle (15-25 years)   | One KU-cycle = 20 years
Kondratiev Cycle (KO-cycle)             | Technology Cycle (45-60 years) | One KO-cycle = 40 years

Cycle                                   | Cycle type (period)            | Wave
Juglar Fixed Investment Cycle (J-cycle) | Business Cycle (8-11 years)    | Economic Wave - 9 years
Kuznets Infrastructure Cycle (KU-cycle) | Asset Cycle (20-25 years)      | Investment Wave - 18 years
Strauss-Howe Cycle (SH-cycle)           | Population Cycle (20-30 years) | Generation Wave - 20-25 years
Kondratiev Cycle (KO-cycle)             | Industry Cycle (45-60 years)   | Innovation Wave - 30-45 years
Grand-cycle / Super-cycle (GS-cycle)    | Saeculum (70 years +)          | Century Wave - 60-90 years
Wave-form Analytics in Econometrics
[Diagram: nested reporting and inventory cycles - 1. Quarterly Profit Forecasts (publish quarterly); 2. Annual Financial Report (publish annually); 3. Inventory Refreshment; ... 8. Kitchin Inventory / Stock-turn Cycle (KI-cycle, 3-5 years); outer ring: Grand-cycle / Super-cycle (GS-cycle), Saeculum / Century Wave (75-120 years)]
Major periodic changes in business activity are due to recurring cyclic phases in economic
expansion and contraction - classical bear and bull markets, or boom and bust cycles.
The time series decomposition necessary to explain this complex phenomenon presents us
with many interpretive difficulties due to background noise and interference as multiple
business cycles, patterns and trends interact and impact upon each other. We are now able
to compare cyclical movements in output levels, deviations from trend, and smoothed growth
rates of the principal measures of aggregate economic activity - the quarterly Real (Austrian)
GDP and the monthly U.S. Coincident Index - using the phase average trend (PAT).
This paper provides a study of business cycles - defined as periodic sequences of
expansion and contraction in the general level of economic activity. The proposed Wave-
form Analytics approach helps us to identify Cycles, Patterns and Trends in Big Data.
Business cycles may be characterised as periodic sequences of high and low business
activity resulting in cyclic phases of increased and reduced output trends. The approach
supports an integrated study of disaggregated economic cycles that does not require
repeated, iterative processes of trend estimation and elimination for every possible
business cycle duration.
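The classical alternative that Wave-form Analytics avoids can be sketched as repeated trend estimation and elimination. The minimal Python illustration below uses a hypothetical synthetic series (not the paper's data): detrend with a centred moving average whose window matches the candidate cycle duration, one pass per duration under test.

```python
import numpy as np

def moving_average_detrend(series, window):
    """Estimate the trend with a centred moving average and subtract it.

    Classical decomposition repeats this step for every candidate cycle
    duration under test - the window must match the duration being tested.
    """
    kernel = np.ones(window) / window
    trend = np.convolve(series, kernel, mode="same")
    return series - trend

# Hypothetical quarterly output series: a linear growth trend plus a
# 9-year (36-quarter) Juglar-style cycle.
t = np.arange(200)
series = 0.05 * t + np.sin(2 * np.pi * t / 36)

# One pass for the 36-quarter candidate duration; a full classical
# study repeats this for each duration (Kitchin, Juglar, Kuznets, ...).
cycle = moving_average_detrend(series, window=36)
```

The detrended interior oscillates around zero, recovering the cyclical component; the repetition over every candidate duration is exactly the overhead the integrated approach avoids.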
Business Cycles, Patterns and
Trends
The purpose of this section is to examine the nature and content of Clement Juglar's
contribution to Business Cycle Theory and then to compare and contrast it with Joseph
Schumpeter's analysis of cyclical economic fluctuations. There are many similarities evident -
but there are also some important differences between the two authorities' theories.
Schumpeter's classical Business Cycle is driven by a series of multiple co-dependent
technology innovations of low to medium impact - whereas according to Juglar the trigger for
a runaway boom is market speculation fuelled by an over-supply of credit. A deeper examination
of Juglar's business cycles reveals the richness of his original and very interesting
approach. Indeed Juglar, without having proposed a complete theory of business cycles,
nevertheless provides us with an original theory supporting a more detailed comparison and
benchmarking between these two co-existing and compatible business cycle theories.
In a specific economic context characterised by the rapid development of both industry and
trade, Juglar's theory interconnects the development of new markets with credit availability,
speculative investments and the banks' behaviour in response to the various phases of the
Business Cycle: Crisis, Liquidation, Recovery, Growth and Prosperity. The way that the
money supply, credit availability and industrial development interact to create business cycles
is quite different in Juglar's viewpoint from that expressed by Schumpeter in his theory of
economic development, but does not necessarily express any fundamental contradiction.
Compared and contrasted, the two approaches refer to market phenomena which are
separate and different, but still entirely compatible and co-existent.
Waves, Cycles, Patterns and Trends
Business Cycles were once thought to be an economic phenomenon due to periodic
fluctuations in economic activity. These mid-term economic cycle fluctuations are
usually measured using Real (Austrian) Gross Domestic Product (rGDP). Business
Cycles take place against a long-term background trend in Economic Output
growth, stagnation or recession which affects Money Supply as well as the relative
availability and consumption (Demand v. Supply and Value v. Price) of other
Economic Commodities. Any excess of Money Supply may lead to an economic
expansion or boom, conversely shortage of Money Supply may lead to economic
contraction or bust. Business Cycles are recurring, fluctuating levels of economic
activity experienced in an economy over a significant timeline (decades or centuries).
The five stages of Business Cycles are growth (expansion), peak, recession
(contraction), trough and recovery. Business Cycles were once widely thought to be
extremely regular, with predictable durations, but today's Global Market Business
Cycles are now thought to be unstable, behaving in irregular, random and
even chaotic patterns varying in frequency, range, magnitude and duration. Many
leading economists now also suspect that Business Cycles may be influenced by fiscal
policy as much as by market phenomena - even that Global Economic Wild Card and
Black Swan events are actually triggered by Economic Planners in Government
Treasury Departments and Central Banks manipulating the Money Supply under the
interventionist Fiscal Policies adopted by some Western Nations.
Economic Waves, Cycles, Patterns and Trends
Real (Austrian) business cycle theory assigns a central role to shock waves as the primary
source of economic fluctuations or disturbances. As King and Rebelo (1999) discuss in
"Resuscitating Real Business Cycles", when persistent technology shocks are fed through a
standard real business cycle model, the simulated economy displays impact patterns
which are similar to those exhibited by actual business cycles. While the last decade has
seen the addition of other types of shocks to these models - such as monetary policy and
government spending - none has been shown to be a central impulse to business cycles.
A trio of recent papers has called into question the theory that technology shocks have
anything to do with the fundamental shape of business cycles. Although they use very
different methods, Galí (1999), Shea (1998) and Basu, Kimball and Fernald (1999) all
present the same result: positive technology shocks appear to lead to declines in labour
input. Galí identifies technology shocks using long-run restrictions in a structural VAR;
Shea uses data on patents and R&D; and Basu, Kimball and Fernald identify technology
shocks by estimating Hall-style regressions with proxies for utilisation.
In all cases, they find significant negative correlations of hours with the technology shock
waves. Galí's paper also studies the effects of non-technology shocks - such as
Terrorism, Insecurity and Military Conflicts, as well as Money Supply and Commodity-
price Shocks - which he suggests might be interpreted as demand / supply shocks.
These shocks produce the typical business cycle co-movement between output and hours.
In response to a positive shock, both output and hours rise in the typical hump-
shaped pattern. Productivity also rises - but with only a temporary economic effect,
modifying Business Cycles rather than radically altering them.
Wave Theory Of Human Activity
It seems that many Human Activity Cycles - Business, Social, Political, Economic, Historic and Pre-historic (Archaeology) Cycles -
may be compatible with, and map onto, one or more of the Natural Cycles: -
Earth and Lunar Natural Cycles - Diurnal to Annual (1 day to 1 year)
Tidal Deposition Lamellae in Deltas, Estuaries and Salt Marshes - Diurnal
Seasonal Growth Rings in Stromatolites, Stalagmites and Trees - Annual / Biannual
Lamellae in Ice Cores, Calcite Deposits, Lake and Marine Sediments - Annual / Biannual
Human Activity Waves - Seasonal, Trading and Fiscal Cycles - Diurnal to Annual (1 day to 1 year)
Natural Resonance / Harmonic / Interference Waves - Southern Oscillation / Solar Activity @ 3, 5, 7, 11 years
Schumpeter Composite Wave Series - Resonance / Harmonic Wave Cycles @ 3, 5, 7, 11 & 15, 20, 25 years
Kitchin inventory cycle of 3-5 years (after Joseph Kitchin);
Juglar fixed investment cycle of 7-11 years (often referred to as 'the business cycle');
Kuznets infrastructural investment cycle of 15-25 years (after Simon Kuznets);
Industrial / Technology Arms Race Cycles - 25 years
American Civil War 1863
Anglo-Chinese Opium War - 1888
The Great War - 1914
The Second World War - 1939
Geo-political Rivalry and Conflict 20 years (World Cup years - odd decades)
Korean War - 1950
Vietnam War - 1970
1st Gulf War - 1990
Arab Spring Uprisings - 2010
Culminating in a future Arabian Gulf Conflict in 2030 ?
Geo-political Rivalry and Conflict 20 years (Olympics Years - even decades)
The Second World War - 1940
Malayan Emergency - 1960
Russian War in Afghanistan - 1980
Balkan Conflict - 2000
Culminating in a future Trade War between USA and China in 2020 ?
Wave Theory Of Human Activity
It also seems that many Human Activity Cycles - Business, Social, Political, Economic, Historic and Pre-historic
(Archaeology) Cycles - may be compatible with, and map onto, one or more of the Minor Bond Climatic Cycles with
periodicity at 117, 64 and 57 years
Kondratiev wave or long technological cycle of 45-60 years (after Nikolai Kondratiev)
Industry Cycles
Generation Waves
Technology Shock Waves
Major Bond Climatic Cycles - 800 to 1000 and 1470 years - duration of Civilisations
Western Roman Empire (300 BC - 500 AD)
Eastern Roman Empire (500 - 1300 AD)
Vikings and Normans - Nordic Ascendency (700-1500 AD)
Anglo-French Rivalry - Norman Conquest to Entente Cordiale (1066-1911)
Mayan Civilisation
Khmer Civilisation (Angkor)
Greenland Vikings (Medieval mini Ice Age)
Pueblo Indians (Anasazi) - drought in South-Western USA
Easter Islanders
Milankovitch Climatic Cycles - Insolation for Quaternary Ice Ages (Pluvial / Inter-pluvial)
Clovis Culture, Solutrean Culture, Neanderthal Culture
Major Extinction-level Events (Kill Moments)
Pre-Cambrian and Cambrian Extinction Events - 1000-542 million years ago
Permian-Triassic Boundary (PTB) Event - 251.4 million years ago
Cretaceous-Tertiary Boundary Event - 65 million years ago
Global Massive Change - 20 ky ago to present day (ongoing)
Wave-form Analytics in Cyclic
Business Studies
Trend-cycle decomposition is a critical technique for testing multiple competing dynamic models
in the study of complex cyclic business phenomena - including both deterministic and stochastic
(probabilistic) paradigms. A fundamental challenge found everywhere in business cycle theory is
how to interpret compound (polyphonic) time series which are both complex and dynamic (non-
stationary) in nature. Wave-form Analytics is a new analytical tool based on Time-frequency
analysis - a technique which exploits the wave frequency and time symmetry principle - and
is introduced here for the first time in the field of study of business cycles, patterns and trends.
Economic systems demonstrate Complex Adaptive System (CAS) behaviour - more similar to
an organism than to chaotic Random Walks. The remarkable stability and resilience of market
economies can be seen from the impact of Black Swan Events causing stock market crashes -
such as oil price shocks and credit crises. Surprising pattern changes occurred during wars,
arms races, and during the Reagan administration. Like microscopy for biology, non-stationary
time series analysis opens up a new space for business cycle studies and policy diagnostics.
The role of time scale and of the preferred reference frame in economic observation is discussed.
Fundamental constraints on Friedman's rational arbitrageurs are re-examined from the view of
information ambiguity and dynamic instability.
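As a rough illustration of the time-frequency idea, the sketch below (an illustrative assumption, not the authors' implementation) slides a windowed FFT over a synthetic non-stationary series and reads off the dominant cycle length in each window:

```python
import numpy as np

def dominant_period(segment):
    """Dominant cycle length (in samples) of one windowed segment,
    found as the peak of its Hann-windowed FFT magnitude spectrum."""
    segment = segment - segment.mean()
    spectrum = np.abs(np.fft.rfft(segment * np.hanning(len(segment))))
    freqs = np.fft.rfftfreq(len(segment))
    peak = np.argmax(spectrum[1:]) + 1   # skip the zero-frequency bin
    return 1.0 / freqs[peak]

# Hypothetical non-stationary series: the cycle length doubles
# half-way through, from 8 samples to 16 samples.
t = np.arange(512)
series = np.where(t < 256, np.sin(2 * np.pi * t / 8),
                  np.sin(2 * np.pi * t / 16))

# Sliding the window over the series recovers the drift in period -
# the kind of structural change a single whole-series FFT would blur.
early = dominant_period(series[:128])
late = dominant_period(series[-128:])
```

A stationary (whole-series) spectrum would show both peaks at once; localising the analysis in time is what lets the method track how cycle lengths change.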
Quantitative and Qualitative Analysis
Techniques
The Juglar Business Cycle is now widely regarded by many leading Economists
as the fundamental, real or true interpretation of the classic boom-and-bust
Stock Market Cycle. Subsequent analysis designated the years 1825, 1836,
1847, 1857, 1866, 1873, 1882, 1890, 1900, 1907, 1913, 1920, and 1929 as the
initial years of an Economic Recession or Market re-adjustment (fiscal down-
swing - i.e., the beginning of a Juglar Business Cycle crisis phase).
Juglar Business Cycle
Joseph Schumpeter
The source of Joseph Schumpeter's dynamic, change-oriented, and innovation-based
economics was the Historical School of economics. Although Schumpeter's writings
could be critical of the School, Schumpeter's work on the role of innovation and
entrepreneurship can be seen as a continuation of ideas originated by the Historical
School, especially the work of Gustav von Schmoller and Werner Sombart.
Schumpeter's scholarly learning is readily apparent in his posthumous publication,
the History of Economic Analysis - although many of his judgments now seem to
be somewhat idiosyncratic and some even appear to be downright cavalier.
Schumpeter thought that the greatest 18th century economist was Turgot, not Adam
Smith, as many economists believe today, and he considered Léon Walras to be the
"greatest of all economists", beside whom other economists' theories were "like
inadequate attempts to catch some particular aspects of the Walrasian truth".
Schumpeter criticized John Maynard Keynes and David Ricardo for the "Ricardian
vice." Ricardo and Keynes often reasoned in terms of abstract economic models,
where they could isolate, freeze or ignore all but a few major variables. According to
Schumpeter, they were then free to argue that one factor impacted on another in a
simple monotonic cause-and-effect fashion. This has led to the mistaken belief in
economics that anyone could easily deduce effective real-world economic policy
conclusions directly from a highly abstract and simplistic theoretical economic model.
Joseph Schumpeter
Schumpeter's relationships with the ideas of other economists were quite complex -
following neither Walras nor Keynes. There was actually some considerable professional
rivalry between Schumpeter and Kuznets. Schumpeter starts his most important
contribution to economic analysis - the theory of business cycles and economic
development (The Theory of Economic Development) - with a treatise on circular flow, in
which he postulates that a stationary economy is created whenever economic input is starved
of entrepreneurial activities - disruptive innovation and technology wave stimulation. This
economic stagnation is, according to Schumpeter, described by Walrasian equilibrium.
In developing the Economic Wave theory, Schumpeter postulated the idea that the
entrepreneur is the primary catalyst of industrial activity, which develops along several
discrete and interacting time periods in a cyclic fashion connecting the development of
innovation, technology and generation waves with economic investment and stock-market
cycles. This disruptive process acts to disturb the otherwise stationary economic status-
quo or equilibrium. Thus the true hero of his story is the entrepreneur. Schumpeter also
kept alive the Russian Nikolai Kondratiev's ideas of economic cycles with 50-
year periodicity - Kondratiev waves.
Joseph Schumpeter
Schumpeter suggested an integrated Economic Model in which the four main cycles -
Kondratiev (54 years), Kuznets (18 years), Juglar (9 years) and Kitchin (about 4
years) - can be aggregated together to form a composite economic waveform. The
waveform originally suggested did not include the Kuznets Cycle, simply because
Schumpeter did not recognise it as a valid cycle (see "Business Cycle" for further
information); there was actually some considerable professional rivalry between
Schumpeter and Kuznets. As for the segmentation of the Kondratiev Wave,
Schumpeter further postulated that a single Kondratiev wave may well be consistent
with the aggregation of three lower-order Kuznets waves.
Each Kuznets wave could, itself, be made up of two Juglar waves. Similarly, two or
three Kitchin waves could form a higher-order Juglar wave. If each of these were in
harmonic phase - more importantly, if the downward arc of each was simultaneous so
that the nadir of each was coincident - it could explain disastrous slumps and
consequential recessions and depressions. Schumpeter never proposed a rigid,
fixed-periodicity model. He saw that these cycles could vary in length over time -
impacted upon by various random, chaotic and radically disruptive Wild Card and
Black Swan events - catastrophes such as War, Famine and Disease, Commodity
Price Shocks, Money Supply Shocks and Sovereign Debt Default Shocks - events
which are all too common in the economy of today.
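Schumpeter's aggregation can be sketched numerically. The sinusoidal model below rests on illustrative assumptions not stated in the text (equal unit amplitudes, all cycles starting in phase, and a 4.5-year Kitchin period chosen so the four periods nest harmonically); the composite's deepest trough marks where the downward arcs reinforce one another:

```python
import numpy as np

# Assumed periods in years; 4.5 for Kitchin makes the set harmonic
# (54 = 3 x 18 = 6 x 9 = 12 x 4.5). Equal amplitudes are an assumption.
periods = {"Kondratiev": 54, "Kuznets": 18, "Juglar": 9, "Kitchin": 4.5}

years = np.linspace(0, 108, 1081)   # two Kondratiev cycles, 0.1-year steps
composite = sum(np.cos(2 * np.pi * years / p) for p in periods.values())

# The deepest trough of the composite is where the downward arcs of
# the component cycles most nearly coincide - the slump scenario.
worst_year = float(years[np.argmin(composite)])
```

With these toy parameters the peaks reach the full sum of amplitudes only when every component peaks at once, which is exactly the coincident-phase argument made above for slumps.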
Business Cycles, Patterns and
Trends
Figure 1. Economic Wave Series - Joseph Schumpeter Business Cycles
Cycle | Pre-industrial (before 1860) | Modern (post 1929)
Kitchin Inventory Cycle (KI-cycle) | Stock-turn Cycle (3-5 years) | One KI-cycle = 5 years
Juglar Fixed Investment Cycle (J-cycle) | Business Cycle (7-11 years) | One J-cycle = 10 years
Kuznets Infrastructure Cycle (KU-cycle) | Property Cycle (15-25 years) | One KU-cycle = 20 years
Kondratiev Cycle (KO-cycle) | Technology Cycle (45-60 years) | One KO-cycle = 40 years
Juglar Fixed Investment Cycle (J-cycle) | Business Cycle (7-11 years) | Economic Wave - 9 years
Kuznets Infrastructure Cycle (KU-cycle) | Asset Cycle (20-25 years) | Investment Wave - 18 years
Strauss-Howe Cycle (SH-cycle) | Population Cycle (20-30 years) | Generation Wave - 20-25 years
Kondratiev Cycle (KO-cycle) | Technology Cycle (45-60 years) | Innovation Wave - 30-45 years
Grand-cycle / Super-cycle (GS-cycle) | Saeculum (70 years +) | Century Wave - 60-90 years
Strauss-Howe Generation Waves
The Strauss-Howe Generation Wave theory, created by authors William Strauss and
Neil Howe, identifies a recurring generational cycle in American history. Strauss and
Howe lay the groundwork for the theory in their 1991 book Generations, which retells
the history of America as a series of generational biographies going back to 1584.[1] In
their 1997 book The Fourth Turning, the authors expand the theory to focus on a
fourfold cycle of generational types and recurring mood eras in American history.[2] Their
consultancy, Life Course Associates, has expanded on the concept in a variety of
publications since then.
The Strauss-Howe Generation Wave theory was developed to describe the history of
the United States, including the 13 colonies and their Anglo-Saxon antecedents, and
this is where the most detailed research has been done. However, the authors have
also examined generational trends elsewhere in the world and identified similar cycles
in several developed countries. The books are best-sellers and the theory has been
widely influential and acclaimed. Eric Hoover has called the authors pioneers in a
burgeoning industry of consultants, speakers and researchers focused on generations.
Strauss-Howe Generation Waves
Arthurian Generation (1433-1460) (H)
Business Cycles: before industrialisation, the intervals between Stock Market Boom-and-Bust
were apparently of random length, up to a full Juglar Business Cycle in the range of 8 to 11
years. With the arrival of industrialisation, the ordinary Business Cycle was joined by a new
Economic phenomenon - the Inventory Cycle, or Kitchin Cycle (KI-cycle), with a range of 3-5
years' duration - which was later replaced by a new, shorter and more uniform length
(average 40 months). The Kuznets Cycle (KU-cycle) and Kondratiev Cycle carried on much
as before. From the changes induced by industrialisation, the Robert Bronson SMECT
structure emerged, in which sixteen 40-month Kitchin cycles "fit" into a standard Kondratiev
cycle and the KO-cycle subdivided into 1/2-, 1/4- and 1/8-length sub-cycles.
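The SMECT fit described above is simple arithmetic to check: sixteen 40-month Kitchin cycles imply a Kondratiev cycle of roughly 53 years, comfortably inside the 45-60 year range quoted elsewhere in this paper.

```python
# Quick arithmetic check of the SMECT fit: sixteen 40-month Kitchin
# cycles per Kondratiev cycle (figures as quoted above).
kitchin_months = 40
kondratiev_years = 16 * kitchin_months / 12    # about 53.3 years

# The 1/2-, 1/4- and 1/8-length sub-cycles this implies, in years.
sub_cycles = [kondratiev_years / d for d in (2, 4, 8)]
```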
Innovation Waves
Business Cycles, Patterns and
Trend - Introduction
In his recent book on the Kondratiev cycle, Generations and Business Cycles - Part I,
Michael A. Alexander further developed the idea first postulated by Strauss and Howe - that the
Kondratiev Cycle (KO-cycle) is fundamentally generational in nature. Although it had been 28
years since the last real estate peak in 1980, property valuations had yet to reach previous peak
levels when the Sub-Prime Crisis began in 2006. Just as it had done in 1988 and 1998, the
property boom spawned by the Federal Reserve's rate cuts continued to drive an upward spiral of
increasing real estate valuations for a couple more years - until the Toxic Debt Crisis began
with a trickle of sub-prime mortgage defaults in 2006, and continued with the Financial Service
sector collapses triggering the Credit Crunch and Sovereign Debt Defaults which arrived in 2008.
From late Medieval times up until the early 19th century, the Kondratiev Cycle (KO-cycle) was
thought to be roughly equal in length to two human generation intervals - around 50 years in
duration. Two Kondratiev cycles in turn formed one saeculum, the generational cycle described
by American authors William Strauss and Neil Howe. The KO-cycle was closely aligned with
wars, and a possible mechanism for the cycle was alternating periods (of generational length) of
government debt growth and decline associated with war finance. After the world economy
became widely industrialised in the late 19th century, the relationship between the
compound cycles seems to have changed. Instead of two KO-cycles per saeculum, there now
appears to be only one - possibly driven by Geopolitical Rivalry and Arms Races. In the
Saeculum from 1914 to 2014 we experienced WWI, WWII, the Cold War and its spawning of
numerous Regional Conflicts - Korea, Vietnam, Malaya, the Arab-Israeli Wars, the break-up
of Yugoslavia, the Gulf Wars and the Afghan Conflicts - culminating in the Arab Spring.
Innovation Waves
Business Cycles, Patterns and
Trends
Figure 3. Robert Bronson's SMECT System
Cycle | Pre-industrial (before 1860) | Modern (post 1929)
Kitchin Cycle (KI-cycle) | Production Cycle (3-5 years) | Inventory Wave - 40 months (av.)
Juglar Cycle (J-cycle) | Business Cycle (8-11 years) | Economic Wave - 9 years
Kuznets Cycle (KU-cycle) | Property Cycle (20-25 years) | Infrastructure Wave - 18 years
Strauss-Howe Cycle (SH-cycle) | Population Cycle (20-30 years) | Generation Wave - 36 years
Kondratiev Cycle (KO-cycle) | Technology Cycle (45-60 years) | Innovation Wave - 72 years
Figure 4. Michael Alexander - Business cycle length and bear market spacing over time
Cycle | Pre-industrial (before 1860) | Modern (post 1929)
Juglar Cycle (J-cycle) | Business Cycle (8-11 years) | Economic Wave - 9 years
KO-trend / Infrastructure Wave | Property Cycle (20-25 years) | Infrastructure Wave - 18 years
KO-wave / Generation Wave | Population Cycle (20-30 years) | Generation Wave - 36 years
KO-cycle / Innovation Wave | Technology Cycle (45-60 years) | Innovation Wave - 72 years
Grand-cycle / Super-cycle (GS-cycle) | Saeculum (70 years +) | Century Wave - 108 years
Business Cycles, Patterns and
Trends
Economic Periodicity appears less metronomic and more irregular from 1860 to 1929 (and
from 2000 onwards). Strauss and Howe claim that these changes in Economic Periodicity
were created by a shift in economic cycle dynamics caused by industrialisation around the
time of the American Civil War - hinting towards Schumpeter's view that Innovation and
Black Swan events can impact on Economic Cycle periodicity. Michael Alexander claims
that this new pattern only emerged after 1929, when the Kondratiev Cycle (KO-cycle)
lengthened and at the same time the Saeculum shortened - to the point where they both
became roughly equal, merging with a Periodicity of about 72 years.
Michael Alexander further maintains that each Kondratiev wave can be subdivided into two
Kondratiev seasons, each associated with a secular market trend. Table 1 shows how
these cycles were related to each other before and after industrialisation. The Kondratiev
cycle itself consists of two Kondratiev waves, each of which is associated with sixteen
occurrences or iterations of the Stock Cycle. The Juglar cycle was first noted by Clement
Juglar in the 1860s and existed in pre-industrial economies. The other two cycles were
identified much later (Kitchin in 1923). The Kuznets real-estate cycle, proposed in 1930,
still persists and might be thought of as a periodic infrastructure investment cycle
typical of industrialised economies after the 1929 Depression. Shorter economic
cycles also exist, such as the Kuznets cycle of 15-20 years (related to building / real estate
valuation cycles), along with the Juglar cycle of 7-11 years (related to Stock Market
activity) and the Kitchin cycle of about 40 months (related to Stock or Inventory Cycles).
Robert Bronson's SMECT
Forecasting Model
Each thing is of like form from everlasting and comes
round again in its cycle - Marcus Aurelius
A number of years ago, Bob Bronson, principal of Bronson Capital Markets Research, developed
a useful model for predicting certain characteristics of both Business cycles (stock-market price
curves) and Economic cycles (Fiscal Policies). The template for this model graphically
illustrates that the model not only explains the interrelationship of these past cycles with a
high degree of accuracy - a minimum condition for any meaningful modelling tool - but it
also has been, and should continue to be, a reasonably accurate forecasting mechanism.
Robert Bronson's SMECT System is a Forecasting Model that integrates multiple Business
(Stock-Market Movement) and Economic Cycles. Since there is an obvious interrelationship
between short-term business cycles and short-term stock-market cycles, it is useful to be able to
discover and understand their common elements - in order to develop an economic theory that
explains the underlying connections between them and, in our case, to form meaningful,
differentiating forecasts - especially over longer-term horizons. By pulling back from the close-up
differences and viewing the cycles from a longer-term perspective, their common features
become more apparent. Business Cycles are also subject to unexpected impact from external
or unknown forces - Random Events which are analogous to Uncertainty Theory in the way
that they become manifest, but subject to different interactions and feedback mechanisms.
Wholesale Price Index 1790-1640
Robert Bronson SMECT System
An alternative thesis proposed by Strauss and Howe has also noted the
discontinuous behaviour of their Generation Waves at the same time - the
so-called War Anomaly. What is happening here? Strauss and Howe
attribute these changes to a skipped generation caused by losses in the
American Civil War (and later, the Great War). The unusually poor economic
outcomes after these conflicts are due to massive War Debts and the
absence of stimulation from a lost generation.
Geo-spatial Data Science
Geo-demographics - Big Data
When we examine the heavens above there appears to be order in the movement
and appearance of the celestial bodies - galaxies, stars, planets, asteroids, etc.
Since the dawn of our species, humans have speculated on how these bodies were
formed and on the meaning of their ordered movement. Most observations of natural
phenomena support the contention that nature is mostly orderly and predictable. The
origin of the force which brought about this order differs depending upon the
historic explanation of how the order came to be. For much of human history,
supernatural forces were mostly credited with the imposition of order upon nature.
In a tradition that begins with the classical Greek natural philosophers (circa 600 -
200 BC) and continues today through contemporary philosophy and science it has
long been held that the order of nature is the result of universal laws which govern
the forces of nature. So what, then, is the role of sudden and unexpected radical change,
and what causes the chaos and disruption created by random, stochastic processes at
the heart of a universe which otherwise exhibits such a high degree of order?
Randomness
There are many kinds of Stochastic or Random processes that impact on every area
of Nature and Human Activity. Randomness can be found in Science and Technology
and in the Humanities and the Arts. Random events are taking place almost everywhere
we look - for example, from Complex Systems and Chaos Theory to Cosmology and
the distribution and flow of energy and matter in the Universe, and from Brownian motion
and quantum theory to fractal branching and linear transformations. Further
examples include atmospheric turbulence in Weather Systems and Climatology, and system
dependence influencing complex orbital and solar cycles. Other examples include
sequences of Random Events, Weak Signals, Wild Cards and Black Swan Events
occurring in every aspect of Nature and Human Activity - from the Environment and
Ecology to Politics, Economics and Human Behaviour, and in the outcomes of current
and historic wars, campaigns, battles and skirmishes - and much, much more.
These Stochastic or Random processes are agents of change that may precipitate
global impact-level events which either threaten the very survival of the organisation -
or present novel and unexpected opportunities for expansion and growth. The ability to
incorporate Weak Signals and peripheral vision into the strategy and planning process may
therefore be critical in contributing towards the continued growth, success, wellbeing
and survival of both individuals and organisations at the micro-level, as well as cities,
states and federations at the macro-level - as witnessed in the rise and fall of empires.
Randomness
Stochastic Processes Random Events
It has long been recognised that one of the most important competitive factors for any
organisation to master is Randomness, Disorder and Chaos - their Nature, Behaviour and
Causes. Uncertainty is the major intangible factor contributing towards the risk of failure in
every process, at every level, in every type of business. The way that we think about the
future must mirror how the future actually unfolds. As we have learned from recent
experience, the future is not a straightforward extrapolation of simple, single-domain
trends. We now have to consider ways in which the possibility of random, chaotic and
radically disruptive events may be factored into enterprise threat assessment and risk
management frameworks and incorporated into decision-making structures and processes.
Managers and organisations often aim to stay focused on a narrow range of perspectives
in dealing with key business issues, challenges and targets. Any such concentration of
focus or narrowing of outlook risks overlooking Weak Signals indicating potential issues
and events, agents and catalysts of change. Such Weak Signals - along with their
resultant Strong Indicators, Wild Card and Black Swan Events - represent an early
warning of radically disruptive future global transformations which are even now taking
shape at the very periphery and horizon of corporate insight, awareness, perception and
vision - or just beyond.
Randomness makes precise prediction of future outcomes impossible. We are unable to predict any
outcome with any significant degree of confidence or accuracy due to the inherent presence of
randomness and uncertainty associated with Complex Systems. Randomness in Complex Systems
introduces chaos and disorder, causing disruption. Events no longer continue along a predictable linear
course leading towards an inevitable outcome; instead, we experience surprises.
What we can do, however, is to identify the degree of uncertainty present in those Systems, based on
known, objective measures of System Order and Complexity - the number and nature of elements
present in the system, and the number and nature of relationships which exist between those System
elements. This in turn enables us to describe the risk associated with possible, probable and alternative
Scenarios, and thus equips us to be able to forecast risk and the probability of each of those future
Scenarios materialising.
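One standard, objective way to grade the uncertainty across a set of Scenarios, as described above, is Shannon entropy over their estimated probabilities. This is a generic information-theoretic sketch with invented figures, not a measure prescribed by the source:

```python
import math

def scenario_entropy(probabilities):
    """Shannon entropy (in bits) of a set of scenario probabilities.

    0.0 means one scenario is certain; log2(n) means all n scenarios
    are equally likely - maximum uncertainty."""
    assert abs(sum(probabilities) - 1.0) < 1e-9, "probabilities must sum to 1"
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# Hypothetical illustration: three forecast scenarios.
confident = scenario_entropy([0.90, 0.05, 0.05])   # one dominant scenario
uncertain = scenario_entropy([1/3, 1/3, 1/3])      # no dominant scenario
```

The more evenly spread the scenario probabilities, the higher the entropy - a simple proxy for the "degree of uncertainty" present in the system.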
If true randomness exists and future outcomes cannot be predicted, then what is the origin of
that randomness? For example, are unexpected outcomes simply apparent as a result of sub-
atomic randomness existing at the quantum level, such as uncertainty phenomena?
The Stephen Hawking Paradox postulates that uncertainty dominates complex and chaotic systems to
such an extent that future outcomes are both unknown - and unknowable. The working context of this
paradox is restricted, however, to the realm of Quantum Mechanics where each and every natural event
that occurs at the sub-atomic level is truly intrinsically and completely both symmetrical and random.
What is the explanation for randomness evident in all high-order phenomena
found in nature?
In order to obtain realistic glimpses into the Future, the major paradigm
differences between the Actual Reality that we experience every day and our
limited Systemic Models - which attempt to simplify, abstract and simulate reality -
must be clearly distinguished and understood.
When we design our Systemic Models representing Actual Reality - such as the
Economy, Geo-political systems, Climate Change, Weather and so on - then, if we
are lucky, some high-order phenomena found in nature may be captured by a
random rule; with even more luck, by a deterministic rule (which can be regarded
as a special case of randomness); but if we are unlucky, those rules might not be
captured at all. Regarding the nature of reality, it still remains unclear what
factors distinguish truly random phenomena found in nature at the Quantum level
(e.g. radioactive decay) from Random Events which are triggered by unseen forces.
Can we accept that these natural phenomena are not truly random at all - that is, given sufficient
information, such as complete event data sets, it is possible to predict random events? If so,
are all random events the result of the same natural phenomenon - unseen or hidden forces?
Classical (Newtonian) Physics describes the laws which govern all of the systems and objects that we are
familiar with in our everyday routine lives. Relativity Theory, on the other hand, describes unimaginably
large things, whilst Quantum Mechanics describes impossibly small things and Wave Theory (String
Theory) attempts to describe everything. True randomness does not really exist in Classical (Newtonian)
Physics - the laws which control Chaos and Complex Systems that govern every aspect of our life on Earth
today, from Natural Systems such as Cosmology, Astronomy, Climatology, Geology and Biology through to
Human Activity Systems such as Political, Economic and Sociological Complex Adaptive Systems (CAS).
Randomness is simply the result of those forces which are not known, not recognised, not understood, are
not under the control of the observer, or simply occur outside of the known boundaries of observable system
components - but which, nevertheless, must still exist and exert influence over the system. Over many System
Cycles, immeasurably small inputs interacting with Complex System components and relationships may
be amplified into extremely significant outputs.....
The role of time scale and preferred reference frame for economic observation is explored in
detail. For example, fundamental constraints on Friedman's rational arbitrageurs are re-
examined from the view of information ambiguity and dynamic instability. Alongside
Joseph Schumpeter's Economic Wave Series and Strauss and Howe's Generation Waves,
we also discuss Robert Bronson's SMECT Forecasting Model - which integrates both
Business and multiple Stock-Market Cycles into its structure.....
A tradition that begins with the classical Greek natural philosophers (circa 600 -
200 BC) and continues through contemporary science - holds that change and
the order of nature are the result of natural forces. What is the role of random,
stochastic processes in a universe that exhibits such order? When we examine
the heavens there seems to be a great deal of order to the appearance and
movement of the celestial bodies - galaxies, stars, planets, asteroids, etc.
Since the dawn of our species, humans have speculated on how these bodies
were formed and on the meaning of their movements. Most observations of
natural phenomena support the contention that nature is ordered. The force
that brought about this order differs depending upon the source of the historic
explanation of how this order came to be. For most of human history, supernatural
forces were credited with the imposition of order on nature.
Random Events
Random Processes
Random Processes may act upon or influence any natural and human phenomena: -
1. Philosophy - Human Knowledge: the Moral and Ethical basis of Human Understanding, Thoughts and Actions (Hellenic Philosophy - Aristotle, Ptolemy). Any apparent random behaviour is as a result of irrational human thoughts, emotions and actions.
2. Politics - Human Governance: the Political basis of Human Actions and Behaviour. The dual human emotions of Fear and Greed drive Politics, Society and Economics.
3. Economics - The Human Condition: the Economic basis of Human Actions and Behaviour.
4. Sociology - The Human Condition: the Social basis of Human Actions and Behaviour (Market Sentiment and Commodity / Financial Product Price Curves). All random behaviour is a result of irrational thoughts, emotions and actions.
5. Psychology - The Human Condition: the Biological basis of Human Understanding, Thought, Actions and Behaviour (Dementia, Psychosis, Mania, Melancholia).
Philosophy - the condition of Human Knowledge, Rationality, Logic and Wisdom
Human Knowledge - the Moral / Ethical basis of Human Understanding, Thought and Actions
Apparent randomness is a direct result of irrational human thoughts, emotions and actions.
Classical Mechanics (Newtonian Physics) - Everyday objects (Sir Isaac Newton)
Thermodynamics - Energy Systems: Entropy, Enthalpy (Newcomen, Trevithick, Watt, Stephenson)
Biology - Evolution (Darwin, Banks, Huxley, Krebs, Crick, Watson)
Chemistry - Molecules (Lavoisier, Priestley)
For each of these domains, any apparent randomness is as a result of Unknown Forces - internal or external - acting upon the System under observation.
Atomic Theory - Atoms (Max Planck, Niels Bohr, Bragg, Paul Dirac, Richard Feynman). Atomic events are intrinsically, truly, utterly and unerringly fully predictable (Dirac Equation).
Quantum Mechanics - Sub-atomic particles (Erwin Schrödinger, Werner Heisenberg, Albert Einstein, Hermann Minkowski). Each and every Quantum event is truly, intrinsically, absolutely and totally random / symmetrical (Hawking Paradox).
Thermodynamics
governs the flow of energy and the transformation (change in state) of systems
randomness, chaos and uncertainty are the result of the effects of Enthalpy and Entropy
Chemistry
Chemistry (Transformation) governs the change in state of atoms and molecules
any apparent randomness is as a result of unimaginably small, unobservable and
unmeasurable Unknown Forces - either internal or external - acting upon a System.
Biology
Biology (Ecology ) governs Evolution - the life and death of all living Organisms
any apparent randomness is as a result of unimaginably small, unobservable and
unmeasurable Unknown Forces - either internal or external - acting upon a System.
Domain, Scope / Scale, Randomness and Pioneers: -
Geology - The Earth, Planets, Planetoids, Asteroids, Meteors / Meteorites (Hutton, Lyell, Wegener). Any apparent randomness is as a result of Unknown Forces.
Astronomy - Common, familiar and observable nearby Celestial Objects (Galileo, Copernicus, Kepler, Lovell, Hubble). Any apparent randomness or asymmetry may be as a result of Quantum effects or other Unknown Forces acting early in the history of Space-Time.
Cosmology - Distant, super-massive Celestial Objects in the observable Universe (Hoyle, Ryle, Rees, Penrose, Bell-Burnell). Any apparent randomness or asymmetry may be as a result of interaction with Dark Matter, Dark Energy or Dark Flow.
Relativity Theory - The Universe (Albert Einstein, Hermann Minkowski, Stephen Hawking). Any apparent randomness or asymmetry is as a result of Unknown Forces / Dimensions.
Wave Mechanics (String Theory or Quantum Dynamics) - The Multiverse, Membranes and Hyperspace (Prof. Michael Green, Prof. Michio Kaku, Dr. Laura Mersini-Houghton). Any apparent randomness or asymmetry is as a result of the presence of nearby unknown Universes / Forces / Dimensions.
Atomic Theory
governs the behaviour of unimaginably small objects (atoms and sub-atomic particles)
all events are truly and intrinsically, utterly and unerringly predictable (Dirac Equation).
Quantum Mechanics
governs the behaviour of unimaginably tiny objects (fundamental sub-atomic particles)
all events are truly and intrinsically both symmetrical and random (Hawking Paradox).
Geology
Geology governs the behaviour of local Solar System Objects (such as The Earth, Planets,
Planetoids, Asteroids, Meteors / Meteorites) which populate the Solar System
any apparent randomness is as a result of unimaginably small, unobservable and
unmeasurable Unknown Forces - either internal or external - acting upon a System
Astronomy
Astronomy governs the behaviour of Common, Observable Celestial Objects (such as
Asteroids, Planets, Stars and Stellar Clusters) which populate and structure Galaxies
any apparent randomness or asymmetry is as a result of Quantum Effects, Unknown
Forces or Unknown Dimensions acting very early in the history of Universal Space-Time
Cosmology
Cosmology governs the behaviour of impossibly super-massive cosmic building blocks
(such as Galaxies and Galactic Clusters) which populate and structure the Universe
any apparent randomness or asymmetry is due to the influence of Quantum Effects,
Unknown Forces (Dark Matter, Dark Flow and Dark Energy) or Unknown Dimensions
Relativity Theory
Relativity Theory governs the behaviour of impossibly super-massive cosmic structures
(such as Galaxies and Galactic Clusters) which populate and structure the Universe
any apparent randomness or asymmetry is as a result of Quantum Effects, Unknown
Forces or Unknown Dimensions acting very early in the history of Universal Space-Time
Every item of Global Content that we find in the Present is somehow connected with both the
Past and the Future. Space-Time is a Dimension which flows in a single direction, as does a
River. Like water diverted along an alternative river channel, Space-Time does not flow
uniformly outside of the main channel: there could well be submerged objects (random
events) that disturb the passage of time, and may possess the potential capability of creating
unforeseen eddies, whirlpools and currents in the flow of Time (disorder and uncertainty),
which in turn possess the capacity to generate ripples and waves (chaos and disruption),
thus changing the course of the Space-Time continuum. Weak Signals are Ghosts in the
Machine of these subliminal temporal interactions, with the capability to contain information
about future Wild Card or Black Swan random events.
Weak Signals, Strong Signals, Wild Cards and Black Swan Events are a sequence of
waves that have a common source or origin and are linked and integrated in ascending order
of magnitude. They emanate either from a single Random Event instance or arise from a
linked series of chaotic and disruptive Random Events - an Event Storm. Signals from these
Random Events propagate through the space-time continuum as an integrated and related
series of waves with an ascending order of magnitude and impact. The first wave to arrive is
the fastest travelling - Weak Signals - something like a faint echo of a Random Event, which
may be followed in turn by a ripple (Strong Signals), then possibly by a wave (Wild
Card) - which may predicate a further increase in magnitude and intensity that finally arrives
as a catastrophically unfolding mega-wave - something like a tsunami (Black Swan Event).
The Drunkard's Walk:- How Randomness Rules Our Lives - Leonard Mlodinow
The Drunkard's Walk dives much deeper into the Nature of Randomness. This book is
different - it is natural for scientific books to discuss science but unusual for them to
contain highly readable prose and good humour, not to mention useful and practical
insights which help you to live your life with a greater understanding of chaotic effects in the
world about you. The book's major weakness is that it comes up short on fundamental
explanations of Chaos, Disruption, Complexity and Randomness. Mlodinow simply
advises readers to "be aware" and "conscious" of how important randomness is.
Stochastic Processes
Random Event Sequences
The Nature of Randomness - Uncertainty, Disorder and Chaos
Physical Systems and Mechanical Processes: -
Classical (Newtonian) Physics - apparent randomness is as a result of Unknown Forces
Thermodynamics - randomness is a direct result of Entropy and Enthalpy (Disorder and Chaos)
Relativity Theory - any apparent randomness or asymmetry is as a result of Quantum effects
Quantum Mechanics - all events are truly and intrinsically both symmetrical and random
Quantum Dynamics - randomness and asymmetry is as a result of Unknown Dimensions
Wave (String) Theory - randomness and asymmetry is as a result of Unknown Membranes
Random Event Clustering
Patterns in the Chaos
The Nature of Uncertainty and Randomness
Physical and Mechanical Processes: -
Thermodynamics (Complexity + Chaos Theory) governs the behaviour of Energetic Systems
Classical Mechanics (Newtonian Physics) governs the behaviour of all everyday objects
Quantum Mechanics governs the behaviour of unimaginably small sub-atomic particles
Relativity Theory governs the behaviour of impossibly super-massive cosmic structures
Quantum Dynamics (String Theory) governs the interaction of Membranes in Hyperspace
Wave Mechanics (String Theory) integrates the behaviour of every size and type of object
Random Event Clustering - Patterns in the Chaos.....
Order out of Chaos - Patterns in the Randomness
There is an interesting phenomenon called Phase Locking where two loosely coupled systems with slightly
different frequencies show a tendency to move into resonance in order to harmonise with one another. We
also know that the opposite of system convergence - system divergence - is also possible with phase-locked
systems, which can also diverge with only very tiny inputs - especially if we run those systems in reverse.
Thus phase locking draws two nearly harmonic systems into resonance and gives us the appearance of a
coincidence. There are, however, no coincidences in Physics. Sensitive Dependence in Complexity Theory
also tells us that minute, imperceptible changes to inputs at the initial state of a system, at the beginning of a
cycle, are sufficient to dramatically alter the final state after even only a few iterations of the system cycle.
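Sensitive Dependence, as described above, is easy to demonstrate numerically with the logistic map in its chaotic regime - a standard textbook illustration, not an example drawn from the source:

```python
def logistic_orbit(x0, r=4.0, steps=50):
    """Iterate the logistic map x -> r * x * (1 - x); r = 4.0 is fully chaotic."""
    orbit = [x0]
    for _ in range(steps):
        orbit.append(r * orbit[-1] * (1 - orbit[-1]))
    return orbit

# Two initial states differing by one part in a billion.
a = logistic_orbit(0.200000000)
b = logistic_orbit(0.200000001)

# The tiny input difference is amplified over successive system cycles.
max_gap = max(abs(x - y) for x, y in zip(a, b))
```

After only a few dozen iterations the two trajectories bear no resemblance to one another, even though their starting points were practically indistinguishable.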
The discovery of Chaos and Complexity has increased our understanding of the Cosmos and its effect
on us. If you surf the chaos content regions of the internet, you will invariably encounter terms such as
the "strange attractor". These influences can take some time to manifest themselves, but that is the
nature of the phenomenon. Such differences could be small to the point of invisibility - how tiny
can influences be and still have any effect? This is captured in the butterfly scenario described below.
Complex Systems and Chaos Theory
Weaver (Complexity Theory), along with Gleick and Lorenz (Chaos Theory), have
given us some of the tools that we need to understand these complex, interrelated,
chaotic and radically disruptive political, economic and social events - such as the
collapse of Global markets and the various protests against this - using Event
Decomposition, Complexity Mapping and Statistical Analysis to help us identify
patterns, extrapolations, scenarios and trends unfolding as seemingly unrelated,
random and chaotic events. The Hawking Paradox, however, challenges this view of
Complex Systems by postulating that uncertainty dominates complex, chaotic systems
to such an extent that future outcomes are both unknown - and unknowable.
Complex Systems and Chaos Theory have been used extensively in the fields of Futures Studies, Strategic
Management, the Natural Sciences and Behavioural Science. They are applied in these domains to understand
how individuals or populations, societies and states act as a collection of systems which adapt to changing
environments - bio-ecological, socio-economic or geo-political. The theory treats individuals, crowds and
populations as a collective of pervasive social structures which are influenced by random individual
behaviours such as flocks of birds moving together in flight to avoid collision, shoals of fish forming a bait
ball in response to predation, or groups of individuals coordinating their behaviour in order to exploit novel
and unexpected opportunities which have been discovered or presented to them.
When Systems demonstrate properties of Complex Adaptive Systems (CAS) - which is often defined as
consisting of a small number of relatively simple and loosely connected systems - then they are much more
likely to adapt to their environment and, thus, survive the impact of change and random events. Complexity
Theory thinking has been present in strategic and organisational studies since the first inception of Complex
Adaptive Systems (CAS) as an academic discipline.
Complex Adaptive Systems are further contrasted with other ordered and chaotic systems by
the relationship that exists between the system and the agents and catalysts of change which act upon it. In
an ordered system the level of constraint means that all agent behaviour is limited to the rules of the system.
In a chaotic system these agents are unconstrained and are capable of random events, uncertainty and
disruption. In a CAS, both the system and the agents co-evolve together; the system acts to lightly
constrain the agents' behaviour, while the agents of change modify the system through their interaction. CAS
approaches to behavioural science seek to understand both the nature of system constraints and change
agent interactions and generally takes an evolutionary or naturalistic approach to crowd scenario planning
and impact analysis.
Hertzsprung-Russell
The Hertzsprung-Russell diagram is a scatter-plot Cluster Diagram which shows the Main
Sequence Stellar Lifecycles. It demonstrates the relationship between a star's temperature
and luminosity over time, using a red-to-blue colour code to indicate the mean temperature
at the surface of the star.
Star Clusters
New and improved understanding of star-cluster physics brings us within reach of
answering a number of fundamental questions in astrophysics, ranging from the formation
and evolution of galaxies to intimate details of the star-formation process itself.
The physics of star clustering leads us to new questions related to the make-up of stellar
clusters and galaxies, stellar populations in different types of galaxy, and the relationships
between high-stellar populations and local clusters - overall, resolved and unresolved - and
the implications for their relative formation times and galactic star-formation histories.
A warp brings two discrete points from different Hyperspace Planes close
enough together to allow a Hyperspace Jump. Over any given time interval -
multiple Hyperspace Planes stack up on top of each other to create a time-line
which extends along the temporal axis of the Minkowski Space-Time Continuum.
In order to obtain realistic glimpses into the Future, the major paradigm
differences between our limited Systemic Models - which attempt to simplify,
abstract and simulate reality - and the Actual Reality that we experience every
day must be clearly distinguished, differentiated and understood.
That difference is Randomness - bringing Uncertainty, Disorder and Chaos.
[Figure: a two-dimensional random walk - the pattern sometimes referred to as the
drunkard's walk. The intersecting lines at the top and the right of the picture are
Cartesian coordinates and mark the origin, where X = 0 and Y = 0.]
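The walk in the figure can be reproduced in a few lines. This is a generic random-walk sketch using the usual textbook step rule, seeded for repeatability; it is not an implementation taken from the source:

```python
import random

def drunkards_walk(steps, seed=42):
    """2-D drunkard's walk from the origin (X = 0, Y = 0): each step moves
    one unit north, south, east or west with equal probability."""
    rng = random.Random(seed)
    x = y = 0
    path = [(x, y)]
    for _ in range(steps):
        dx, dy = rng.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
        x, y = x + dx, y + dy
        path.append((x, y))
    return path

path = drunkards_walk(1000)
```

After n steps the walker's expected distance from the origin grows only like the square root of n: small random inputs accumulate, but without any overall drift.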
Weak Signals, Strong Signals, Wild Cards and Black Swan Events are a
sequence of linked and integrated waves in ascending order of magnitude, which
have a common source or origin - either a single Random Event instance or a
linked series of chaotic and disruptive Random Events - an Event Storm. These
Random Events propagate through the space-time continuum as a related and
integrated series of waves with an ascending order of magnitude and impact. The
first wave to arrive is the fastest travelling - Weak Signals - something like a faint
echo of the causal Random Event. This may be followed in turn by a ripple
(Strong Signals), then possibly by a wave (Wild Card) - which could indicate a
further increase in magnitude and intensity, suddenly and catastrophically
arriving - something like a tsunami (Black Swan Event).
Temporal Disturbances in the Space-Time Continuum
Perhaps some of the different Wave Types - Weak Signals, Wild Cards and
Black Swan Events - can travel faster or take a different route compared with
the other types, perhaps because their Wave forms can propagate through the
Space-Time Matrix (which is made up of dark matter, dark energy and dark
flow) more rapidly than the other Wave forms - or perhaps they are different
types of Wave, and specific Wave Types may be able to take a short-cut
between two points on different Hyperspace Planes and so arrive sooner.
It is possible that certain types of Random Event may be able to bend the Space-
Time continuum to bring two discrete points on different Hyperspace Planes
closer together, and so take a short-cut over a time interval extended through a
time-line flowing along the Time axis of the Minkowski Space-Time Continuum.
Biological, Sociological, Economic and Political systems all tend to demonstrate
Complex Adaptive System (CAS) behaviour - which appears to be more similar
in nature to biological behaviour in a population than to truly Disorderly, Chaotic,
Stochastic (Random) Systems. For example, the remarkable long-term
adaptability, stability and resilience of market economies may be demonstrated by
the impact of Black Swan Events causing stock market crashes - such as oil price
shocks (1970-72) and credit supply shocks (1927-1929 and 2008 onwards) - and by
the ability of Financial markets to rapidly absorb and recover from these events.
Unexpected and surprising Cycle Pattern changes have historically occurred during
regional and global conflicts fuelled by technology innovation-driven arms
races - and also during US Republican administrations (Reagan and Bush - why?).
Just as advances in electron microscopy have revolutionised the science of biology
- non-stationary time series wave-form analysis has opened up a new space for
Biological, Sociological, Economic and Political system studies and diagnostics.
In any crowd of human beings or a swarm of animals, individuals are so
closely connected that they share the same mood and emotions (fear, greed,
rage) and demonstrate the same or very similar behaviour (fight, flee or
feeding frenzy). Only the first few individuals exposed to the Causal Event or
incident may at first respond strongly and directly to the initial trigger
stimulus, causal event or incident (opportunity or threat such as external
predation, aggression or discovery of a novel or unexpected opportunity to
satisfy a basic need such as feeding, reproduction or territorialism).
Those individuals who have been directly exposed to the initial trigger event
or incident - the system input or causal event that initiated a specific outbreak
of behaviour in a crowd or swarm - quickly communicate and propagate their
swarm response, sharing it with the members of the Crowd immediately next
to them, so that the modified Crowd behaviour quickly spreads inward from the
periphery or edge of the Crowd.
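This peripheral spread can be sketched as a minimal contagion model - a toy of my own construction, not a model given in the source - in which agents in a line adopt a behaviour as soon as a neighbour exhibits it:

```python
def spread_behaviour(crowd_size, trigger_index):
    """Toy crowd-contagion model: at each step, every agent adjacent to an
    activated agent adopts the behaviour. Returns the number of steps
    until the whole crowd is activated."""
    activated = [False] * crowd_size
    activated[trigger_index] = True   # the individual first exposed to the trigger
    steps = 0
    while not all(activated):
        newly = [i for i in range(crowd_size)
                 if not activated[i] and (
                     (i > 0 and activated[i - 1]) or
                     (i + 1 < crowd_size and activated[i + 1]))]
        for i in newly:
            activated[i] = True
        steps += 1
    return steps
```

Triggered at one edge of a 100-agent line, the behaviour takes 99 steps to reach the far side; triggered at the centre, roughly half that - the behaviour radiates outward from the point of exposure.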
Weak Signals indicate possible future transformations and changes which are
happening right now, on or even just beyond the visible horizon, predicating
changes in how we do business, what business we do, and the future
environment in which we will all live and work.
Weak Signals are messages from the future - subliminal temporal indicators
of change (Random Events) coming to meet us from the distant horizon:
perhaps indicators of novel and emerging desires, thoughts, ideas, influences,
patterns and trends which may arrive to interact with both current and historic
waves, patterns and trends to alter, enhance, impact or effect future outcomes
and events - or simply some future change taking place in the current
environment in which we all share our life experiences.....
Weak Signals and Wild Cards
[Figure: the Weak Signal / Wild Card Signal Processing lifecycle - Scan and Identify,
Discover, Track and Monitor, Evaluate, Investigate, Understand, Communicate, Publish
and Socialise - applied to Random Events and the ascending signal chain: Weak Signal,
Strong Signal, Wild Card, Black Swan Event.]
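The ascending signal chain can be expressed as a simple classification step within such a processing pipeline. The numeric magnitude thresholds below are invented purely for illustration, since the source defines no quantitative scale:

```python
# Hypothetical magnitude thresholds - the source defines no numeric scale.
SIGNAL_CLASSES = [
    (1.0, "Weak Signal"),
    (2.0, "Strong Signal"),
    (3.0, "Wild Card"),
    (float("inf"), "Black Swan Event"),
]

def classify_signal(magnitude):
    """Map a Random Event's magnitude onto the ascending signal chain."""
    for upper_bound, label in SIGNAL_CLASSES:
        if magnitude < upper_bound:
            return label
    return SIGNAL_CLASSES[-1][1]   # only an infinite magnitude falls through

# An 'Event Storm': one source emitting waves of ascending magnitude.
storm = [0.3, 1.4, 2.2, 8.0]
labels = [classify_signal(m) for m in storm]
```

Here a single Event Storm produces the full ascending sequence, from the faint echo to the tsunami.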
Weak Signals
Weak Signal is a descriptor for an unusual and unexpected message from the future - faint and subliminal - predicating a forthcoming Random Event. A Weak Signal is a sign indicating either a possible future outcome or a random event which has not been forecast or anticipated (either because it seemed unlikely, or because no-one had even thought about it) - but which may indicate some future extreme and far-reaching impact or effect.
1. SURPRISE - Weak Signals are a sudden and unexpected surprise to the observer.
3. SPEED - Weak Signals appear out of nowhere, then either disperse or become stronger.
4. DUALITY OF NATURE - Weak Signals may indicate a possible future serious challenge or threat, or reveal to the observer a future novel and unexpected window of opportunity.
5. PARADOX - Weak Signals could or should have been picked up and recognised at their first appearance - if detected against the overwhelming foreground and background noise, then identified, analysed and correctly accounted for.
Weak Signals
Weak Signals are messages - subliminal temporal indicators of ideas, patterns or trends coming to meet us from the future. They are perhaps indicators of novel and emerging ideas, influences and messages which may interact with both current and pre-existing patterns and trends to impact or affect some change taking place in our current environment - even an early warning or sign of impending random events, disasters or catastrophes which, at some point, time or place in the future, may predicate, influence or impact on future events, objects or processes, effecting subtle, minor or major changes in how we live, work and play - or even threatening the very existence of the world as we know it today.
A Weak Signal is an early warning or sign of change, which typically becomes stronger by combining with other signals. The significance of a weak future signal is determined by the nature and content of the message it contains - predicating positive or negative change - and by the scope and objectives of its recipient. Finding Weak Signals typically requires systematic searching through Big Data - internet content, news feeds, data streams, academic papers and scientific research data sets. A weak future signal requires: i) support, ii) critical mass, iii) growth of its influence space, and iv) dedicated actors, i.e. the champions, in order to become a strong future signal - otherwise Weak Signals evaporate or disappear into the ether. A Weak Future Signal is usually first recognised by research pioneers, think tanks or special interest groups (amateur astronomers and comets) but very often missed or dismissed by acknowledged mainstream subject matter experts.
Weak Signals
Weak Signals refer to Weak Future Signals in Horizon and Environment Scanning - signals of any unforeseen, sudden and extreme Global-level transformation or change in future events in the military, political, social, economic or environmental landscape, some having an inordinately low probability of occurrence - coupled with an extraordinarily high impact when they do occur.
2. The early information which can be assimilated from Random Events - Weak Signals, Strong Signals, Wild Cards and Black Swan Events arrive in an integrated Wave Series (ANSOFF, 1975) - has little internal structure or reference, so it cannot be described or defined in advance of receiving those very first Weak Signals (MARCH and FELDMAN, 1981).
3. The stochastic, hybrid, cross-functional and probabilistic nature of Weak Signals limits the impact, relevance and application of deterministic, prescriptive methods and approaches, and precludes rigid, inflexible algorithm-based expert-systems approaches (GOSHAL and KIM, 1986).
4. In strategic decision making, the uniqueness in the form and function of Weak Signals, Strong Signals, Wild Cards and Black Swan Events implies the use of flexible approaches and solutions based on Probabilistic Methods - including cognitive filtering, bounded rationality, fuzzy logic, approximate reasoning, neural networks and adaptive systems (SIMON, 1983).
5. The random and ethereal nature of the Horizon and Environment Scanning, Tracking and Monitoring process involves dependence - strange actors, clustering, numerous elements and complex interactions - and requires very large scale (VLS) computing and BIG DATA Analytics techniques to reliably and accurately discover, identify, classify and interpret Weak Signals.
Weak Signals
6. Neural Networks and Complex / Adaptive / Learning System Models combined with BIG DATA
methods are therefore likely to be the most successful and appropriate technology approaches for
executing both Horizon and Environment Scanning, Tracking and Monitoring studies.
7. A major component of the process of Horizon and Environment Scanning, Tracking and Monitoring is performed either by horizon or environment scanners who capture weak signals hidden within massive amounts of external raw data, or by data scientists using BIG DATA content techniques for data analysis - washing and mashing, racking and stacking.
8. A Weak Future Signal is an early warning of change, which typically becomes stronger by combining with other signals. The significance of a weak future signal is determined by the objectives of its recipient, and finding it typically requires systematic searching. A weak future signal requires: i) support, ii) critical mass, iii) growth of its influence space, and iv) dedicated actors, i.e. the champions, in order to become a strong future signal, or to prevent itself from becoming a strong negative signal. A Weak Future Signal is often recognised by pioneers or special groups - not by acknowledged subject matter experts.
9. Weak Future Signal Event Types refer to subliminal indications of future unforeseen, sudden and extreme Global-level transformation or change - Weak Signal Event Types in the military, political, social, economic or environmental landscape having an inordinately low probability of occurrence, coupled with an extraordinarily high impact when they do occur.
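The "systematic searching" idea in points 7 and 8 can be illustrated with a toy filter. This is an assumption-laden sketch, not the author's method: it flags terms that recur in an incoming text stream but are rare in a background corpus - a crude stand-in for separating weak-signal candidates from noise. The documents and frequency values are invented for illustration.

```python
# Illustrative sketch (not from the source): a naive horizon-scanning
# filter that flags weak-signal candidate terms in a text stream by
# checking them against background-corpus frequencies.
from collections import Counter

def weak_signal_candidates(documents, background_freq, rarity=0.001, min_hits=2):
    """Return terms that recur in the stream but are rare in the
    background corpus - a crude stand-in for weak-signal detection."""
    hits = Counter()
    for doc in documents:
        # Count each term at most once per document.
        for term in set(doc.lower().split()):
            if background_freq.get(term, 0.0) < rarity:
                hits[term] += 1
    return [t for t, n in hits.items() if n >= min_hits]

# Hypothetical stream and background frequencies (invented values).
docs = ["graphene battery breakthrough reported",
        "graphene anode pilot line announced",
        "quarterly earnings meet expectations"]
background = {"quarterly": 0.02, "earnings": 0.03, "reported": 0.05,
              "battery": 0.004, "meet": 0.05, "expectations": 0.03,
              "breakthrough": 0.002, "announced": 0.05,
              "anode": 0.0005, "pilot": 0.01, "line": 0.05}
print(weak_signal_candidates(docs, background))  # prints "['graphene']"
```

Real scanning pipelines would of course work over news feeds and research data sets at Big Data scale; the rarity threshold here plays the role of the "white noise" floor described in the text.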
Weak Signals
Weak Signal Properties - different views and viewpoints
1. Nature - Weak Signals are subtle indicators of ideas, patterns or trends that give us a glimpse into the future, predicating possible future transformations and changes which are happening on or even just over the visible horizon: changes in how we do business, what business we do, and the future environment in which we will all live and work.
2. Quality - Weak Signals may be novel and surprising from the signal analyst's vantage point - although many other signal analysts may already have failed to recognise, misinterpreted or dismissed the same Weak Signals.
3. Purpose - Weak Signals are used for Horizon Scanning, Tracking and Monitoring, and for Future Analysis and Management.
4. Source - Weak Signals, Strong Signals, Wild Cards and Black Swan Events are a sequence of waves linked and integrated in ascending order of magnitude, which have a common source or origin - either a single Random Event instance, or arising from a linked series of chaotic and disruptive Random Events generating Weak Signals from a Random Event Cluster or Random Event Storm.
Weak Signals
Weak Signal Properties - different views and viewpoints
5. Wave-form Analytics and Big Data (Global Internet Content) - Wave-form Analytics may be used with Big Data to analyse how Random Events propagate through the space-time continuum in a related and integrated series of waves of ascending order of magnitude and impact. The first wave to arrive is the fastest travelling - Weak Signals - something like a faint echo of a Random Event, which may be followed in turn by a ripple (Strong Signals), then possibly by a wave (Wild Card) indicating an unfolding increase in magnitude and intensity, and finally by something arriving catastrophically - like a tsunami (Black Swan Event).
6. Identification - Weak Signals are sometimes difficult to track down, receive, tune in, identify, amplify and analyse amid the overwhelming volume of white noise from stronger signals and other foreground and background noise sources.
7. Principle of Dual Nature (possibility of either an Opportunity or Threat) - Weak Signals may indicate either a potential future threat or opportunity to yourself or your organisation, or foretell the pending arrival of a future advantage or reversal - a Wild Card or Black Swan Event.
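The ascending wave series described above (echo, ripple, wave, tsunami) can be expressed as a simple classification over a notional magnitude score. This is purely illustrative: the threshold values are hypothetical and not given anywhere in the source.

```python
# Illustrative sketch: mapping a notional event magnitude onto the
# slide's wave sequence. Threshold values are hypothetical.
def classify_wave(magnitude: float) -> str:
    """Return the wave category for a notional magnitude score."""
    if magnitude < 2.0:
        return "Weak Signal (echo)"
    if magnitude < 5.0:
        return "Strong Signal (ripple)"
    if magnitude < 8.0:
        return "Wild Card (wave)"
    return "Black Swan Event (tsunami)"

for m in (1.0, 4.0, 7.0, 9.5):
    print(m, classify_wave(m))
```

The ordering matters more than the numbers: each category is strictly larger in magnitude and impact than the one before it, matching the echo-to-tsunami metaphor in row 5.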
Weak Signals and Wild Cards
Wild Card or "Black Swan" manifestations are extreme and unexpected
events which have a very low probability of occurrence, but an inordinately
high impact when they do happen. Trend-making and Trend-breaking agents
or catalysts of change may predicate, influence or cause wild card events
which are very hard - or even impossible - to anticipate, forecast or predict.
1. SURPRISE - Strong Signals are a complete and unexpected surprise to the observer.
3. SPEED - Strong Signals appear out of nowhere, then either disperse or magnify.
5. PARADOX - Strong Signals are rationalised by hindsight, as at their first appearance they
could or should have been foreseen had the relevant Weak Signals been available and
detected in the background noise, identified correctly, analysed and accounted for.
Strong Signals
Strong Signals represent the first clear and visible presence of a Random Event - the secondary arrival of stronger but slower-travelling waves containing more information on possible, probable and alternative future events: random events, future catastrophes, or indications of novel and emerging ideas, influences and messages which may interact with both current and pre-existing patterns and trends to impact or affect some change taking place in our environment at some point, time or place in the future - for example, what future climatic and ecological environment we will live, work and play in, what political, social and economic environment we will live, work and play in, how we live, work and play, what business we do, how we do business, and who we do business with.
1. Strong Signals may demonstrate a substantial lag time before they follow their preceding indicators, the prior Weak Signals.
2. Strong Signals may contain confirmation about future events - random events, catastrophes, or indications of novel and emerging ideas, influences and messages. They therefore present a second potential window of opportunity if the first Weak Signals in the series were undetected, overlooked or dismissed.
3. Strong Signals arrive, become established, develop, grow and mature - then peak, plateau, decline and collapse, or interact with current and pre-existing extrapolations, patterns or trends which act to transform or change the current outlook or landscape.
Strong Signals
Strong Signal Properties - different views and viewpoints
1. Nature - Strong Signals follow Weak Signals to give a more clear and apparent indication of ideas, patterns or trends that provide us with a stronger and more lasting glimpse into the future, predicating probable future transformations and changes which are happening on or even just over the visible horizon: changes in how we do business, what business we do, and the future environment in which we will all live and work.
2. Purpose - Strong Signals are used in Horizon Scanning, Tracking and Monitoring - for Strategy Analysis and Strategy Management, Future Analysis and Future Management.
3. Source - Weak Signals, Strong Signals (second in the sequence), Wild Cards and Black Swan Events are a linked sequence of integrated waves in a timeline and ascending order of magnitude, which have a common source or origin - either a single Random Event instance, or arising from a linked series of chaotic and disruptive Random Events creating a Random Event Cluster or Random Event Storm.
1. SURPRISE - Wild Card Events are a complete and totally unexpected surprise to the observer - the scale of the event falling well outside the realm of previous experience.
3. SPEED - Wild Card Events appear out of nowhere then unfold with speed and rapidity.
4. DUALITY OF NATURE - Wild Card Events may represent either a potentially serious
challenge or threat or present the observer with a novel and unexpected opportunity.
5. PARADOX - Wild Card Events are rationalised by hindsight, as at their first appearance
they could or should have been foreseen had the relevant Weak Signals been available
and detected in the background noise, identified correctly, analysed and accounted for.
Wild Card Events
Definition of a Wild Card Event
A Wild Card Event is a surprise - an event or occurrence that deviates outside of what would normally be expected of any given situation or set of circumstances, and which would therefore be difficult to anticipate or predict. The term was coined by Stephen Aguilar-Milan in the 1960s and popularised by Ansoff in the 1970s. Wild Card Events are any unforeseen, sudden and unexpected change events or transformation scenarios which occur within the military, political, social, economic or environmental landscape - having a low probability of occurrence, coupled with a high impact when they do occur (Stephen Aguilar-Milan):
4. Identification - Wild Cards are much easier to recognise than Weak Signals and Strong Signals, standing out above the background of white noise and other signals from foreground and background noise sources.
5. Perception - Whereas Weak Signals and even Strong Signals are often missed, dismissed or scoffed at by Subject Matter Experts, Wild Card events are almost universally recognised and accepted.
6. Opportunity - Wild Cards bring realisation of startling new events, novel and emerging ideas, influences and messages - they therefore represent a third and final window of potential opportunity.
7. Quality - Weak Signals and even Strong Signals may be novel and surprising from the signal analyst's vantage point; Wild Cards, however, cannot be so easily dismissed. Many other signal analysts may now join in to confirm and support the content of such Wild Cards.
9. Timing - Wild Cards may demonstrate a substantial lag time before they follow their preceding indicators - the prior Weak Signals and their followers, the Strong Signals.
Wild Cards
Climate and Environmental Agents & Catalysts of Change impact on Human Futures
For most of human existence our ancestors led precarious lives as scavengers, hunters and gatherers, and there were fewer than 10 million human beings on Earth at any one time. Today, many of our cities have more than 10 million inhabitants each - as global human populations continue to grow unchecked. The total global human population stands today at 7 billion - with as many as three billion more people on the planet by 2050.
Human Activity Cycles - Business, Social, Political, Economic, Historic and Pre-historic (Archaeology) Waves - may be compatible with, and map onto, one or more Natural Cycles - Orbital, Climate and so on. Current trends in Human Population Growth are unsustainable: we are already beginning to run out of Food, Energy and Water (FEW), which will first limit, then reverse human population growth - falling below 1 billion by 2060?
Over the long term, ecological stability and sustainability will be preserved, but at the expense of the continued, unchecked growth of human populations. Global population will rise to 10 billion by 2040, followed by a massive population collapse to under 1 billion - recovering to 1 billion by the end of the 21st century. There are eight major threats to Human Society: Chill, Grill, Ill, Kill, Nil, Spill, Thrill and Till.
Environmental Wild Card Event Types
Event Type / Force / Environmental Black Swan Event
1. Natural Disasters & Catastrophes (Natural Forces) - Natural disasters occur when extreme-magnitude events of stochastic natural processes cause severe damage to human society. "Catastrophe" is used of an extreme disaster, although originally both terms referred only to extreme events (disaster is from the Latin, catastrophe from the Ancient Greek).
4. Impact Event (Gravity) - Asteroid or comet impact: the odds of an asteroid or comet impact on the Earth depend on the size of the object. An object approximately 15 feet in diameter hits the Earth once every several months; 35 feet, every 10 years; 60 feet, every 100 years; 200 feet (the size of the Tunguska impactor), every 200 years; 350 feet, every several thousand years; 1,000 feet, every 50,000 years; six tenths of a mile, every 500,000 years; and 5 to 6 miles across, every 100 million years.
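The impact-frequency figures quoted above can be restated as a small lookup table, with a naive conversion from mean recurrence interval to annual probability. The table values are transcribed from the text; the two "several months" and "several thousand years" entries are approximated here as 0.25 and 3,000 years, which is an assumption.

```python
# Lookup table for the quoted impact recurrence intervals
# (sizes as labelled in the text; recurrence in years; the 0.25 and
# 3,000-year entries approximate "several months"/"several thousand years").
IMPACT_RECURRENCE_YEARS = {
    "15 ft": 0.25,
    "35 ft": 10,
    "60 ft": 100,
    "200 ft (Tunguska)": 200,
    "350 ft": 3_000,
    "1,000 ft": 50_000,
    "0.6 mi": 500_000,
    "5-6 mi": 100_000_000,
}

def annual_probability(size: str) -> float:
    """Crude annual probability of an impact of this size class,
    treating the quoted mean interval as a Poisson-like rate."""
    return 1.0 / IMPACT_RECURRENCE_YEARS[size]

print(f"{annual_probability('200 ft (Tunguska)'):.3f}")  # prints "0.005"
```

A Tunguska-scale event at one per 200 years is thus a roughly 0.5% chance in any given year - rare on a human timescale, near-certain on a civilisational one, which is exactly the low-probability / high-impact profile the text attributes to Wild Cards.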
5. Thermal Process (Geo-Thermal Energy) - Spill Moments: local and regional natural disasters, e.g. andesitic volcanic eruption at tectonic plate margins - for example, the Vesuvius eruption and ash cloud destroying the Roman cities of Herculaneum and Pompeii; volcanic eruption / collapse causing landslides and tsunamis - the Thera eruption / collapse fatally weakening the Minoan Civilisation on Crete, and the Krakatau eruption in the 19th century causing Indonesian tsunamis; and ocean-floor sediment slips causing in recent years the Pacific / Indian Ocean and Japanese tsunamis, resulting in coastal flooding, inundation and widespread destruction.
6. Climate Change (Human Activity) - Melting of the polar ice-caps and rising sea levels, combined with increased severity and frequency of extreme weather events (El Nino and La Nina), have already begun to threaten low-lying coastal cities (New Orleans, Brisbane). By 2040, a combination of rising sea levels, storm surges of increased intensity and duration, and flash floods will cause flooding much more often. Coasts, deltas, estuaries and river valleys will flood up to 90 km inland from the present coast, frequently drowning many of the major cities along with much of our most productive agricultural land, and washing away homes and soil in the process. Human population drift to cities and urbanisation also drives the destruction of prime arable land as it is gobbled up by developers to build even more cities.
Liquid water, melted by warm air at the surface of a glacier, runs down sink-holes to the glacier base, where it lubricates the rock / glacier interface, causing glacier flow surges up to 20 times the normal flow-rate. Increased glacial flow-rate is usually further aided by the loss of sea pack ice - which acts to moderate glacier flow during cold periods - due to oceanic temperature rise (oceanic climate forcing). This scenario does not satisfy the timing requirements of climate change events which occur at the culmination of a Bond Cycle, believed to be an oceanic climate forcing phenomenon. It does, however, fit well with the rapid rise in temperature that occurs at the beginning of the next Bond Cycle, taking only a few decades after the culmination of the previous Bond Cycle.
Wild card Events
Type / Force / Black Swan Event
7. Climate Change Event (Solar Forcing) - Dansgaard-Oeschger and Bond Cycles - oceanic climate forcing cycles consisting of episodes of rapid warming followed by slow cooling - have been traced and plotted over the last 26 cycles (40,000 years) with metronomic precision, at an exact 1,490-year periodicity. Solar orbital cycle variations with periodicities from 20,000 to 400,000 years have also been traced and plotted over many cycles (tens of millions of years), again with metronomic regularity. These longer-scale Milankovitch Cycles are responsible for Pluvial and Inter-pluvial episodes (Ice Ages) during the Quaternary period - due to orbital variation causing changes in solar climate forcing.
Global warming - Human activity has been largely held responsible for the Earth getting warmer every decade for the last two hundred years, and the rate of warming has accelerated over the last few decades. The Earth could eventually wind up like its greenhouse sister, Venus. Grill - rapidly rising temperatures, such as those found in Ice Age Inter-Glacial episodes (Inter-pluvial Periods), precipitating environmental and ecological change under heat stress and drought - causing the disappearance of the Neanderthal, Solutrean and Clovis cultures, with deforestation, desertification and drying driving the migration or disappearance of the Anasazi in SW America - along with the Sahara Desert migrating south and impacting on Sub-Saharan cultures.
Global cooling - The Earth has dramatically cooled and plunged into Ice Ages on many occasions throughout geological history; the Earth might eventually change to resemble its frozen sister, Mars. Chill - rapid cooling, e.g. Ice Age Glaciations (Pluvial Periods), causing the depopulation of Northern Europe in early hominid Eolithic times, and the impact of the medieval mini Ice Age on Danish settlers in Greenland.
Wild Card Event Types
Type / Force / Wild Card Event
5. Global Massive Change Event (Human Impact on Eco-system) - FEW (Food, Energy, Water) Crisis: as scarcity of natural resources (FEW - Food, Energy, Water) and increased competition to obtain those scarce resources begins to limit and then reverse population growth, global population levels will continue expanding towards an estimated 8 or 9 billion human beings by the middle of this century - then collapse catastrophically to below 1 billion, slowly recovering and stabilising again at a sustainable population of about 1 billion human beings by the end of this century.
Till Moments - Society's growth-associated impacts on its own ecological and environmental support systems: for example, intensive agriculture causing exhaustion of natural resources by the Mayan and Khmer cultures; deforestation and over-grazing causing catastrophic ecological damage and resulting in climatic change - for example, the Easter Island culture; the depopulation of upland moors and highlands in Britain from the Iron Age onwards, including the Iron Age retreat from the northern and southern English uplands, the Scottish Highland Clearances and the replacement of subsistence crofting by deer and grouse for hunting and sheep for wool on the major Scottish Highland Estates; and the current sub-Saharan deforestation and subsequent desertification by semi-nomadic pastoralists. Like Samson, will we use our strength to bring down the temple? Or, like Solomon, will we have the wisdom to match our technology?
Wild Card Event Types
Type / Force / Wild Card Event
8. Alien Contact Event (Biological Disease) - Ill Moments: contact with a foreign population or alien civilisation and their bio-cloud, bringing along with them their own parasite burden and contagious diseases (viruses and bacteria) - leading to pandemics to which the exposed human population has developed little or no immunity or treatment. Examples include the Bubonic Plague - the Black Death - arriving in Europe in ships from Asia; Spanish explorers sailing up the Amazon and spreading smallpox to the Amazonian Basin Indians of the Dark Earth - Terra Preta - Culture; Columbian sailors returning to Europe introducing syphilis from the New World; and the Spanish Flu pandemic carried home by returning soldiers at the end of the Great War - which killed more people than did all the military action during the whole of WWI.
9. Alien Contact Event (Biological Predation) - Kill Moments: invasion, conquest and genocide by a civilisation with superior technology, e.g. the Roman conquest of the Celtic tribes in Western Europe, William the Conqueror's Harrying of the North in England, Spanish conquistadores meeting the Aztecs and Amazonian Indians in Central and South America, and Cowboys v. Indians across the plains of North America.
10. Hyper-space Event (Quantum Dynamics) - Nil Moments: Singularity or Hyperspace Events where the Earth and Solar System are swallowed up by a rogue Black Hole, or the dimensional fabric of the whole Universe is ripped apart when two Membranes (Universes) collide in hyperspace and one dimension set is subsumed into the other - do they merge into one large multi-dimensional Membrane and split up into two new Membranes?
Recent Historic Wild card Events
Wild Card Event | Surprise | Impact | Type | Trigger
Tay Bridge disaster (1879) - railway bridge collapsed during a violent storm whilst a passenger train was passing across | High | Medium | Bridge Design | Wind
Tacoma Narrows bridge collapse (1940) - road bridge collapsed in a moderate wind due to aeroelastic flutter | High | Low | Bridge Design | Wind
Flixborough Chemical Works disaster (1974) - cyclohexane chemical leak resulting in a hydrocarbon vapour cloud explosion | High | Medium | Health & Safety | Equipment Failure
Chernobyl nuclear disaster (1986) - safety systems shut down for a technical exercise on the turbine generator; core meltdown | High | High | Health & Safety | Human Error
World Trade Centre bombing (1993) - terrorist group activity | High | Medium | Security | Terrorism
World Trade Centre (2001) - Al Qaida terrorist group activity | High | High | Security | Terrorism
Buncefield storage depot (2005) - undetected oil fuel leak ignited, resulting in a hydrocarbon vapour cloud explosion | High | Medium | Health & Safety | Equipment Failure
Texas City oil refinery explosion (2005) - hydrocarbon cloud accumulation from a fuel leak, resulting in a vapour explosion | High | Medium | Health & Safety | Equipment Failure
Gulf of Mexico oil rig explosion (2010) - high-pressure methane blow-back during deep-water drilling, resulting in an explosion | High | High | Health & Safety | Human Error
Mumbai Taj Mahal Hotel attack (2008) - Lashkar-e-Taiba terrorist group activity | High | Medium | Security | Terrorism
Nairobi shopping mall attack (2013) - Al Shabab terrorist group activity | High | Medium | Security | Terrorism
Black Swan Events
Definition of a Black Swan Event
Black Swan Events are unforeseen, sudden and extreme change events or Global-level transformations in either the military, political, social, economic or environmental landscape. Black Swan Events have an inordinately low probability of occurrence - coupled with an extraordinarily high impact when they do occur (Nassim Taleb).
[Diagram: Black Swan Event Cluster or Storm - a chain of triggers: USA Sub-Prime Mortgage Crisis -> Credit Crisis -> Global Recession -> Sovereign Debt Crisis.]
Black Swan Events
The phrase Black Swan is a metaphor describing an unusual and rare random event which is totally unanticipated (perhaps because it seemed impossible, or because no-one had considered it before) - and which has extreme and far-reaching consequences. The term is also often used as a descriptive adjective - as in the expression black-swan event.
1. SHOCK - Black Swan Events are a complete and totally unexpected shock to the observer
- the scale of the event falling well outside the bounds of any prior expectations.
2. SEVERE - Black Swan Events have a severe impact, even a historical significance, as a
catalyst of massive change - or as an agent bringing severe global transformation.
3. SUDDEN - Black Swan Events appear suddenly and unfold with an extraordinary pace.
5. PARADOX - Black Swan Events are rationalised by hindsight, as at their first appearance
they could or should have been foreseen had the relevant Weak Signals been available
and detected in the background noise, identified correctly, analysed and accounted for.
Black Swan Events
Definition of Black Swan Event
Black Swan Events are any unforeseen, sudden and extreme random events - agents and catalysts of massive change - or Global-level transformation scenarios which occur within the military, political, social, economic, cultural or environmental landscape, having an inordinately low probability of occurrence - coupled with an extraordinarily high impact when they do occur (Nassim Taleb).
Black Swan Events
Black Swan events are typically random and unexpected - characterized by
three main criteria: first, they are surprising, falling outside the realm of usual
expectation; second, they have a major effect (sometimes of historical or
geopolitical significance); and third, with the benefit of hindsight they are often
rationalized as something that could, should or would have been foreseen -
had all of the facts been available and examined carefully enough.
1. Black Swan events are surprising, falling well outside the realm of usual
experience or expectation.
2. Black Swan events have a sudden and severe impact (sometimes of far-
reaching and historic global significance).
3. Black Swan events might have been foreseen - as viewed through the retrospective hindsight of Causal Layered Analysis (CLA) processes, back-casting and back-sight (the reverse of forecasting and foresight).
Black Swan Event Definition
Black Swan Event Features
Black Swan Event is a common expression or metaphor describing an extraordinarily rare and unusual random event which is totally unanticipated (perhaps because it seemed impossible, or because no-one had ever considered it before) and which has extreme and far-reaching impact, consequences and effects. The term is also often used as a descriptive adjective - as in the phrase black-swan event. The Black Swan metaphor refers to those extreme events which are so chaotic that they are both unknown and unknowable (the Hawking Paradox) - the unknown unknowns, those events which are impossible to anticipate from any analysis of recognised threats and existing risk factors.
1. SUDDEN - Black Swan Events appear suddenly and unfold at an extraordinarily rapid pace - the impact, scale and consequences of the event falling well outside the bounds of any prior expectations.
2. SEVERE - Black Swan Events have a massively severe impact, even a historical significance, as both a catalyst and agent of extreme and far-reaching impact - bringing massive global transformation and change.
3. SHOCK - Black Swan Events are extraordinarily unusual and rare random and chaotic phenomena - which come as a complete and totally unforeseen shock to the observer.
4. SURPRISE - A Black Swan Event is a totally unexpected and unanticipated surprise to the observer.
Black Swan Event Characteristics
1. DICHOTOMY - If all the relevant knowledge in the period leading up to the Black Swan Event had been readily available, and if all of those Weak Signals, Strong Signals and Wild Cards in the background noise had been detected at their first appearance - then subsequently identified, analysed and interpreted - then that Black Swan Event could, should or would have been anticipated or foreseen and correctly accounted for.
2. PARADOX - Any further Black Swan Events which are subsequently experienced still remain totally unexpected shocks and surprises - despite the recent deep impact of the previous Black Swan cluster.
In recent years, war, terrorism and insecurity - and the resultant global economic instability - form the major Human Impact context in which the term Black Swan Event occurs, especially in reference to the resulting geopolitical chaos, social disorder, economic disruption and financial turmoil. In their stated aim to drain the kafirs (infidels) of blood and treasure, as well as attracting disenfranchised Muslims with the dream of establishing a Caliphate, fundamentalist Sunni Wahhabi terrorist groups such as the Taliban, al Qaeda, al-Shabaab and ISIS have been surprisingly successful.
Extinction-level Black Swan Event Types
Fiscal Black Swan Event Types
Type / Force / Fiscal Black Swan Event
1. Oil-Price Shock (Market Forces) - Economic cycles and the global recessions that followed have been tightly coupled with the price of oil since the Oil Price Shocks of the 1970s. In the 1980s, spurred on by these events, economists analysed the relationship between the price of oil and economic output in a number of econometric studies, demonstrating a positive correlation in the US and other industrial countries between oil prices and industrial output. The Oil Price Shocks of 1990 and 2008 had a relatively lower impact on the global economy.
2. Money Supply Shock (Market Forces) - Contemporary fiscal models for the demand and supply of money are either inconsistent with the adjustment of price levels to expected changes in the nominal money supply - or demonstrate implausible fluctuations in interest rates in response to unexpected changes in the nominal money supply.
A group of recently identified grey swans in the financial domain is the so-
called fiscal cliff, , a cocktail of tax increases and spending cuts disastrous for
Western economies against a background of growing demand for increased
spending on education, social security, healthcare, law and order, national security
and defence combat the activities of the influence of the enemy within as well
as the enemy without which could be disastrous for the US geopolitical status
quo, the economy and society in general.
As an example, the previously highly successful hedge fund Long Term Capital
Management (LTCM) was forced into bankruptcy as a result of the ripple effect
caused by the Russian government's debt default. The Russian government's
default represents a Black Swan Event - because none of LTCM's Risk managers
or their computer models could have reasonably predicted this event, nor any of
the Event's subsequent unforeseen impacts, consequences and effects.
Natural Black Swan Event Types
Environment Scanning for Natural Black Swan Event Types
The other major global context in which the term Black Swan Event has
been strongly linked in recent times is that of Natural Disasters - for example:
drought, flooding, earthquakes, extreme storms, tsunamis and volcanic
eruptions: -
Greater London covers 600 square miles. Up until the 17th century, however,
the capital city was crammed largely into a single square mile which today is
marked by the skyscrapers which are a feature of the financial district of the City.
This visualisation, originally created for the Almost Lost exhibition by the Bartlett
Centre for Advanced Spatial Analysis (CASA), explores the historic evolution of
the city by plotting a timeline of the development of the road network - along with
documented buildings and other features through 4D geospatial analysis of a
vast number of diverse geographic, archaeological and historic data sets.
Unlike other historical cities such as Athens or Rome, with an obvious patchwork
of districts from different periods, London's individual structures, scheduled sites
and listed buildings were in many cases constructed gradually, from parts assembled
during different periods. Researchers who have tried previously to locate and
document archaeological structures and research historic references will know
that these features, when plotted, appear scrambled up like pieces of different
jigsaw puzzles all scattered across the contemporary London cityscape.
History of Digital Epidemiology
Doctor John Snow (15 March 1813 - 16 June 1858) was an English physician and a
leading figure in the adoption of anaesthesia
and medical hygiene. John Snow is largely
credited with sparking and pursuing a total
transformation in Public Health and epidemic
disease management and is considered one
of the fathers of modern epidemiology in part
because of his work in tracing the source of
a cholera outbreak in Soho, London, in 1854.
The current focus in epidemiology is on the known unknowns - factors with which we are
familiar in the pandemic risk assessment processes. These risk processes cover, for
example, monitoring the course of the pandemic, estimating the most affected age groups,
and assessing population-level clinical and pharmaceutical interventions. This section
looks for the unknown unknowns - factors for which evidence is lacking or silent, and
of which we have only limited or weak understanding in the pandemic risk assessment
processes. Pandemic risk assessment shows that any developing, newly emerging, or
sudden and unpredictable change in the pandemic situation does not accumulate a
robust body of evidence for decision-making. These uncertainties may be conceptualised as unknown
unknowns, or silent evidence. Historical and archaeological pandemic studies indicate
that there may well have been evidence that was not discovered, known or recognised.
This section looks at a new method to discover silent evidence - unknown factors - that
affect pandemic risk assessment - by focusing on the tension under pressure that impacts
upon the actions of key decision-makers in the pandemic risk decision-making process.
Antonine Plague (Smallpox) AD 165-180
Pandemic Black Swan Events
Black Swan Pandemic | Type / Location | Impact | Date
Malaria | pathogen for the entirety of human history | kills more humans than any other disease | 20 kya - present
Smallpox (Antonine Plague) | Smallpox, Roman Empire / Italy | Smallpox is the 2nd worst killer | 165-180
Black Death (Plague of Justinian) | Bubonic Plague, Roman Empire | 50 million people died | 6th century
Black Death (Late Middle Ages) | Bubonic Plague, Europe | 75 to 200 million people died | 1340-1400
Smallpox | Amazonian Basin Indians | 90% of Amazonian Indians died | 16th century
Tuberculosis | Western Europe | 900 deaths per 100,000 pop. | 18th-19th century
Syphilis | Global pandemic, invariably fatal | 10% of Victorian men carriers | 19th century
1st Cholera Pandemic | Global pandemic | Started in the Bay of Bengal | 1817-1823
Smallpox | Global pandemic | 300 million people died in the 20th c | Eliminated 20th c
Poliomyelitis | Global pandemic | Contracted by up to 500,000 persons per year | 1950s-1960s
AIDS | Global pandemic, mostly fatal | 10% of Sub-Saharans are carriers | Late 20th century
Ebola | West African epidemic, 50% fatal | Sub-Saharan Africa epicentre | Late 20th century
For the entirety of human history, Malaria has
been the most lethal pathogen to attack man
Pandemic Black Swan Event Types
Type 1 - Malaria (Parasitic Biological Disease): The Malaria pathogen has killed
more humans than any other disease. Human malaria most likely originated in Africa
and has coevolved along with its hosts, mosquitoes and non-human primates. The
first evidence of malaria parasites
was found in mosquitoes preserved in amber from the Palaeogene period that
are approximately 30 million years old. Malaria may have been a human
pathogen for the entire history of the species. Humans may have originally
caught Plasmodium falciparum from gorillas. About 10,000 years ago, a period
which coincides with the development of agriculture (Neolithic revolution) -
malaria started having a major impact on human survival. A consequence was
natural selection for sickle-cell disease, thalassaemias, glucose-6-phosphate
dehydrogenase deficiency, ovalocytosis, elliptocytosis and loss of the Gerbich
antigen (glycophorin C) and the Duffy antigen on erythrocytes because such
blood disorders confer a selective advantage against malaria infection (balancing
selection). The first known description of malaria dates back 4,000 years to 2700
BC in China, where ancient writings refer to symptoms now commonly associated
with malaria. Early malaria treatments were first developed in China from the
Qinghao plant, which contains the active ingredient artemisinin, re-discovered
and still used in anti-malaria drugs today. Largely overlooked by researchers is
the role of disease and epidemics in the fall of Rome. Three major types of
inherited genetic resistance to malaria (sickle-cell disease, thalassaemias, and
glucose-6-phosphate dehydrogenase deficiency) were all present in the
Mediterranean world 2,000 years ago, at the time of the Roman Empire.
Pandemic Black Swan Event Types
Type 2 - Smallpox (Viral Biological Disease): The history of smallpox holds a
unique place in medical history. One of the deadliest viral diseases known to man,
it is the first disease to be treated by vaccination - and also the only disease
to have been eradicated from the
face of the earth by vaccination. Smallpox plagued human populations for
thousands of years. Researchers who examined the mummy of Egyptian
pharaoh Ramses V (died 1157 BCE) observed scarring similar to that from
smallpox on his remains. Ancient Sanskrit medical texts, dating from about
1500 BCE, describe a smallpox-like illness. Smallpox was most likely
present in Europe by about 300 CE. although there are no unequivocal
records of smallpox in Europe before the 6th century CE. It has been
suggested that it was a major component of the Plague of Athens that
occurred in 430 BCE, during the Peloponnesian Wars, and was described
by Thucydides. A recent analysis of the description of clinical features
provided by Galen during the Antonine Plague that swept through the
Roman Empire and Italy in 165-180, indicates that the probable cause was
smallpox. In 1796, after noting Smallpox immunity amongst milkmaids
Edward Jenner carried out his now famous experiment on eight-year-old
James Phipps, using Cow Pox as a vaccine to confer immunity to Smallpox.
Some estimates indicate that 20th century worldwide deaths from smallpox
numbered more than 300 million. The last known case of wild smallpox occurred
in Somalia in 1977, and the disease was declared eradicated in 1980.
Pandemic Black Swan Event Types
Type 3 - Bubonic Plague (Bacterial Biological Disease): The Bubonic Plague or
Black Death was one of the most devastating pandemics in human history, killing
an estimated 75 to 200 million people and peaking in Europe in the years
1348-50 CE. The Bubonic Plague is a
bacterial disease spread by fleas carried by Asian Black Rats - which
originated in or near China and then travelled to Italy, overland along the Silk
Road, or by sea along the Silk Route. From Italy the Black Death spread
onwards through other European countries. Research published in 2002
suggests that the Black Death began in the spring of 1346 in the Russian
steppe region, where a plague reservoir stretched from the north-western
shore of the Caspian Sea into southern Russia. Although there were
several competing theories as to the etiology of the Black Death, analysis of
DNA from victims in northern and southern Europe published in 2010 and
2011 indicates that the pathogen responsible was the Yersinia pestis
bacterium, possibly causing several forms of plague. The first recorded
epidemic ravaged the Byzantine Empire during the sixth century, and was
named the Plague of Justinian after emperor Justinian I, who was infected
but survived through extensive treatment. The epidemic is estimated to have
killed approximately 50 million people in the Roman Empire alone. During
the Late Middle Ages (1340-1400) Europe experienced the most deadly
disease outbreak in history when the Black Death, the infamous pandemic
of bubonic plague, peaked in 1347, killing one third of the human population.
Pandemic Black Swan Event Types
Type 4 - Syphilis (Bacterial Biological Disease): The exact origin of syphilis is
unknown. There are two primary hypotheses: one proposes that syphilis was carried
from the Americas to Europe by the crew of Christopher Columbus; the other
proposes that
syphilis previously existed in Europe but went unrecognized. These are
referred to as the "Columbian" and "pre-Columbian" hypotheses. In late 2011
newly published evidence suggested that the Columbian hypothesis is valid.
The appearance of syphilis in Europe at the end of the 1400s heralded
decades of death as the disease raged across the continent. The first
evidence of an outbreak of syphilis in Europe was recorded in 1494/1495
in Naples, Italy, during a French invasion. First spread by returning French
troops, the disease was known as the French disease, and it was not until
1530 that the term "syphilis" was first applied by the Italian physician and
poet Girolamo Fracastoro. By the 1800s it had become endemic, carried by
as many as 10% of men in some areas - in late Victorian London this may
have been as high as 20%. Invariably fatal, associated with extramarital sex
and prostitution, syphilis was accompanied by enormous social stigma. The
secretive nature of syphilis helped it spread - disgrace was such that many
sufferers hid their symptoms, while others carrying the latent form of the
disease were unaware they even had it. Treponema pallidum, the syphilis
causal organism, was first identified by Fritz Schaudinn and Erich Hoffmann
in 1905. The first effective treatment (Salvarsan) was developed in 1910
by Paul Ehrlich which was followed by the introduction of penicillin in 1943.
Pandemic Black Swan Event Types
Type 6 - Cholera (Bacterial Biological Disease): Cholera is a severe infection of
the small intestine caused by the bacterium Vibrio cholerae, contracted by drinking
water or eating food contaminated with the bacterium. Cholera symptoms include
profuse watery diarrhoea and
vomiting. The primary danger posed by cholera is severe dehydration, which
can lead to rapid death. Cholera can now be treated with re-hydration and
prevented by vaccination. Cholera outbreaks in recorded history have
indeed been explosive and the global proliferation of the disease is seen by
most scholars to have occurred in six separate pandemics, with the seventh
pandemic still rampant in many developing countries around the world. The
first recorded instance of cholera was described in 1563 in an Indian medical
report. In modern times, the story of the disease begins in 1817 when it
spread from its ancient homeland of the Ganges Delta in the bay of Bengal
in North East India - to the rest of the world. The first cholera pandemic
raged from 1817-1823, the second from 1826-1837. The disease reached
Britain during October 1831 - and finally arrived in London in 1832 (13,000
deaths) with subsequent major outbreaks in 1841, 1848 (21,000 deaths)
1854 (15,000 deaths) and 1866. By studying the outbreak centred around the
Broad Street well in 1854, surgeon John Snow traced the source of cholera to
drinking water contaminated by infected human faeces - ending the miasma, or
bad air, theory of cholera transmission.
Pandemic Black Swan Event Types
Type 7 - Poliomyelitis (Viral Biological Disease): The history of poliomyelitis
(polio) infections extends into prehistory. Ancient Egyptian paintings and
carvings depict otherwise healthy people with withered limbs, and children
walking with canes at a young age. It is
theorized that the Roman Emperor Claudius was stricken as a child, and this
caused him to walk with a limp for the rest of his life. Perhaps the earliest
recorded case of poliomyelitis is that of Sir Walter Scott. At the time, polio
was not known to medicine. In 1773 Scott was said to have developed "a
severe teething fever which deprived him of the power of his right leg." The
symptoms of poliomyelitis have been described as: Dental Paralysis,
Infantile Spinal Paralysis, Essential Paralysis of Children, Regressive
Paralysis, Myelitis of the Anterior Horns and Paralysis of the Morning.
In 1789 the first clinical description of poliomyelitis was provided by the
British physician Michael Underwood as "a debility of the lower extremities".
Although major polio epidemics were unknown before the 20th century, the
disease has caused paralysis and death for much of human history. Over
millennia, polio survived quietly as an endemic pathogen until the 1880s
when major epidemics began to occur in Europe; soon after, widespread
epidemics appeared in the United States. By 1910, frequent epidemics
became regular events throughout the developed world, primarily in cities
during the summer months. At its peak in the 1940s and 1950s, polio would
maim, paralyse or kill over half a million people worldwide every year.
Pandemic Black Swan Event Types
Type 8 - Typhoid (Bacterial Biological Disease): Typhoid fever is an acute illness
associated with a high fever that is most often caused by the Salmonella typhi
bacterium. Typhoid may also be caused by Salmonella paratyphi, a related
bacterium that usually leads to a
less severe illness. The bacteria are spread via deposition in water or food
by a human carrier. An estimated 16-33 million cases of typhoid fever occur
annually. Its incidence is highest in children and young adults between 5 and
19 years old. As of 2010 these cases caused about 190,000 deaths, up from
137,000 in 1990. Historically, in the pre-antibiotic era, the case fatality rate of
typhoid fever was 10-20%. Today, with prompt treatment, it is less than 1%.
Type 9 - Dysentery (Bacterial / Parasitic Biological Disease): Dysentery (the Flux,
or the bloody flux) is a form of gastroenteritis: an inflammatory disorder of the
intestine, especially of the colon, resulting in severe diarrhoea containing blood
and mucus in the faeces, accompanied by fever, abdominal pain and rectal tenesmus
(a feeling of incomplete defecation), caused by any kind of gastric infection.
Conservative estimates suggest
that 90 million cases of Bacterial Dysentery (Shigellosis) are contracted
annually, killing at least 100,000. Amoebic Dysentery (Amebiasis) infects
some 50 million people each year, with over 50,000 cases resulting in death.
Pandemic Black Swan Event Types
Type 10 - Spanish Flu (Viral Biological Disease): In the United States, the
Spanish Flu was first observed in Haskell County, Kansas, in January 1918,
prompting a local doctor, Loring Miner, to warn the U.S. Public Health Service's
academic journal. On 4th March 1918, army cook
Albert Gitchell reported sick at Fort Riley, Kansas. A week later on 11th March
1918, over 100 soldiers were in hospital and the Spanish Flu virus had now
reached Queens New York. Within days, 522 men had reported sick at the
army camp. In August 1918, a more virulent strain appeared simultaneously
in Brest, Brittany, France; in Freetown, Sierra Leone; and in the U.S., in Boston,
Massachusetts. It is estimated that in 1918, between 20-40% of the world's
population became infected by Spanish Flu - with 50 million deaths globally.
Type 11 - HIV / AIDS (Viral Biological Disease): AIDS was first reported in
America in 1981 and provoked reactions which echoed those long associated with
syphilis. Many of the earliest cases were among homosexual men - creating a
climate of prejudice and moral
panic. Fear of catching this new and terrifying disease was also widespread
among the public. The observed time-lag between contracting HIV and the
onset of AIDS, coupled with new drug treatments, changed perceptions.
Increasingly it was seen as a chronic but manageable disease. The global
story was very different - by the mid-1980s it became clear that the virus had
spread, largely unnoticed, throughout the rest of the world. The nature of this
global pandemic varies from region to region, with poorer areas hit hardest. In
parts of sub-Saharan Africa nearly 1 in 10 adults carries the virus - a statistic
which is reminiscent of the spread of syphilis in parts of Europe in the 1800s.
Pandemic Black Swan Event Types
Type 12 - Ebola (Haemorrhagic Viral Biological Disease): Ebola is a highly lethal
haemorrhagic viral disease which has caused at least 16 confirmed outbreaks in
Africa between 1976 and 2015. Ebola Virus Disease (EVD) is found in wild great
apes and kills up to 90% of
humans infected - making it one of the deadliest diseases known to man. It is
so dangerous that it is considered to be a potential Grade A bioterrorism agent
on a par with anthrax, smallpox, and bubonic plague. The current outbreak
of EVD has seen confirmed cases in Guinea, Liberia and Sierra Leone,
countries in an area of West Africa where the disease has not previously
occurred. There were also a handful of suspected cases in neighbouring Mali,
but these patients were found to have contracted other diseases.
Type 13 - Future Bacterial Pandemic (Bacterial Biological Disease Infections):
Bacteria were most likely the real killers in the 1918 Flu Pandemic - the vast
majority of deaths in the 1918-1919 influenza pandemic resulted directly from
secondary bacterial pneumonia, caused by common upper respiratory-tract
bacteria. Less substantial data from the subsequent 1957 and 1968 Flu
pandemics are consistent with these findings. If severe pandemic influenza is
largely a problem of viral-bacterial co-pathogenesis, pandemic planning needs
to go beyond addressing the viral cause alone (influenza vaccines and
antiviral drugs). The diagnosis, prophylaxis, treatment and prevention of
secondary bacterial pneumonia - as well as stockpiling of antibiotics and
bacterial vaccines should be high priorities for future pandemic planning.
Type 14 - Future Viral Pandemic (Viral Biological Disease Infections): What was
learned from reconstructing the 1918 Spanish Flu virus? Comparing pandemic
H1N1 influenza viruses at the molecular level yields key insights into
pathogenesis - the way animal viruses mutate to cross species. The availability
of these two H1N1 virus genomes, separated by over 90 years,
provided an unparalleled opportunity to study and recognise genetic properties
associated with virulent pandemic viruses - allowing for a comprehensive
assessment of emerging influenza viruses with human pandemic potential.
Only four to six mutations, arising within the first three days of viral infection
in a new human host, are required for an animal virus to become highly
virulent and infectious to human beings. Candidate viral gene pools for future
possible Human Pandemics include Anthrax, Lassa Fever, Rift Valley Fever,
SARS, MERS, H1N1 Swine Flu (2009) and H7N9 Avian / Bat Flu (2013).
Complex Systems and
Chaos Theory
Complex Systems and Chaos Theory has been used
extensively in the field of Futures Studies, Strategic
Management, Natural Sciences and Behavioural
Science. It is applied in these domains to understand
how individuals within populations, societies,
economies and states act as a collection of loosely
coupled interacting systems which adapt to changing
environmental factors and random events - bio-ecological, socio-economic or
geo-political.
Complex Systems and Chaos Theory
One of the problems in addressing complexity issues has always been distinguishing between
the large number of elements (components) and relationships (interactions) evident in chaotic
(unconstrained) systems - Chaos Theory - and the still large, but significantly smaller number
of both elements and interactions found in ordered (constrained) Complex Systems.
Orderly System Frameworks tend to dramatically reduce the total number of elements and
interactions with fewer and smaller classes of more uniform elements and with reduced and
sparser regimes of more restricted relationships featuring more highly-ordered, better internally
correlated and constrained interactions as compared with Disorderly System Frameworks.
The discovery of Chaos and Complexity has increased our understanding of the Cosmos and its
effect on us. Surf the chaos content regions of the internet and you will invariably encounter
the term "strange attractor". The influences of such attractors can take some time to manifest
themselves, and they can be small to the point of invisibility - how tiny can an influence be
and still have any effect? This is captured in the butterfly scenario described below.
Complex Systems and Chaos Theory
Weaver (Complexity Theory) along with Gleick and Lorenz (Chaos Theory) have
given us some of the tools that we need to understand these complex, interrelated
chaotic and radically disruptive political, economic and social events such as the
collapse of Global markets and the various protests against this - using Event
Decomposition, Complexity Mapping, and Statistical Analysis to help us identify
patterns, extrapolations, scenarios and trends unfolding as seemingly unrelated,
random and chaotic events. The Hawking Paradox, however, challenges this view of
Complex Systems by postulating that uncertainty dominates complex, chaotic systems
to such an extent that future outcomes are both unknown - and unknowable.
Complex Adaptive Systems are further contrasted with other ordered and
chaotic systems by the relationship that exists between the system and the agents and
catalysts of change which act upon it. In an ordered system the level of constraint means
that all agent behaviour is limited to the rules of the system. In a chaotic system these
agents are unconstrained and are capable of random events, uncertainty and disruption.
In a CAS, both the system and the agents co-evolve together: the system acts to
lightly constrain the agents' behaviour, while the agents of change modify the
system through their interactions. CAS approaches to behavioural science seek to
understand both the nature of system constraints and change-agent interactions,
and generally take an evolutionary or naturalistic approach to crowd scenario
planning and impact analysis.
Linear and Non-linear Systems
Linear Systems - all system outputs are directly and proportionally related to system inputs.
Types of linear algebraic function behaviours; examples of Simple Systems include: -
Game Theory and Lanchester Theory
Civilisations and SIM City Games
Drake Equation (SETI) for Galactic Civilisations
Non-linear Systems - system outputs are asymmetric and not proportional or related to inputs.
Types of non-linear algebraic function behaviours; examples of Complex / Chaotic Systems are: -
Complex Systems - large numbers of elements with both symmetric and asymmetric relationships
Complex Adaptive Systems (CAS) - co-dependency and co-evolution with external systems
Multi-stability - alternates between multiple exclusive states (lift status = going up, down or static)
Chaotic Systems
Classical chaos - the behaviour of a chaotic system cannot be predicted
A-periodic oscillations - functions that do not repeat values after a certain period (# of cycles)
Solitons - self-reinforcing solitary waves, due to feedback by forces within the same system
Amplitude death - any oscillations present in the system cease after a certain period (# of cycles),
due to feedback by forces in the same system - or some kind of interaction with external systems
Navier-Stokes Equations for the motion of a fluid: -
Weather Forecasting
Plate Tectonics and Continental Drift
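The unpredictability of "classical chaos" above can be made concrete with the logistic map, a standard textbook chaotic system (an assumed illustration, not an example from this text): two trajectories that start a billionth apart diverge completely within a few dozen steps.

```python
# Sensitive dependence on initial conditions ("classical chaos"), illustrated
# with the logistic map x' = r*x*(1-x) in its chaotic regime (r = 4.0).
# Two trajectories a billionth apart diverge completely within ~30 steps.

def logistic_trajectory(x0, r=4.0, steps=50):
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_trajectory(0.400000000)
b = logistic_trajectory(0.400000001)   # perturbed by one part in a billion

# Track how the gap between the two trajectories grows.
gaps = [abs(x - y) for x, y in zip(a, b)]
print(f"initial gap: {gaps[0]:.1e}, largest gap over 50 steps: {max(gaps):.2f}")
```

The gap roughly doubles every step (the map's Lyapunov exponent is ln 2 at r = 4), so any measurement error in the initial state swamps the forecast after a few dozen iterations - which is why chaotic system behaviour cannot be predicted far ahead.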
System Complexity
System Complexity is typically characterised by the number of elements in a system,
the number of interactions between those elements and the nature (type) of interactions.
One of the problems in addressing complexity issues has always been distinguishing
between the large number of elements and relationships, or interactions evident in
chaotic (disruptive, unconstrained) systems - and the still large, but significantly smaller
number of elements and interactions found in ordered (constrained) systems.
Disorderly (unconstrained) System Frameworks tend to have a very large total
number of non-uniform elements featuring complex (non-linear, asymmetric)
interactions, which may be organised into many classes and regimes. Disorderly
(unconstrained) System Frameworks feature a greater number of more disordered,
uncorrelated and unconstrained element interactions with implicit or random rules,
which tend to exhibit unpredictable, random, chaotic and disruptive system
behaviour and create surprises.
Complexity Map
Complex Systems and Chaos Theory
A system may be defined as simple or linear whenever its evolution is not
sensitively dependent on its initial conditions. It may also be described as
deterministic whenever the behaviour of a simple (linear) system can be accurately
predicted and all of the observable system outputs are directly and proportionally
related to system inputs. We can expect smooth, linear, highly predictable
outcomes from simple systems which are driven by linear algebraic functions.
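The "directly and proportionally related" property is the superposition principle, and it can be checked numerically. The sketch below is an illustrative spot check (not an exhaustive proof) on two hypothetical maps of my own choosing:

```python
# A system is linear exactly when it obeys superposition:
#     f(a*x + b*y) == a*f(x) + b*f(y)   for scalars a, b and inputs x, y.
# Illustrative numerical spot check on two hypothetical one-dimensional maps.

def obeys_superposition(f, samples, tol=1e-9):
    return all(abs(f(a * x + b * y) - (a * f(x) + b * f(y))) <= tol
               for a, b, x, y in samples)

linear = lambda x: 3.0 * x                  # output proportional to input
nonlinear = lambda x: 4.0 * x * (1.0 - x)   # logistic-style non-linear map

samples = [(2.0, -1.0, 0.3, 0.7), (0.5, 0.5, 0.1, 0.9)]
print(obeys_superposition(linear, samples))     # True: proportional response
print(obeys_superposition(nonlinear, samples))  # False: superposition fails
```

Any map that passes this test for all inputs responds smoothly and predictably to combined inputs; any map that fails it, like the logistic example, is non-linear and a candidate for the complex or chaotic behaviour described above.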
The Control of Chaos refers to a process where a tiny external system influence is
applied to a chaotic system, so as to slightly vary system conditions in order to achieve
a desirable and predictable (periodic or stationary) outcome. To synchronise and resolve
chaotic system behaviour we may invoke external procedures for stabilizing chaos which
interact with symbolic sequences of an embedded chaotic attractor - thus influencing
chaotic trajectories. The major concepts involved in the Control of Chaos, are described
by two methods the Ott-Grebogi-Yorke (OGY) Method and the Adaptive Method.
The Adaptive Method for the resolution of Complex, Chaotic Systems introduces multiple
relatively simple and loosely coupled interacting systems in an attempt to model over time
the behaviour of a single, large Complex and Chaotic System - which may still be subject
to undetermined external influences, thus creating random system effects.
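As a toy illustration of the OGY idea of tiny, targeted perturbations (my own sketch under simplifying assumptions, not the full method), the unstable fixed point of the chaotic logistic map can be pinned by small, state-dependent nudges to the parameter r:

```python
# Toy sketch of the Ott-Grebogi-Yorke (OGY) idea: stabilise the unstable
# fixed point of the chaotic logistic map x' = r*x*(1-x) using only tiny
# perturbations of r. (An assumed illustration, not from the source.)

R = 3.9                        # nominal parameter: chaotic regime
X_STAR = 1.0 - 1.0 / R         # fixed point of the map (unstable here)
LAM = 2.0 - R                  # local slope f'(x*) = r*(1 - 2*x*); |LAM| > 1
G = X_STAR * (1.0 - X_STAR)    # sensitivity df/dr evaluated at the fixed point
DELTA_MAX = 0.05               # cap: only small parameter nudges are allowed

def step(x, control):
    delta_r = 0.0
    if control:
        # Pick delta_r so the linearised next deviation from x* cancels out.
        delta_r = max(-DELTA_MAX, min(DELTA_MAX, -LAM * (x - X_STAR) / G))
    return (R + delta_r) * x * (1.0 - x)

def run(x0, steps=50, control=True):
    xs = [x0]
    for _ in range(steps):
        xs.append(step(xs[-1], control))
    return xs

controlled = run(X_STAR + 0.004, control=True)   # stays pinned near x*
free = run(X_STAR + 0.004, control=False)        # escapes into chaos
print(abs(controlled[-1] - X_STAR), max(abs(x - X_STAR) for x in free))
```

With the controller on, each nudge cancels the linear part of the deviation, so the trajectory collapses onto the fixed point; with it off, the same starting point wanders off into the chaotic attractor - a desirable, stationary outcome achieved with only tiny external influence, as the Control of Chaos paragraph describes.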
Wave-form Analytics
WAVE-FORM ANALYTICS is an analytical tool based on Time-frequency Wave-
form analysis which has been borrowed from spectral wave frequency analysis in
Physics. Deploying the Wigner-Gabor-Qian (WGQ) spectrogram a method which
exploits wave frequency and time symmetry principles demonstrates a distinct trend
forecasting and analysis capability in Wave-form Analytics. Trend-cycle wave-form
decomposition is a critical technique for testing the validity of multiple (compound)
dynamic wave-series models competing in a complex array of interacting and
interdependent cyclic systems - waves driven by both deterministic (human actions)
and stochastic (random, chaotic) paradigms in the study of complex cyclic phenomena.
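Trend-cycle decomposition can be sketched in miniature (an assumed example of the general technique, not the Wigner-Gabor-Qian spectrogram itself): strip the slow trend with a moving average, then locate the dominant cycle in the residual with a discrete Fourier transform.

```python
# Sketch of trend-cycle wave-form decomposition on a synthetic series:
# linear trend plus a 12-step cycle. (Assumed illustration, not the WGQ method.)
import cmath
import math

N = 120
series = [0.02 * t + math.sin(2 * math.pi * t / 12.0) for t in range(N)]

# 1) Estimate the trend with a centred moving average exactly one cycle wide,
#    so the cycle averages out of the trend estimate.
w = 12
trend = [sum(series[t - w // 2 : t + w // 2]) / w for t in range(w // 2, N - w // 2)]
cycle = [series[w // 2 + i] - trend[i] for i in range(len(trend))]

# 2) DFT of the detrended residual; the strongest bin gives the cycle length.
M = len(cycle)
power = [abs(sum(cycle[t] * cmath.exp(-2j * math.pi * k * t / M) for t in range(M)))
         for k in range(1, M // 2)]
k_best = 1 + power.index(max(power))
print("dominant period:", M / k_best, "steps")   # recovers the 12-step cycle
```

Separating the deterministic trend from the cyclic component in this way is what allows competing wave-series models to be tested against the same data, each wave examined at its own frequency.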
Complex Adaptive Systems
Biological, Sociological, Economic and Political systems all tend to demonstrate
Complex Adaptive System (CAS) behaviour - which appears to be more similar
in nature to biological behaviour in a population than to truly Disorderly, Chaotic,
Stochastic Systems (Random Systems). For example, the remarkable long-term
adaptability, stability and resilience of market economies may be demonstrated by
the impact of Black Swan Events causing stock market crashes - such as the oil price
shocks (1970-72) and credit supply shocks (1927-1929 and 2008 onwards) - and by
the ability of financial markets to rapidly absorb and recover from these events.
Unexpected and surprising Cycle Pattern changes have historically occurred during
regional and global conflicts fuelled by technology innovation-driven arms races -
and also during US Republican administrations (Reagan and Bush - why?).
Just as advances in electron microscopy have revolutionised the science of biology
- non-stationary time series wave-form analysis has opened up a new space for
Biological, Sociological, Economic and Political system studies and diagnostics.
Event Complexity Map
Crowd Behaviour 1 the Swarm
An example of Random Clustering is a Crowd or Swarm in Social Animals -
Insects (locusts), Birds (starlings), Mammals (lemmings) - and Human Beings.
There are various forces which contribute towards Crowd Behaviour or
Swarming. In any crowd of human beings or a swarm of animals, individuals in
the crowd or swarm are closely connected so that they share the same mood and
emotions (fear, greed, rage) and demonstrate the same or very similar behaviour
(fight, flee or feeding frenzy). Only the initial few individuals exposed to the
Random Event or incident may at first respond strongly and directly to the initial
trigger stimulus, causal event or incident - an opportunity or threat such as
external predation or aggression, or the discovery of a novel or unexpected opportunity
to satisfy a basic need such as feeding, reproduction or territorialism.
Those individuals who have been directly exposed to the initial trigger event or
incident - the system input or causal event that initiated a specific outbreak of
behaviour in a crowd or swarm - quickly communicate and propagate their
swarm response mechanism, sharing it first with those members of the Crowd
immediately next to them, so that the modified Crowd behaviour quickly spreads
from the periphery or edge of the Crowd.
Crowd Behaviour 2 - the Swarm
In a gathering or crowd of human beings or in a swarm of animals (insect swarm, fish
bait ball, flock of birds, pack of mammals), individuals are so closely connected or
tightly packed that they share the same, or interconnected, mood and emotions (fear,
curiosity, greed, rage) and demonstrate the same - or very similar - patterns of
behaviour (fight, flee or feeding frenzy). Only the initial few individuals at the edge of
the crowd that are exposed to the Causal Stimulus, Event or Incident respond at first
- strongly and directly - to the initial trigger stimulus, causal event or incident
(opportunity or threat such as external predation, aggression or territorialism) - or
discovery of a novel or unexpected opportunity to satisfy and fulfil a basic need
(such as feeding, nesting, roosting or reproduction).
More and more Peripheral Crowd members in turn adopt the Crowd response
behaviour - without having been directly exposed to, or even knowing about, the Swarm
trigger. Members of the crowd or swarm may be oblivious to the initial source or
nature of the trigger stimulus - nonetheless, the common Crowd or Swarm behaviour
response quickly spreads to all of the individuals in or around that crowd or swarm.
Crowd Behaviour 3 - the Swarm
Thus those individuals who have been directly exposed to the initial trigger event or
incident (predation threat, feeding frenzy, roosting etc.) can quickly communicate and
propagate the initial trigger event through their swarm response mechanisms and
share that trigger / response coupling with all the other individuals - beginning with
those members of the Crowd immediately next to them - so that every new, modified
Crowd behaviour quickly spreads from the periphery or edge of the Crowd
throughout the whole Crowd population.
Peripheral Crowd members in turn adopt the Crowd response behaviour without
having been directly exposed to the trigger - the system input or causal event that
initiated a specific outbreak of behaviour in a crowd or swarm. Most members of the
crowd or swarm may be totally oblivious as to the initial source or nature of the
trigger stimulus - nonetheless, the common Crowd behaviour response quickly
spreads to all of the individuals in or around that crowd or swarm.
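The periphery-to-core propagation described above can be sketched as a breadth-first spread over a contact network. This is a minimal, hypothetical model - the crowd, its connections and the trigger are illustrative, not taken from any study cited here:

```python
from collections import deque

def spread(neighbours, triggered):
    """Breadth-first propagation of a trigger response through a crowd.

    neighbours: dict mapping each individual to the individuals next to them.
    triggered: the few individuals directly exposed to the initial stimulus.
    Returns each individual mapped to the 'wave' on which the behaviour
    reached it (0 = directly exposed) - mimicking how a response spreads
    through members who never saw the original trigger.
    """
    wave = {i: 0 for i in triggered}
    queue = deque(triggered)
    while queue:
        current = queue.popleft()
        for nxt in neighbours.get(current, []):
            if nxt not in wave:          # not yet responding
                wave[nxt] = wave[current] + 1
                queue.append(nxt)
    return wave

# A small crowd: only 'a' sees the trigger, yet everyone ends up responding.
crowd = {"a": ["b"], "b": ["a", "c"], "c": ["b", "d"], "d": ["c"]}
print(spread(crowd, ["a"]))  # {'a': 0, 'b': 1, 'c': 2, 'd': 3}
```

The wave number shows how far each responder is from the trigger - individuals at wave 2 or 3 are reacting purely to their neighbours' behaviour.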
The discovery of Chaos and Complexity has increased our understanding of the Cosmos and its effect
on us. If you surf the chaos content regions of the internet, you will invariably encounter terms such as
the "strange attractor". These influences can take some time to manifest themselves, but that is the
nature of the phenomenon. Such differences could be small to the point of invisibility - how tiny can
influences be and still have an effect? This is captured in the butterfly scenario described below.
Complex Systems and Chaos Theory
Weaver (Complexity Theory) along with Gleick and Lorenz (Chaos Theory) have
given us some of the tools that we need to understand these complex, interrelated
chaotic and radically disruptive political, economic and social events such as the
collapse of Global markets and the various protests against this - using Event
Decomposition, Complexity Mapping, and Statistical Analysis to help us identify
patterns, extrapolations, scenarios and trends unfolding as seemingly unrelated,
random and chaotic events. The Hawking Paradox, however, challenges this view of
Complex Systems by postulating that uncertainty dominates complex, chaotic systems
to such an extent that future outcomes are both unknown - and unknowable.
There is an interesting phenomenon called Phase Locking where two loosely coupled
systems with slightly different frequencies show a tendency to move into resonance in order
to harmonise with one another. We also know that the opposite of system convergence -
system divergence - is also possible with phase-locked systems, which can also diverge with
only very tiny inputs - especially if we run those systems in reverse. Thus phase locking
draws two nearly harmonic systems into resonance and gives us the appearance of a
coincidence. There are, however, no coincidences in Physics. Sensitive Dependence in
Complexity Theory also tells us that minute, imperceptible changes to inputs at the initial state
of a system, at the beginning of a cycle, are sufficient to dramatically alter the final state after
even only a few iterations of the system cycle.
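Sensitive Dependence can be demonstrated with the logistic map - a textbook chaotic system, used here purely as an illustration (the map and its parameter are standard choices, not taken from this document). Two initial states differing by one part in a billion diverge completely within a few dozen iterations:

```python
def logistic(x, r=4.0):
    """One iteration of the logistic map x -> r*x*(1-x), a textbook chaotic system."""
    return r * x * (1.0 - x)

def trajectory(x0, steps):
    """Iterate the map 'steps' times from initial state x0."""
    xs = [x0]
    for _ in range(steps):
        xs.append(logistic(xs[-1]))
    return xs

# Two initial states differing by one part in a billion...
a = trajectory(0.200000000, 40)
b = trajectory(0.200000001, 40)
# ...drift apart until the trajectories bear no resemblance to each other.
print(max(abs(x - y) for x, y in zip(a, b)))
```

The imperceptible input difference roughly doubles each cycle, so after about thirty iterations it has grown to the full scale of the system - exactly the "minute changes to inputs at the initial state" effect described above.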
Complex Systems and Chaos Theory
Complex Systems and Chaos Theory has been used extensively in the field of Futures Studies, Strategic
Management, Natural Sciences and Behavioural Science. It is applied in these domains to understand how
individuals or populations, societies and states act as a collection of systems which adapt to changing
environments - bio-ecological, socio-economic or geo-political. The theory treats individuals, crowds and
populations as a collective of pervasive social structures which are influenced by random individual
behaviours such as flocks of birds moving together in flight to avoid collision, shoals of fish forming a bait
ball in response to predation, or groups of individuals coordinating their behaviour in order to exploit novel
and unexpected opportunities which have been discovered or presented to them.
When Systems demonstrate properties of Complex Adaptive Systems (CAS) - which is often defined as
consisting of a small number of relatively simple and loosely connected systems - then they are much more
likely to adapt to their environment and, thus, survive the impact of change and random events. Complexity
Theory thinking has been present in strategic and organisational studies since the first inception of Complex
Adaptive Systems (CAS) as an academic discipline.
Random Event Clustering - Patterns in the Chaos.....
Order out of Chaos - Patterns in the Randomness
From the numerous geological examples found in Nature - including ice-cores, marine sediments and
calcite deposits - we know that Composite Wave Models such as Milankovitch Cycles behave as a
Composite Wave Series with automatic, self-regulating control mechanisms - and demonstrate
Harmonic Resonance and Interference Patterns with extraordinary stability in periodicity through
many system cycles over durations measured in tens of millions of years.
Climatic Change and the fundamental astronomical and climatic cyclic variation frequencies are
coherent, strongly aligned and phase-locked with the predictable orbital variation of 20-100 kyr
Milankovitch Climatic Cycles which have been modelled and measured for many iterations, over a
prolonged period of time, and across many levels of temporal tiers - each tier hosting different types of
geological processes, which in turn influence different layers of Human Activity.
Milankovitch Cycles - precise astronomical cycles with periodicities of 22, 41, 100 and 400 kyr: -
Precession (Polar Wandering) - 22,000-year cycle
Obliquity (Axial Tilt) - 41,000-year cycle
Eccentricity (Orbital Ellipse) - 100,000 and 400,000-year cycles
WAVE THEORY NATURAL CYCLES
Sub-Milankovitch Climatic Cycles
Sub-Milankovitch Climatic Cycles are less well understood - varying from Sun Cycles of 11 years
to Climatic Variation Trends at intervals of up to 1,470 years - and may also impact Human Activity:
from short-term Economic Patterns, Cycles and Innovation Trends to long-term Technology Waves and
the rise and fall of Civilizations. A possible explanation might be found in Resonance Harmonics of
Milankovitch Cycle (20-100 kyr) / sub-Cycle Periodicity - resulting in Interference Phenomena from
periodic waves being reinforced and cancelled. Dansgaard-Oeschger (D/O) events with precise
1,470-year intervals occurred repeatedly throughout much of the late Quaternary Period.
Dansgaard-Oeschger (D/O) events were first reported in Greenland ice cores by scientists Willi
Dansgaard and Hans Oeschger. Each of the 25 observed D/O events in the Quaternary Glaciation
Time Series consists of an abrupt warming to near-interglacial conditions that occurred in a matter of
decades - followed by a long period of gradual cooling down again over thousands of years.
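The reinforcement-and-cancellation mechanism invoked above can be sketched with a toy superposition of two periodic waves. The periods below are arbitrary round numbers chosen only to show the beat mechanism - this is not a climate model, and the values are not taken from the cycles discussed here:

```python
import math

def beat_period(p1, p2):
    """Beat period of two superposed waves with periods p1 and p2:
    the envelope repeats at 1 / |1/p1 - 1/p2|."""
    return 1.0 / abs(1.0 / p1 - 1.0 / p2)

def composite(t, p1, p2):
    """Superposition of two unit-amplitude cycles - reinforcement where
    the phases align, cancellation where they oppose."""
    return math.sin(2 * math.pi * t / p1) + math.sin(2 * math.pi * t / p2)

# Two hypothetical sub-cycles of 100 and 110 time units combine into a
# much longer 1,100-unit beat envelope:
print(beat_period(100, 110))  # 1100.0 (approximately)
```

The point is structural: two relatively short cycles with nearby frequencies can generate apparent periodicity at a far longer timescale, which is the kind of resonance-harmonic explanation proposed for sub-Milankovitch intervals.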
Climate change is not uniform: some areas of the globe (Arctic and Antarctic) have seen a
dramatic rise in average annual temperature whilst other areas have seen lower temperature
gains. The original published temperature record for Climate Change is in red, while the updated
version is in blue. The black curve is the proposed harmonic component plus the proposed
corrected anthropogenic warming trend. The figure shows in yellow the harmonic component
alone, made of the four cycles, which may be interpreted as a lower boundary limit for the natural
variability. The green area represents the range of the IPCC 2007 GCM projections.
The astronomical / harmonic model forecast since 2000 looks in good agreement with the data
gathered up to now, whilst the IPCC model projection is not in agreement with the steady
temperature observed since 2000. This may be due to other effects, such as cooling due to
increased water evaporation (humidity has increased about 4% since measurements began in the
18th century) or cloud seeded by jet aircraft condensation trails, which reduce solar forcing by
reflecting energy back into space. Both short-term solar-lunar cycle climate forecasting and
long-term Milankovitch solar forcing cycles point towards a natural cyclic phase of gradual
cooling - which partially offsets those Climate Change factors (CO2 etc.) due to Human Actions.
Scafetta on his latest paper: Harmonic climate model versus the IPCC general circulation climate models
Clustering Phenomena in Big Data
Clustering in Big Data
A Cluster is a group of the same or similar data elements
which are aggregated or closely distributed together
Data Set Mashing and Big Data Global Content Analysis drive Horizon Scanning,
Monitoring and Tracking processes. Numerous, apparently unrelated RSS and
other Information Streams and Data Feeds are loaded into Very Large Scale (VLS)
DWH Structures and Document Management Systems for Real-time Analytics - searching
for and identifying possible signs of relationships hidden in data (Facts / Events) in order to
discover and interpret previously unknown Data Relationships driven by hidden Clustering
Forces. These are revealed via Weak Signals indicating emerging and developing Application
Scenarios, Patterns and Trends - in turn predicting possible, probable and alternative
global transformations which may unfold as future Wild Card or Black Swan events.
Clustering in Big Data
The profiling and analysis of large aggregated datasets in order to determine a natural
structure of groupings provides an important technique for many statistical and analytic
applications. Cluster analysis on the basis of profile similarities or geographic distribution
is a method where no prior assumptions are made concerning the number of groups, group
hierarchies or internal structure. Geo-demographic techniques are frequently used in order
to profile and segment populations by natural groupings - such as common behavioural traits,
Clinical Trial, Morbidity or Actuarial outcomes - along with many other shared characteristics
and common factors.....
Clustering in Big Data
BIG DATA ANALYTICS PROFILING, CLUSTERING and 4D GEOSPATIAL ANALYSIS
The profiling and analysis of large aggregated datasets in order to determine a natural
structure of data relationships or groupings, is an important starting point forming the basis of
many mapping, statistical and analytic applications. Cluster analysis of implicit similarities -
such as time-series demographic or geographic distribution - is a critical technique where no
prior assumptions are made concerning the number or type of groups that may be found, or
their relationships, hierarchies or internal data structures. Geospatial and demographic
techniques are frequently used in order to profile and segment populations by natural
groupings. Shared characteristics or common factors such as Behaviour / Propensity or
Epidemiology, Clinical, Morbidity and Actuarial outcomes allow us to discover and explore
previously unknown, concealed or unrecognised insights, patterns, trends or data relationships.
The results of spatial data analysis are largely dependent upon the type,
quantity, distribution and data quality of the spatial objects under analysis.
World-wide Visitor Count GIS Mapping
Geo-demographic Clustering in Big Data
GEODEMOGRAPHIC PROFILING and CLUSTERING IN BIG DATA
[Diagram: Event relationship network connecting Events A - H]
The above is an illustration of Event relationships - how Events might be connected. Any detailed,
intimate understanding of the connection between Events may help us to answer questions such as: -
Answering questions such as these allows us to plan our Event Management approach and Risk
mitigation strategy and to decide how better to focus our Incident / Event resources and effort.
Event Clusters and Connectivity
Aggregated Events include coincident, related, connected and interconnected Events: -
Related - two or more Events materialise in the same domain sharing common
Event features or characteristics (may share a possible hidden common trigger or
cause and so are candidates for further analysis and investigation)
Connected - two or more Events materialise in the same domain due to the same
trigger (common cause)
A series of Aggregated Events may result in a significant cumulative impact - and are
therefore frequently identified incorrectly as Wild-card or Black Swan Events - rather
than just simply as event clusters or event storms.....
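One hedged way to operationalise these Event categories is to treat each shared trigger or common feature as a link, then group Events into clusters via union-find (a standard connected-components technique - the event numbers and links below are hypothetical, not from any diagram in this document):

```python
def event_clusters(events, links):
    """Group events into clusters using union-find: two events fall in the
    same cluster when a chain of links (shared trigger, common feature,
    same domain) connects them."""
    parent = {e: e for e in events}

    def find(e):
        while parent[e] != e:
            parent[e] = parent[parent[e]]  # path halving keeps trees shallow
            e = parent[e]
        return e

    for a, b in links:
        parent[find(a)] = find(b)  # union the two clusters

    clusters = {}
    for e in events:
        clusters.setdefault(find(e), set()).add(e)
    return list(clusters.values())

# Hypothetical example: Events 2-5 share one trigger chain, 6 and 7 another;
# Events 1 and 8 stand alone.
groups = event_clusters(
    [1, 2, 3, 4, 5, 6, 7, 8],
    [(2, 3), (3, 4), (4, 5), (6, 7)],
)
print(sorted(sorted(g) for g in groups))  # [[1], [2, 3, 4, 5], [6, 7], [8]]
```

Grouping events this way helps distinguish genuine Wild Card impacts from the cumulative effect of an ordinary event cluster or event storm.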
Event Clusters and Connectivity
[Diagram: Risk Event Cluster - Events 1 - 8 linking Claimant 1, Claimant 2, a Residence and a Vehicle]
The above is an illustration of Event relationships - how Risk Events might be connected. A detailed and
intimate understanding of Event clusters and the connection between Events may help us to understand: -
What is the relationship between Events 1 and 8, and what impact do they have on Events 2 - 7?
Events 2 - 5 and Events 6 and 7 occur in clusters - what are the factors influencing these clusters?
Answering questions such as these allows us to plan our Risk Event management approach and mitigation
strategy and to decide how to better focus our resources and effort on Risk Events and fraud management.
Aggregated Event Types
[Diagram: Coincident Events (separate triggers) versus Related Events (common trigger) for Events F, G and H]
Event Complexity Map
Multi-channel Retail - Digital Architecture
"The idea behind Tufte in R is to use R - the easiest and most powerful
open-source statistical analysis programming language - to replicate
the excellent data visualisation practices developed by Edward Tufte."
- Diego Marinho de Oliveira - Lead Data Scientist / Ph.D. candidate
Social Intelligence The Emerging Big Data Stack
Forensic Big Data
Forensic Big Data combines the use of Social Media and Social Mapping
Data in order to understand intimate inter-personal relationships for the purpose
of National Security, anti-Trafficking and Fraud Prevention through the
identification, composition, activity analysis and monitoring of Criminal
Enterprises and Terrorist Cells.....
Cluster Analysis is a technique used to explore very large volumes of transactional and
machine-generated (automatic) data, social media and internet content and information -
in order to discover previously unknown, unrecognised or hidden data relationships.
Clustering is an essential tool for any Big Data problem. Cluster Analysis of both
explicit (given) and implicit (discovered) data relationships in Big Data is a critical
technique which attempts to explain the nature, cause and effect of the forces which drive
clustering. Any observed profiled data similarities - geographic or temporal aggregations,
mathematical or statistical distributions - may be explained through Causal Layered Analysis.
Choice of clustering algorithm and parameters is both process- and data-dependent.
Approximate Kernel K-means provides a good trade-off between clustering accuracy and
data volumes, throughput, performance and scalability.
The profiling and analysis of very large aggregated datasets to determine natural or
implicit data relationships and discover hidden common factors and data structures -
where no prior assumptions are made concerning the number or type of groups - is
driven by uncovering previously unknown data relationships and natural groupings.
The discovery of such Cluster / Group relationships, hierarchies or internal data
structures is an important starting point, forming the basis of many statistical and
analytic applications which are designed to expose hidden data relationships.
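The mechanics of centroid-based clustering can be sketched with plain k-means - a simpler relative of the Approximate Kernel K-means named above, shown here in pure Python as an illustration only (the points and seeds are made up; in practice no prior group count is assumed, so an exploratory analysis would try several values of k):

```python
import math

def kmeans(points, centroids, iterations=50):
    """Plain k-means: repeatedly assign each point to its nearest centroid,
    then move each centroid to the mean of its assigned points."""
    for _ in range(iterations):
        groups = [[] for _ in centroids]
        for p in points:
            nearest = min(range(len(centroids)),
                          key=lambda i: math.dist(p, centroids[i]))
            groups[nearest].append(p)
        # Recompute each centroid; keep the old one if its group emptied.
        centroids = [
            tuple(sum(c) / len(g) for c in zip(*g)) if g else centroids[i]
            for i, g in enumerate(groups)
        ]
    return centroids, groups

# Two well-separated blobs; the seed centroids are deliberately rough.
pts = [(1.0, 1.1), (0.9, 1.0), (1.1, 0.9), (8.0, 8.1), (7.9, 8.0), (8.1, 7.9)]
centres, clusters = kmeans(pts, [(0.0, 0.0), (10.0, 10.0)])
print(centres)
```

The assign / recompute loop converges on the natural group centres - here near (1, 1) and (8, 8) - which is the "natural structure of groupings" that the profiling techniques above set out to find at scale.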
Astrophysics - Distribution of Matter across the Universe through Space and Time: 4D distribution of
Star Systems, Stellar Clusters, Galaxies and Galactic Clusters; dimensions of Mass / Energy and
Space / Time; data from Astronomy Images (Microwave, Infrared, Optical, Ultraviolet, Radio, X-ray,
Gamma-ray) captured by Optical, Infrared, Radio and X-ray Telescopes; driving forces include
Gravity, Dark Matter, Dark Energy and Dark Flow.
Climate Change: Temperature Changes (Hot / Cold), Precipitation Changes (Dry / Wet) and Ice-mass
Changes (more / less Sea and Land Ice); measures include Average Temperature, Average
Precipitation and Greenhouse Gases %; data from Weather Stations, Ice Cores and Tree-rings;
driving forces include Solar, Oceanic and Atmospheric Forcing.
Actuarial Science - Morbidity, Clinical Trials, Epidemiology: Place / Date of Birth, Place / Date of
Death and Cause of Death; dimensions of Birth / Death Medical Events, Longevity, Geography and
Time; Biomedical, Demographic and Geographic data from Registers of Births and Deaths and
Medical Records; driving factors include Health, Wealth and Demographics.
Price Curves - Economic Modelling, Long-range Forecasting: Economic Growth (Bull Markets),
Economic Recession (Bear Markets), Price Movements and Daily Closing Prices; dimensions of
Monetary Value, Geography and Time; data from Real (Austrian) GDP, Foreign Exchange Rates,
Interest Rates and Stock / Commodity Exchange prices published by Governments, Central Banks
and Money Markets; driving forces include Business Cycles, Economic Trends, Market Sentiment,
Fear and Greed, and Supply / Demand.
Business Clusters - Digital / Fin Tech, Leisure / Tourism, Creative / Academic: Retail Parks,
Technology, Resorts and Arts / Sciences; dimensions of Company / SIC, Geography and Time;
actors include Entrepreneurs, Start-ups, Mergers, Acquisitions, Investors, NGAs, Government and
Academic Bodies; driving factors include Capital / Finance and Political, Economic and Social policy.
Elite Team Sports - Performance Science: Winners and Losers, League Tables and Medal Tables;
dimensions of Team / Athlete, Sport / Club, Geography and Time; data from Sporting Event
Performance Data, Biomedical Data, RSS News Feeds, Social Media, Hawk-Eye and Pro-Zone,
overseen by Sports Governing Bodies; driving factors include Technique, Application, Form / Fitness,
Ability / Attitude, Training / Coaching and Speed / Endurance.
Future Management: Human Activity and Natural Events - Waves, Cycles, Patterns and Trends;
dimensions of Random Events, Geography and Time; signals include Weak Signals, Strong Signals,
Wild Card and Black Swan Events, gathered from Global Internet Content via Big Data Analytics -
Horizon Scanning, Tracking and Monitoring; outputs include Random Events, Waves, Cycles,
Patterns, Trends and Extrapolations.
Distributed Clustering Model Performance
Distributed Approximate Kernel K-means
2.3 GHz quad-core Intel Xeon processors, with 8 GB memory per node, in the intel07 cluster

Size of dataset (no. of records) | Run-time Benchmark Performance (Speedup Factor)
10K | 3.8
100K | 4.8
1M | 3.8
10M | 6.4
HPCC Clustering Models
High Performance / High Concurrency Real-time Delivery (HPCC)
Distributed Clustering Models
Hadoop
Clustering and Managing Data.....
Managing Data Transfers in Networked Computer Clusters using Orchestra
To illustrate I/O Bottlenecks, we studied Data Transfer impact in two clustered computing systems: -
Mosharaf Chowdhury, Matei Zaharia, Justin Ma, Michael I. Jordan, Ion Stoica
University of California, Berkeley
{mosharaf, matei, jtma, jordan, istoica}@cs.berkeley.edu
Clustering and Managing Data.....
The differentials between new and old technology have a way of revealing themselves by
demonstrating what is elastic and dynamic - compared to what is rigid and static.
It's not a measure of which technology is considered to be good or bad. It simply
represents the progression from client / server technology to the Internet-scale,
data-driven services that are now gaining such critical momentum.
Using antonyms helps better correlate what is considered a cloud service and what
is not, as well as the relative relationship between an online service like Google
Docs as compared to a Microsoft Word document. The differences can help us
understand the new way IT services are delivered as compared to older methods.
The big Internet companies have had to create an infrastructure that could scale
and be highly efficient and fast. The result: new ways to think of how we manage
data.
Clustering and Managing Data.....
Hadoop has become popular as a big data platform because it's scalable, flexible,
cost-effective and can handle a range of data types (also known as multi-structured
data) without the data-modelling and transformation stages associated with relational
database technologies. The major drawback, however, is that the options for data
analysis on-the-fly in Hadoop range from the limited (through Hive, for example) to
the exceedingly restricted, slow and complicated (via batch-oriented MapReduce
processing). Plenty of vendors are working on solutions to this problem - notably: -
EMC claims that its new Pivotal Labs HD distribution now has this problem resolved.
They announced that they have resolved one of the major limitations of the Apache
Hadoop platform by leveraging its Greenplum massively parallel processing (MPP)
database to query the data directly from the Hadoop Distributed File System (HDFS).
Clustering and Managing Data.....
Cluster computing applications such as Hadoop, Dryad, Swift, Flume and Millwheel
transfer massive amounts of data between each of their computational stages. These
transfers can have a significant impact on stage performance and throughput -
accounting for more than 50% of elapsed job time. Despite this severe impact, there
has been relatively little work done on optimizing the performance of these data
transfers - with networking researchers traditionally focusing on data-flow traffic
management.
"It's really cool to start seeing folks using a multi-structured data store as the storage
layer for SQL-based analysis," said John Myers, senior analyst at Enterprise
Management Associates, in an interview with Information Week. The combination will
enable companies to use Hadoop as a single platform for both structured and multi-
structured data, essentially combining data warehouses and Hadoop, Myers said. With
HAWQ, business users and analysts can use conventional SQL querying and BI tools for
their work while data scientists can continue to access data directly using programming
APIs and Hadoop-related tools such as MapReduce, Pig, Hive, Sqoop and Mahout.
Clustering and Managing Data.....
While in large part successful, these solutions have so far been focusing on scheduling
and managing computation and storage resources, whilst mostly ignoring network
resources. The Berkeley solution for Managing Data Transfers in Networked Computer
Clusters is Orchestra.
In the last decade we have seen rapid growth of cluster computing frameworks in order
to analyze the increasing amounts of data collected and generated by web services like
Google, Facebook and Yahoo!. These frameworks (e.g., MapReduce, Dryad, CIEL and
Spark) typically implement a data-flow computation model - where a series of datasets
pass sequentially through a set of processing stages.
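The data-flow model can be sketched with the classic word-count example - a map stage emitting key / value pairs, a shuffle stage grouping them by key, and a reduce stage folding each group. This is pure Python written to show the model only, not any specific framework's API:

```python
from itertools import groupby

def map_stage(records):
    """Map: emit (key, value) pairs - here one ('word', 1) per word."""
    for line in records:
        for word in line.split():
            yield (word, 1)

def shuffle(pairs):
    """Shuffle: group intermediate pairs by key. In a real cluster this is
    the all-to-all data transfer between the map and reduce stages."""
    for key, group in groupby(sorted(pairs), key=lambda kv: kv[0]):
        yield key, [v for _, v in group]

def reduce_stage(grouped):
    """Reduce: fold each key's values - here a sum per word."""
    return {key: sum(values) for key, values in grouped}

counts = reduce_stage(shuffle(map_stage(["to be or not", "to be"])))
print(counts)  # {'be': 2, 'not': 1, 'or': 1, 'to': 2}
```

Each dataset passes sequentially through the stages, and the shuffle in the middle is exactly the cross-network transfer whose cost the following paragraphs quantify.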
Many jobs deployed in these frameworks manipulate massive amounts of data and run on
clusters consisting of as many as tens of thousands of machines. Due to the very high
cost of these clusters, operators often aim to maximize cluster utilization, whilst
accommodating a variety of applications, workloads, and user requirements. To achieve
these goals, several solutions have recently been proposed in order to reduce job
completion time.
Clustering and Managing Data.....
However, managing and optimizing network activity is critical for improving job
performance. Indeed, Hadoop traces from a 3000-node cluster at Facebook
showed that, on average, transferring data between successive stages
accounts for 33% of the running times of jobs with reduce phases. Existing
proposals for full bisection bandwidth networks along with flow-level scheduling
can improve network performance, but they do not account for collective
behaviours of flows due to the lack of job-level semantics.
We then measured what fraction of the job's lifetime was spent in this shuffle phase.
This is a conservative estimate of the impact of shuffles, because reduce tasks can
also start fetching map outputs before all the map tasks have finished. We found that
32% of jobs had no reduce phase (i.e., only map tasks). This is common in data
loading jobs. For the remaining jobs, we plot a CDF of the fraction of time spent in the
shuffle phase (as defined above) in Figure 1. On average, the shuffle phase accounts
for 33% of the running time in these jobs. In addition, in 26% of the jobs with reduce
tasks, shuffles account for more than 50% of the running time, and in 16% of jobs, they
account for more than 70% of the running time. This confirms widely reported results
that the network creates a real CPU I/O-wait bottleneck in MapReduce.
Hadoop Framework
Big Data Applications
Science and Technology: -
Pattern, Cycle and Trend Analysis
Horizon Scanning, Monitoring and Tracking
Weak Signals, Wild Cards, Black Swan Events
Civil and Military Intelligence: -
Digital Battlefields of the Future - Data Gathering
Future Combat Systems - Intelligence Database
Person of Interest Database - Criminal Enterprise, Political organisations and Terrorist Cell networks
Remote Warfare - Threat Viewing / Monitoring / Identification / Tracking / Targeting / Elimination
HDCCTV Automatic Character / Facial Recognition
Global Internet Content Management
Multi-channel Retail Analytics: -
Customer Profiling and Segmentation
Human Behaviour / Predictive Analytics
Social Media Analytics: -
Market Data Management
Global Internet Content Management
Security: -
Security Event Management - HDCCTV, Proximity and Intrusion Detection, Motion and Fire Sensors
Emergency Incident Management - Response Command, Control and Co-ordination
Smart Devices and Smart Apps Services: -
Call Details Records
Internet Content Browsing
Media / Channel Selections
Movies, Video Games and Playlists
Broadband / Home Entertainment
Care in the Community: -
Biomedical Data Streaming
Assisted Living at Home
Smart Hospitals and Clinics
Internet of Things (IoT): -
SCADA Remote Sensing, Monitoring and Control
Smart Grid Data (machine-generated data)
Vehicle Telemetry Management
Intelligent Building Management
Smart Metering / Home Energy
Smart Homes Automation
Energy Consumption Details Records
Comparing Data in RDBMS, Appliances and Hadoop
            RDBMS DWH        DWH Appliance           Hadoop Cluster
Data size   Gigabytes        Terabytes               Petabytes
Updates     Read and write   Write once, read many   Write once, read many
1. SENSE LAYER - Remote Monitoring and Control Devices - WHAT and WHEN?
4. GEO-DEMOGRAPHIC LAYER - Social Media, People and Places - WHO and WHERE?
5. INFORMATION LAYER - Big Data and Internet Content data set mashing - HOW?
Data sources: Public house and Market survey data, TV Set-top Box and Smart App data, Channel Selections and Playlists
The Pyramid: Big Data Administration - Insights and Analytics - Big Data Consumers
Redis Redis is an open-source key-value database, available on AWS, Pivotal etc.
SSD Solid State Drive (SSD) configured as cached memory / fast HDD
SAP HANA SAP HANA Cloud - In-memory Big Data Analytics Appliance
1010 Data Big Data Discovery, Visualisation and Sharing Cloud Platform
Dion Hinchcliffe
10 Ways To Complement the Enterprise RDBMS Using Hadoop
The relational database is finally showing signs of age as data volumes and network
speeds grow faster than Moore's Law-driven hardware improvements can keep pace
with. The Web in particular is driving innovation in new ways of processing
information, as the data footprints of Internet-scale applications become prohibitive for
traditional SQL database engines.
When it comes to database processing today, change is being driven by (at least) four factors: -
Speed. The seek times of physical storage are not keeping pace with improvements in network speeds.
Scale. The difficulty of scaling the RDBMS out efficiently (clustering beyond a handful of servers is
notoriously hard).
Integration. Today's data processing tasks increasingly have to access and combine data from many
different non-relational sources, often over a network.
Volume. Data volumes have grown from tens of gigabytes in the 1990s to hundreds of terabytes and
often petabytes in recent years.
Data Delivery and Consumption - Consume End-User Data: Mobile Enterprise Platforms (MEAPs), Smart Devices, Smart Apps, Smart Grid; Data Presentation and Display: Excel, Web, Mobile
Analytics Use Cases: -
Clinical Trial, Morbidity and Actuarial Outcomes
Market Sentiment and Price Curve Forecasting
Horizon Scanning, Tracking and Monitoring
Weak Signal, Wild Card and Black Swan Event Forecasting
Targeting
Info. Management Tools: -
Biolap, Business Objects, Jedox, Cognos, Sagent, Hyperion, Polaris, Microstrategy
Analytics Engines - Hadoop: -
Apache Hadoop Framework (HDFS, Map / Reduce), Metlab R, Autonomy, Vertica
Performance Acceleration: -
GPUs - massive parallelism
SSDs - in-memory processing
DBMS - ultra-fast data replication
Data Management Tools: -
Ab Initio, DataFlux, Ascential, Embarcadero, Genio, Informatica, Orchestra, Talend
High-Volume Data Acquisition: -
Data Discovery and Collection
News Feeds and Digital Media
Global Internet Content
Social Mapping
Social Media
Social CRM
Data Warehouse Appliances: -
Teradata, SAP HANA, Netezza (now IBM), Greenplum (now EMC2), Extreme Data xdg, Zybert Gridbox
Data Management Processes: -
Data Audit
Data Profile
Data Quality Reporting
Data Quality Improvement
Data Extract, Transform, Load
Hadoop Framework
These datasets would previously have been very challenging and expensive to take on with a
traditional RDBMS using standard bulk-load and ETL approaches - never mind trying to efficiently
combine multiple data sources simultaneously, or dealing with volumes of data that simply can't
reside on any single machine (or often even dozens). Hadoop deals with this by using a distributed
file system (HDFS) that's designed to deal coherently with datasets that can only reside across
distributed server farms. HDFS is also fault resilient, so it doesn't impose the overhead of RAID
drives and mirroring on individual nodes in a Hadoop compute cluster, allowing the use of truly low-
cost commodity hardware.
So what does this specifically mean to enterprise users who would like to improve their data
processing capabilities? Well, first there are some catches to be aware of. Despite enormous
strengths in distributed data processing and analysis, MapReduce is not good in some key areas where
the RDBMS is extremely strong (and vice versa). The MapReduce approach tends to have high
latency (i.e. it is not suitable for real-time transactions) compared to relational databases, and is
strongest at processing large volumes of write-once data where most of the dataset needs to be
processed at one time. The RDBMS excels at point queries and updates, while MapReduce is best
when data is written once and read many times.
The story is the same with structured data, where the RDBMS and the rules of database
normalization provide precise, time-tested laws for preserving the integrity of structured
data. MapReduce is designed for a less structured, more federated world where
schemas may be used but data formats can be much looser and freeform.
Big Data Process Overview: Data Managers - Big Data Management - Revenue Stream
While there are many non-relational database approaches out there today (see my emerging IT and
business topics post for a list), nothing currently matches Hadoop for the amount of attention it's
receiving or the concrete results being reported in recent case studies. A quick look at
the list of organizations that have applications powered by Hadoop includes Yahoo! with over
25,000 nodes (including a single, massive 4,000-node cluster), Quantcast, which says it has over
3,000 cores running Hadoop and currently processes over 1PB of data per day, and Adknowledge,
which uses Hadoop to process over 500 million clickstream events daily using up to 200 nodes.
Split-Map-Shuffle-Reduce Process
Data Provisioning (Raw Data Stream) - Split - Map - Shuffle - Reduce (Key / Value Pairs) - Actionable Insights for Big Data Consumers
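The split-map-shuffle-reduce flow can be sketched in miniature. The following is a plain-Python simulation of the phases (not the Hadoop API itself), using the canonical word-count job; all names here are illustrative:

```python
from itertools import groupby
from operator import itemgetter

def run_mapreduce(records, mapper, reducer):
    """Simulate the Split-Map-Shuffle-Reduce flow on an in-memory list of records."""
    # Map: each input record yields zero or more (key, value) pairs
    mapped = [pair for record in records for pair in mapper(record)]
    # Shuffle: sort by key and merge pairs sharing the same key into buckets
    mapped.sort(key=itemgetter(0))
    shuffled = {k: [v for _, v in grp] for k, grp in groupby(mapped, key=itemgetter(0))}
    # Reduce: collapse each bucket of values into one result per key
    return {k: reducer(k, vs) for k, vs in shuffled.items()}

# Word count: each line is split into words, each word emits (word, 1),
# and the reducer sums the ones for each word.
lines = ["big data", "big clusters", "data"]
counts = run_mapreduce(
    lines,
    mapper=lambda line: [(word, 1) for word in line.split()],
    reducer=lambda word, ones: sum(ones),
)
# counts == {"big": 2, "clusters": 1, "data": 2}
```

In a real cluster the map and reduce lambdas run on many nodes in parallel, and the shuffle step moves data over the network - which is exactly the 33%-of-runtime cost measured above.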
RDBMS and Hadoop: Apples and Oranges?
From this it's clear that the MapReduce model cannot replace the traditional
enterprise RDBMS. However, it can be a key enabler of a number of
interesting scenarios that can considerably increase flexibility, turn-around
times, and the ability to tackle problems that weren't possible before.
On the latter point, the key is that SQL-based processing of data tends not to
scale linearly beyond a certain ceiling, usually just a handful of nodes in a
cluster. With MapReduce, you can consistently get performance gains by
increasing the size of the cluster. In other words, double the size of a Hadoop
cluster and a job will run roughly twice as fast, triple it and the same holds, etc.
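The linear-scaling claim is simple arithmetic in the ideal case. A hedged illustration with made-up numbers, ignoring coordination and shuffle overheads that erode the ideal in practice:

```python
def ideal_runtime(single_node_hours, nodes):
    """Ideal linear speedup: total work divided evenly across the cluster."""
    return single_node_hours / nodes

job = 120.0  # hypothetical: 120 hours of work on a single node
assert ideal_runtime(job, 10) == 12.0  # a 10-node cluster
assert ideal_runtime(job, 20) == 6.0   # double the cluster, halve the runtime
assert ideal_runtime(job, 30) == 4.0   # triple it, a third of the runtime
```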
Ten Ways To Improve the RDBMS with Hadoop
So Hadoop can complement the enterprise RDBMS in a number of powerful ways. These include: -
1. Accelerating nightly batch business processes. Many organizations have production transaction systems
that require nightly processing and have narrow windows to perform their calculations and analysis
before the start of the next day. Since Hadoop can scale linearly, it can enable internal or external on-
demand cloud farms to dynamically handle shrinking performance windows and take on larger-volume
situations that an RDBMS just can't easily deal with. This doesn't eliminate the import/export challenges,
depending on the application, but it can certainly compress the windows between them.
2. Storage of extremely high volumes of enterprise data. The Hadoop Distributed File System is a marvel in
itself and can be used to hold extremely large data sets safely on commodity hardware, long term, that
otherwise couldn't be stored or handled easily in a relational database. I am specifically talking about
volumes of data that today's RDBMSs would still have trouble with, such as dozens or hundreds of
petabytes, which are common in genetics, physics, aerospace, counter-intelligence and other
scientific, medical, and government applications.
3. Creation of automatic, redundant backups. Hadoop can keep the data that it processes, even after
it has been imported into other enterprise systems. HDFS creates a natural, reliable, and easy-to-use
backup environment for almost any amount of data at reasonable prices, considering that it's essentially a
high-speed online data storage environment.
4. Improving the scalability of applications. Very low-cost commodity hardware can be used to power Hadoop
clusters, since redundancy and fault tolerance are built into the software instead of relying on expensive
enterprise hardware or proprietary software alternatives. This makes adding more capacity (and therefore
scale) easier to achieve, and Hadoop is an affordable and very granular way to scale out instead of up. While
there can be cost in converting existing applications to Hadoop, for new applications it should be a standard
option in the software selection decision tree. Note: Hadoop's fault tolerance is acceptable, not best-of-breed,
so check this against your application's requirements.
5. Use of Java for data processing instead of SQL. Hadoop is a Java platform and can be used by just about anyone
fluent in the language (other language options are becoming available via APIs). While this won't help shops
that have plenty of database developers, Hadoop can be a boon to organizations that have strong Java
environments with good architecture, development, and testing skills. And while yes, it's possible to use
languages such as Java and C++ to write stored procedures for an RDBMS, it's not a widespread activity.
6. Producing just-in-time feeds for dashboards and business intelligence. Hadoop excels at looking at enormous
amounts of data and providing detailed analysis of business data that an RDBMS would often take too long, or
be too expensive, to carry out. Facebook, for example, uses Hadoop for daily and hourly summaries of its
150 million+ monthly visitors. The resulting information can be quickly transferred to BI, dashboards, or mashup
platforms.
Informatica / Hortonworks Vibe
7. Handling urgent, ad hoc requests for data. While expensive enterprise data warehousing software can certainly
do this, Hadoop is a strong performer when it comes to quickly asking and getting answers to urgent questions
involving extremely large datasets.
8. Turning unstructured data into relational data. While ETL tools and bulk load applications work well with
smaller datasets, few can approach the data volume and performance that Hadoop can, especially at a similar
price/performance point. The ability to take mountains of inbound or existing business data, spread the work
over a large distributed cloud, add structure, and import the result into an RDBMS makes Hadoop one of the
most powerful database import tools around.
9. Taking on tasks that require massive parallelism. Hadoop has been known to scale out to thousands of nodes in
production environments. Even better, it requires relatively little innate programming skill to achieve this, since
parallelism is an intrinsic property of the platform. While you can do the same with SQL, it requires some skill
and experience with the techniques; in other words, you have to know what you're doing. For organizations that
are experiencing ceilings with their current RDBMS, Hadoop can help break through them.
10. Moving existing algorithms, code, frameworks, and components to a highly distributed computing
environment. Done right - and there are challenges depending on what your legacy code wants to do -
Hadoop can be used as a way to migrate old, single-core code into a highly distributed environment that provides
efficient, parallel access to ultra-large datasets. Many organizations already have proven code that is tested,
hardened and ready to use, but is limited without an enabling framework. Hadoop adds the mature distributed
computing layer that can transition these assets to a much larger and more powerful modern environment.
EMC has announced that it has resolved one of the big limitations of the Apache Hadoop platform by finding a way to use its Greenplum massively parallel processing (MPP) database appliance to directly query data in the Hadoop Distributed File System (HDFS).
Introduction to Hadoop HDFS
The core Hadoop project solves two problems with big data: fast, reliable storage and
batch processing. We are going to focus on the default storage engine and how to integrate
with it using its REST API. Hadoop is actually quite easy to install, so let's see what we can
do in 15 minutes. I've assumed some knowledge of the Unix shell, but hopefully it's not too
difficult to follow; the software versions are listed in the previous post.
The default storage engine is HDFS, a distributed file system with directories and files
Data written to HDFS is immutable, although there is some support for appends
HDFS is suited for large files - avoid lots of small files
If you think about batch processing billions of records, large and immutable files make
sense. You don't want the disk spending time doing random access and dealing with
fragmented data if you can stream the whole lot from beginning to end.
Files are split into blocks so that nodes can process files in parallel using map-reduce. By
default a Hadoop cluster will replicate each file block to 3 nodes, and each file block can
take up to the configured block size (~64MB).
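The storage maths follows directly from those two defaults. A small sketch, assuming a 64 MB block size and replication factor 3 as stated above (not tied to any real cluster configuration):

```python
import math

BLOCK_SIZE_MB = 64   # default block size assumed in the text
REPLICATION = 3      # each block is copied to 3 nodes by default

def hdfs_blocks(file_size_mb):
    """Number of HDFS blocks a file is split into (the last block may be partial).
    Each block can be handed to a separate map task, which is where the
    parallelism comes from."""
    return max(1, math.ceil(file_size_mb / BLOCK_SIZE_MB))

def raw_storage_mb(file_size_mb):
    """Raw cluster storage consumed: every byte is stored REPLICATION times."""
    return file_size_mb * REPLICATION

# A hypothetical 1 GB log file:
assert hdfs_blocks(1024) == 16        # 1024 / 64 -> 16 blocks, 16 map tasks
assert raw_storage_mb(1024) == 3072   # three full copies across the cluster
```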
Starting up a local Hadoop instance for development is pretty simple, and even easier as
we're only going to start half of it. The only setting that's needed is the host and port where
the HDFS master namenode will exist, but we'll add a property for the location of the file
system too.
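With the namenode address in hand, talking to HDFS over the REST API mentioned above comes down to building WebHDFS URLs. A minimal sketch: the host and port are assumptions for a local development namenode, and no request is actually sent here - the resulting URL can be passed to any HTTP client:

```python
from urllib.parse import urlencode

NAMENODE = "localhost:50070"  # assumed HTTP address of a local dev namenode

def webhdfs_url(path, op, **params):
    """Build a WebHDFS v1 URL for the given HDFS path and operation
    (e.g. LISTSTATUS to list a directory, OPEN to read a file)."""
    query = urlencode({"op": op, **params})
    return f"http://{NAMENODE}/webhdfs/v1{path}?{query}"

# List a directory, then read a file from a given offset:
assert webhdfs_url("/user/dev", "LISTSTATUS") == \
    "http://localhost:50070/webhdfs/v1/user/dev?op=LISTSTATUS"
assert webhdfs_url("/user/dev/data.csv", "OPEN", offset=0) == \
    "http://localhost:50070/webhdfs/v1/user/dev/data.csv?op=OPEN&offset=0"
```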
Intel reveals its own Apache Hadoop
Like EMC and Hewlett-Packard, the overarching idea behind Intel's Hadoop distribution
is to exploit massive amounts of big data for the purpose of enabling better business
decisions while also identifying potential security threats more quickly.
The big picture for Intel is to beef up its data centre portfolio: both analytics and a
framework that can connect and manage multiple distributed devices across
an entire enterprise infrastructure landscape in a scalable manner.
Intel is framing its deployment of the open source software framework as a ground-up
approach by baking Hadoop directly into the silicon level. The Santa Clara, California -
based corporation explained that it is utilizing Hadoop because it is open and scalable,
thus making it a prime technology for handling evolving data centre challenges in the
enterprise space.
We're now seeing in many cases, from Hadoop to OpenStack, that open source technology
is driving high-performance computing and cloud infrastructure - suggesting that the
Hadoop framework, in particular, has enormous potential. Hadoop will be a foundational
layer within enterprises that can support a variety of application stacks on top of, or via,
a horizontal distribution.
Intel added that deploying and managing this Intel-Hadoop distribution should be simple
for IT managers because it is "automatically configured to take the guesswork out of
performance tuning". The Ultrabook maker explained that it optimized its Xeon chips, in
particular, for networking and I/O use cases to "enable new levels" of data analytics.
HP HAVEn Big Data Platform
This article describes Orchestra, the Berkeley solution for managing Hadoop data transfers across networked computer clusters.
1. Increasing volume from companies keeping detail data, not aggregates, from
many more information sources.
2. More variety in the types of data to be incorporated into queries such as
application logs, sensor time series data, geospatially tagged data, biomedical
data, genomics data, and social media feeds.
3. Diverse storage technologies, due to an increasing variety of data technologies
being used instead of the traditional RDBMS for storing and managing this data.
4. Complex queries generated by advanced clustering, statistical analysis and
wave-form analytics algorithms being applied to Big Data.
Dimensioning a Hadoop cluster depends on many factors. Whilst the main use is still
centred around batch analytics and queries crunching large files, other use cases
are emerging and becoming more common. Think, for instance, of ad-hoc
queries, streaming analytics, in-memory workflows, and near-real-time analytics.
Let's start with a very well-known distributed computing paradigm, specifically the
Hadoop map-reduce operation. This is a batch job stream which is I/O-bound - the
limiting factor for throughput performance being CPU I/O waits, which can typically
account for 90-95% of CPU time in a well-tuned HBase environment and over 99.99%
of CPU time in a poorly-tuned HBase environment.
In short, what Hadoop does is take some chunks of data from storage (typically a file
from a local HDD), process the data while "streaming" the input file, and then write the
results back to file (hopefully again locally). Once the "map" phase is finished, the data is
sorted and merged into buckets sharing the same "key", and the process is repeated once
again in the "reduce" phase of map-reduce.
Distributed Clustering Models
Performance Optimisation
The map-reduce paradigm, when well tuned, can be a very effective and efficient
way of dealing with a large class of parallel computing problems.
However, processing resources and data must be kept as close to each other as
possible, and this is not always feasible. Hence Hadoop map-reduce is an effective
parallelization paradigm so long as, during shuffle-and-sort, data is kept relatively
local, and provided that enough reducers are running in parallel during the reduce
phase - which in turn depends on the way keys are crafted / engineered.
Moreover, this paradigm relies heavily on disk I/O in order to de-couple the various
stages of the computation. Although many classes of problems can be re-coded
by means of map-reduce operations, it is in many cases possible to gain speed
and efficiency by reusing the data already in memory and executing more complex
DAGs (Directed Acyclic Graphs) as larger atomic dataflow operations.
The main idea behind hadoop is to move processing close to the storage, and
allocate enough memory and cores to balance the throughput provided by the
local disks. The ideal hadoop building block is an efficient computing unit with a
full process, storage, and uplink hardware stack tightly integrated.
Distributed Clustering Model Performance
Distributed Approximate Kernel K-means
2.3 GHz quad-core Intel Xeon processors, with 8GB memory in intel07 cluster
Dataset Size (no. of records)   Run-time Performance (Speedup Factor)
10K                             3.8
100K                            4.8
1M                              3.8
10M                             6.4
The Big Data Processing Pipeline
The Data Processing Pipeline is characterized by just a few generic processes: -
1. Sourcing data: Multiple sources of data have to be fed into the Big Data Processing
Pipeline. These data sources have to be identified, scheduled and collected, but they
also need to be checked, cleaned, de-duplicated and moved into a staging area.
2. Data shoe-horning: The staging of source data is necessary because of the next
step in the process, something I call data shoe-horning. This is something most
people don't even bat an eyelid over - it's often not even identified as a distinct
process in the pipeline. But pay attention, because this is where, traditionally, data gets
re-formatted or "shoe-horned" into a relational model and loaded into an RDBMS.
3. Data Quality and Transformation rules: These include everything from de-duplication,
data scrubbing, data cleansing and data validation to (field) mapping and any other
business rules that need to be applied in the transformation processing of the data.
4. "Sink" preparation: The processed data has to make its way to various consumer sinks,
i.e. intermediate data files that are going to be consumed by enterprise, group or end-
user applications.
5. Data distribution: Finally the data has to be distributed and loaded into the various
data consumption stacks that service business applications and products - and it is at
this point that multiple relational data sources are prepped with various target schemas -
very often an operational data warehouse and data marts, possibly a columnar
database, and maybe unstructured content stores serving a search platform (like Solr).
6. Archiving / Purging: Many original raw un-staged data sources, as well as intermediate
data files, are subject to a purge or archival policy. Where this becomes a source of
contention is when it comes time to re-claim or "re-surface" all of this data from its
archive (of course, if you deleted it, you have a different kind of problem altogether).
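The pipeline stages above can be sketched as composable functions. This is a toy illustration with invented record formats and function names - not a production design - showing sourcing, shoe-horning, quality rules and sink preparation in sequence:

```python
def source(feeds):
    """1. Sourcing: collect raw records from multiple feeds into one staging list."""
    return [rec for feed in feeds for rec in feed]

def shoehorn(staged):
    """2. Shoe-horning: force loosely structured records into a fixed schema."""
    return [{"id": r.get("id"), "value": r.get("value")} for r in staged]

def apply_quality_rules(rows):
    """3. Quality/transformation: de-duplicate by id and drop invalid rows."""
    seen, clean = set(), []
    for r in rows:
        if r["id"] is not None and r["value"] is not None and r["id"] not in seen:
            seen.add(r["id"])
            clean.append(r)
    return clean

def prepare_sinks(rows):
    """4-5. Sink preparation and distribution: fan clean rows out to consumers."""
    return {"warehouse": rows, "search_index": [r["value"] for r in rows]}

feeds = [
    [{"id": 1, "value": "a"}, {"id": 1, "value": "a"}],     # duplicate record
    [{"id": 2, "value": "b"}, {"id": None, "value": "x"}],  # invalid record
]
sinks = prepare_sinks(apply_quality_rules(shoehorn(source(feeds))))
# sinks["warehouse"] keeps ids 1 and 2 once each; the invalid row is dropped
```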
As reported by ZDNet, microservers - because of their small size, and the fact
they require less cooling than their traditional counterparts - can also be
densely packed together to save physical space in the datacentre. Scaling
out capacity to meet demand simply requires adding more microservers.
Efficiency is further increased by the fact that microservers typically share
infrastructure controlling networking, power and cooling, which is built into
the server chassis.
Turing Institute
In his Budget announcement, the chancellor, George Osborne, pledged government
support for the Turing Institute, a specialist centre named after the great computer
pioneer Alan Turing, which will provide a British home for studying Data Science and
Big Data Analytics. Clustering and wave-form algorithms in Big Data are the key to
unlocking cycles, patterns and trends in complex (non-linear) systems - Cosmology,
Climate and Weather, Economics and Fiscal Policy - in order to forecast future trends,
outcomes and events with far greater accuracy.
The chancellor, George Osborne, has announced that a £42m Alan Turing Institute is to be
founded to ensure that Britain leads the way in Data Science and Big Data Analytics for
studying complex (non-linear) systems - clustering and wave-form algorithmic research
in both deterministic (human activity) and stochastic (random, chaotic) processes.
Drawing on the name of the famous British mathematician and computer pioneer Alan
Turing - who led the Enigma code-breaking work during the second world war at
Bletchley Park - the institute is intended to help British companies by bringing together
expertise and experience in tackling the challenges of understanding both deterministic
and stochastic systems such as Weather, Climate, Economics, Econometrics and the
impact of Fiscal Policy which require massive data sets and computational power.
Enigma Machine
Turing Institute
The Turing Institute comes at a time when Data Science, Big Data Analytics and
complex-system algorithm research are front and centre on the commercial stage. The
Turing Institute will be the first step to realising the UK's digital innovation potential.
Exploitation of big data by applying analytical methods - statistical analysis, predictive
and quantitative modelling - provides deeper insights and achieves brighter outcomes.
The UK needs a centre of excellence capable of nurturing the talent required to make
British Data Science and Big Data Technology world-class. The cornerstone for the
new digital technologies isn't just infrastructure, but the talent that's needed to found,
innovate and grow technology firms and create a knowledge-based digital economy.
The tender to house the institute will be produced this year. It may be a brand-new
facility or use existing facilities and space in a university, a Treasury spokesman said.
Its funding will come from the Department for Business, Innovation and Skills, and its
chief will report to the science minister, David Willetts. Executive appointments and
establishment numbers for the Turing Institute have yet to be announced.
"The intention is for this work to benefit British companies to take a critical advantage
in the field of Data Science algorithms, analytics and big data," said the spokesman.
The Bombe at Bletchley Park
Turing Institute
Alan Turing was a pivotal figure in mathematics and computing and has long been
recognised as such by fellow mathematicians and computer scientists for his
ground-breaking work on Computational Theory. There already exists a Turing Institute at
Glasgow University, and an Alan Turing Institute in the Netherlands, as well as the Alan
Turing building at the Manchester Institute for Mathematical Sciences.
Osborne's announcement marks further official rehabilitation of a scientist who many see
as having been badly treated by the British establishment after his work during WWII.
Turing, who was homosexual, was convicted of indecency in March 1952, and lost his
security clearance with GCHQ - the successor to Bletchley Park. Turing killed himself in
June 1954 - but was only given an official pardon by the UK government in December
2013 after a series of public campaigns for recognition of his achievements.
Digital Village
Digital Village Strategic Partners
Digital Village is a consortium of Future Management and Future Systems Consulting firms for
Digital Marketing and Lifestyle Strategy Social Media / Big Data Analytics / Mobile / Cloud
Computing / GPS/GIS / Next Generation Enterprise (NGE) / Digital Business Transformation
Nigel Tebbutt
Future Business Models & Emerging Technologies @ INGENERA
Telephone: +44 (0) 7832 182595 (Mobile)
+44 (0) 121 445 5689 (Office)
Email: Nigel-Tebbutt@hotmail.com (Private)