Neuro-linguistic programming (NLP) is a set of techniques (or rituals) and beliefs
that adherents use primarily as an approach to psychotherapy, healing,
communication, and personal development.
NLP was proposed in 1973 by Richard Bandler and John Grinder as
a set of models and principles to describe the relationship between
mind (neuro) and language (linguistic, both verbal and non-verbal)
and how their interaction might be organized (programming) to affect
an individual's mind, body and behavior. It is described by the
original developers as "the study of the structure of subjective
experience". It is predicated that all behaviors have a practically
determinable structure.
NLP is based on access to subconscious engrams and on body-language
cues derived from the observation of “therapeutic wizards”.
Its techniques address behavior change, the transformation of beliefs, and the
treatment of trauma through methods such as reframing and "meta-modeling",
proposed for exploring the personal limits of belief as expressed
in language. NLP has been applied to a number of fields such as
psychotherapy, communication, education, coaching, sport, business
management, interpersonal relationships, and spirituality.
Brain lateralization
Hemispheric differences (brain lateralization) are used to support
assumptions in NLP. Eye movements (and sometimes gestures) are said to correspond
to visual/auditory/kinesthetic representational systems and to
specific regions in the brain. For example, the left hemisphere is said
to be more logical/analytical than the right, which is said
to be more creative/imaginative, or regions of the brain are said to be
specialised for certain functions such as mathematics or language.
Eye accessing cues and NLP representational systems
A core NLP training exercise involves
learning to calibrate eye-movement patterns with internal representations;
the cues are summarised in the sketch after the list below.
• Visual: eyes up to the left or right according to dominant hemisphere
access; high or shallow breathing; muscle tension in the neck; high-pitched/nasal
voice tone; phrases such as “I can imagine the big picture”.
• Auditory: eyes left or right; even breathing from the diaphragm;
even or rhythmic muscle tension; clear midrange voice tone, sometimes
tapping or whistling; phrases such as “Let's tone down the discussion”.
• Kinesthetic: eyes down to the left or right; belly breathing and
sighing; relaxed musculature; slow voice tone with long pauses;
phrases such as “I can grasp a hold of it”.
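As a rough illustration, the cue-to-system mapping above can be written down as a plain data structure. The sketch below is in Python; the dictionary REPRESENTATIONAL_SYSTEMS simply restates the list, while guess_system and its naive keyword-overlap scoring are illustrative assumptions, not part of NLP practice or any empirical claim.

from collections import Counter

# Restatement of the cue list above as a data structure.
REPRESENTATIONAL_SYSTEMS = {
    "visual": {
        "eyes": "up to the left or right",
        "breathing": "high or shallow",
        "muscles": "tension in the neck",
        "voice": "high-pitched / nasal",
        "sample_phrase": "I can imagine the big picture",
    },
    "auditory": {
        "eyes": "left or right",
        "breathing": "even, from the diaphragm",
        "muscles": "even or rhythmic tension",
        "voice": "clear midrange, sometimes tapping or whistling",
        "sample_phrase": "Let's tone down the discussion",
    },
    "kinesthetic": {
        "eyes": "down to the left or right",
        "breathing": "belly breathing, sighing",
        "muscles": "relaxed",
        "voice": "slow, with long pauses",
        "sample_phrase": "I can grasp a hold of it",
    },
}

def guess_system(observed_cues: dict) -> str:
    """Return the system whose listed cues best match the observed cues.
    Purely illustrative: a naive keyword/substring overlap score."""
    scores = Counter()
    for system, cues in REPRESENTATIONAL_SYSTEMS.items():
        for channel, description in cues.items():
            observed = observed_cues.get(channel, "").lower().split()
            scores[system] += sum(word in description.lower() for word in observed)
    return scores.most_common(1)[0][0]

if __name__ == "__main__":
    # Prints "kinesthetic" for downward eye movement plus belly breathing.
    print(guess_system({"eyes": "down-left", "breathing": "belly"}))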
If human beings store information in a 'mind map' and NLP can access
an individual's model, then why couldn't a screen categorisation system
mirror that individual's 'internal representations'? The user experience
could be calibrated to make it easier and quicker to find what you
need: a Four Corner Interface.
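The text proposes the Four Corner Interface only as a concept, so the following Python sketch is purely hypothetical: the Corner enum, the Item record, the USER_CALIBRATION mapping of representational channels to screen corners, and the place function are all illustrative names and design choices, meant only to show how a per-user calibration could group on-screen items by corner.

from dataclasses import dataclass
from enum import Enum

class Corner(Enum):
    TOP_LEFT = "top-left"          # e.g. visual imagery
    TOP_RIGHT = "top-right"        # e.g. constructed imagery
    BOTTOM_LEFT = "bottom-left"    # e.g. auditory / verbal
    BOTTOM_RIGHT = "bottom-right"  # e.g. kinesthetic / actions

@dataclass
class Item:
    title: str
    channel: str  # "visual", "auditory" or "kinesthetic"

# One possible per-user calibration: which corner each channel maps to.
USER_CALIBRATION = {
    "visual": Corner.TOP_LEFT,
    "auditory": Corner.BOTTOM_LEFT,
    "kinesthetic": Corner.BOTTOM_RIGHT,
}

def place(items: list[Item]) -> dict[Corner, list[str]]:
    """Group items into screen corners according to the user's calibration."""
    layout: dict[Corner, list[str]] = {corner: [] for corner in Corner}
    for item in items:
        corner = USER_CALIBRATION.get(item.channel, Corner.TOP_RIGHT)
        layout[corner].append(item.title)
    return layout

if __name__ == "__main__":
    demo = [Item("holiday photos", "visual"), Item("voice memos", "auditory")]
    for corner, titles in place(demo).items():
        print(corner.value, titles)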