Some science behind the scenes
This last important sense is the final piece in the jigsaw that helps us to build up a picture of our view – limited though it is – of the world. As I have shown, the reality we perceive is not the Reality that exists. Our senses of sight, smell, taste and hearing – both the sense organs and the software that does the processing – eliminate and manipulate a considerable amount of the data, so that we receive a picture that is consistent with our design and purpose [if such there be].
But the two senses we seem to trust the most in judging whether a thing is substantial – whether it is matter – are our sight and our touch.
I have demonstrated that sight produces a non-reality which is relatively consistent between humans in that we are able to build up a common frame of reference even if we cannot agree on such things as beauty and colour. Our sight gives us the ability to work together as a species. If two people see a train arriving at a platform, they can both agree that there is a train and a platform and co-operate to perhaps get on the train. We have enough information from our sight to recognise danger, recognise food and other means of survival and work co-operatively together.
But our sight does not give us the ability to ‘see’ reality.
But, you may say, if we can touch and feel matter it must be substantial, it must be ‘there’ – surely this is reality?
The nervous system works mechanically: each nerve ending or nerve cell is specifically designed to respond only to certain stimuli, so exactly the same filtering takes place with touch as with all the other senses.
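This filtering can be pictured in a few lines of code. The sketch below is my own illustration, not anything from the physiology literature: the receptor names are real enough, but the stimulus labels, thresholds and numbers are made up purely to show the principle that a receptor fires only for its own kind of stimulus, and everything else never reaches the brain at all.

```python
# Illustrative sketch: each receptor type responds only to its own kind
# of stimulus, so all other information is filtered out before the
# brain ever sees it. Names are real receptor types; values are invented.
RECEPTORS = {
    "mechanoreceptor": "pressure",
    "thermoreceptor": "temperature",
    "nociceptor": "pain",
}

def nerve_response(stimulus_kind, intensity, threshold=0.3):
    """Return the receptors whose signal actually reaches the brain.

    A receptor fires only if the stimulus matches its type AND exceeds
    its firing threshold; everything else is simply lost.
    """
    return [
        name
        for name, responds_to in RECEPTORS.items()
        if responds_to == stimulus_kind and intensity >= threshold
    ]

print(nerve_response("pressure", 0.8))    # only the mechanoreceptor fires
print(nerve_response("pressure", 0.1))    # too weak: nothing gets through
print(nerve_response("ultrasound", 0.9))  # no receptor for it: invisible to us
```

Note the third case: a stimulus for which we have no receptor simply does not exist for us – the point made above about what we are, and are not, allowed to perceive.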
If we are not allowed to see an angel, maybe we are also prohibited from touching an angel!!
Furthermore, exactly the same level of complex ‘software’ is needed to process the signal further so that it is recognisable – how we know we are being touched by a feather, a hand or a wet rag even with our eyes closed. Electrical signals on their own don’t provide much guidance, and there is clearly a lot of processing required to distinguish one signal from another. The problem is not dissimilar to the one we saw for sight, where there need to be patterns of input for the ‘software’ to be able to judge the nature of the stimulus accurately.
Much of the final processing of touch sensations is learnt. Furthermore, what we learn is then used to enable us to perform further activities. A metal worker, for example, who has learnt something of the ‘feel’ of metal, its properties and how to handle it, would be lost if asked to work with embroidery, because what he has learnt relates to the touch of metal and is inconsistent with the delicate working needed for cloth. So there is considerable functional interplay invoked when we touch anything, enabling us to judge the correct response to the material being touched and used.
There are also, it would seem, a number of functional inter-dependencies in place, between the process that enables us to recognise a touch, and our emotions.
Being touched by another human in a gentle way can evoke any number of positive emotions – pleasure, love, peace, happiness – general feelings of well-being.
And being touched can also evoke strong sexual responses, none of which we ‘control’. Thus the function of touch cannot be described as merely ‘mechanical’ – as ‘hardware’ – there is an extremely complex interplay going on between the functions of the soul.
This is my view, but what may be of interest is that science backs me up. A relatively new research area called ‘haptic technology’ is starting to provide insights into our touch capabilities. This technology also points to the fact that our sense of touch is ‘software’ controlled.
Haptic technology can provide carefully tuned mechanical stimulation – pressure, vibration and so on – much like that which a person might experience. This information is fed into a computer, where it is processed by ‘virtual objects’ (objects existing only in a computer simulation) that can respond to these inputs. Let me put this another way: computer scientists have created software which simulates the sense of touch.
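To make the idea of a responsive ‘virtual object’ concrete, here is a minimal sketch of one common simplification used in haptic simulation: treating a virtual surface as a stiff spring that pushes back in proportion to how far you press into it. The function name, units and numbers are my own illustration, not taken from any particular haptic system.

```python
# A toy 'virtual object': a flat surface that exists only in software,
# yet responds to touch. The spring model (force proportional to
# penetration depth) is a standard simplification in haptic rendering.

def reaction_force(probe_position, surface_height=0.0, stiffness=200.0):
    """Force (newtons) the virtual surface pushes back with.

    Above the surface there is no contact, hence no force; below it,
    the surface behaves like a stiff spring (Hooke's law, F = k * x).
    """
    penetration = surface_height - probe_position
    if penetration <= 0:
        return 0.0                      # not touching: the object is silent
    return stiffness * penetration      # the deeper you press, the harder it pushes

print(reaction_force(0.01))    # hovering just above the surface: 0.0 N
print(reaction_force(-0.005))  # pressed 5 mm in: the spring pushes back (~1 N)
```

Fed back through a haptic device, that computed force is what makes a purely simulated object ‘feel’ solid – a striking echo of the argument here that solidity itself is a product of processing.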
We use so-called haptic technology in quite a number of ways to interact with computer software. Practically all of it is what is called simulation software. Some of this software simulates the responses of physical objects that we have built – airplanes [flight simulators], radioactive handling machines, underwater exploration devices, medical equipment and so on. We give these virtual simulated devices sensory input, and they respond as the real physical object might.
But there is also software that simulates the reactions of a person, and this is to be found in robotics.
In this case the software simulates the effects we might respond with if we had been touched in that way – so this is, in effect, software that responds to the same inputs we might receive as a human being.
The Shadow Dextrous Robot Hand, for example, uses the sense of touch, pressure, and position to reproduce the human grip in all its strength, delicacy, and complexity.
The SDRH was first developed by Richard Greenhill and his team of engineers in Islington, London, as part of The Shadow Project.
This was a research and development programme whose goal was to complete the first convincing humanoid. The Dextrous Hand has haptic sensors embedded in every joint and every finger pad, which relay information to a central computer for processing and analysis.
For those computer people who may read this, the software was based initially on neural nets and subsequently on fuzzy logic programming.
What does this mean?
First, our sense of touch is partly mechanical – it relies on electrical impulses travelling from various parts of the body to the brain – so this is the hardware.
Secondly, the information gathered by the brain is actually not sufficient to enable us to operate without some form of software to process it – to deduce and infer what it is that we are touching or being touched by, to interpret feelings. It is only the software that tells us whether we are experiencing pain, or have just been touched by someone’s fingers, whether we feel full or have sore feet, or are being tickled. It is software that tells us we are upright, upside down, swirling round, have been hit by a car or have fallen off our bike. And software can be manipulated to present whatever we are meant to know, as opposed to what is actually happening.
Our ‘touch software’ presents us with a view of reality consistent with that produced by all the other software of sight and sound, taste and smell.
It is consistent, we can operate reasonably effectively with it. But it isn’t reality.