About

Our aim is to reduce uncertainty by applying neuroscientific principles that enable our clients to make better-informed decisions at every stage of a product’s trajectory – from Discover to Design to Deploy.

Although we all share similar basic likes and dislikes – sugar and pain – we also have individual likes and dislikes – olives and art – and predicting the latter is NeuroSci’s business.

With the accumulated knowledge gained from 15 years in academic brain research and 15 years in industrial R&D, NeuroSci Ltd has the ideal background for serving industries that rely on predicting and understanding what people want, what they need, and what they like or dislike. By translating recent advances in brain and behaviour research, these insights can now be applied at every point – from the genesis and innovation stages of a product’s development through to its purchase and use.

NeuroSci understands the complexities involved in predicting human behaviours and preferences. By using methods predicated on the more basic drivers of human needs and wants, purchase decisions, and product experiences – rather than the reasons people consciously ascribe to them – we are able to reduce uncertainty at every stage of a product’s genesis. This model sees much of human behaviour as automatic and largely sub-conscious – and therefore inaccessible to conscious interrogation. Vitally, it is also behaviour driven by reward mechanisms in the brain – emotions.

The processes underpinning sensation, perception, memory, emotion and action reflect a balance between the (multi-)sensory information we gather via our senses and the internal representations of the world ‘out there’ that are wired into the brain’s circuitry and built up over a lifetime’s experience. A successful product strikes this balance perfectly, but success is more often due to luck or trial and error (‘suck it and see’) than to true evidence-based insight.

To find out about Francis McGlone, the man behind NeuroSci, please click here.