Wednesday, 16 April 2014

Have you exercised your memory lately?

We’ve often heard someone’s memory described as 'weak' or 'strong'. But with the majority of psychological memory models drawing on information processing analogies with terms like 'storage', 'retrieval', and 'input', where did the idea of memory’s strength come from?

In a recent article published in the Journal of the History of the Behavioral Sciences, Alan Collins of Lancaster University reviewed British and American texts dating between 1860 and 1910 that focused on improving human memory. By extending his analysis to include those texts aimed at popular audiences as well as those intended more specifically for academics, Collins noticed a trend during this period in which the importance of enhancing natural memory was emphasised over the creation of artificial memory systems.

The idea of 'artificial' memory was used to describe systems created with the intention of supporting or improving one’s memory capabilities – be they mnemonics or some other form of memory aid. The criticism of such systems at the time was that they required too much mental effort and had only limited practical value. 'Natural' memory, conversely, was generally used to describe our innate memory systems.

Collins explains that the increasing tendency towards discussions of natural memory in the latter decades of the 19th century paralleled a wider emphasis on understanding everything as being a part of nature and therefore subject to natural laws. Guidebooks of the period connected all aspects of one’s life to their general health: a healthy diet, good (moral) habits, pure air, and both a strong mind and a strong body were key to a good character. Natural memory became wrapped up in these recommendations, often described as similar to our bodily functions, especially our muscles. The argument was that just as our muscles require exercise, training, and discipline, so too does our memory.


But just how does one 'exercise' their memory? In short: repeated practice. The memory improvement texts examined by Collins advised readers to block out a period of time each day to actively exercise their memories. This time could be spent learning lists, reciting poetry, or recounting the events of the previous day. Focused attention on the chosen task was considered an especially critical component.

As Collins highlights in his conclusion, today we no longer draw on the muscle metaphor explicitly in discussions of memory, but the concept of 'strength' has remained. However, if we think about the advent of computer games and apps intended to strengthen (or, dare I say 'exercise'?) the mind, perhaps the idea of working out one’s memory is not quite as foreign as it may seem. Besides, learning a verse of poetry sounds a great deal more appealing than hitting the treadmill.

_________________________________ ResearchBlogging.org

Collins, A. F. (2014). Advice for improving memory: Exercising, strengthening, and cultivating natural memory, 1860-1910. Journal of the History of the Behavioral Sciences, 50(1), 37-57. PMID: 24272820

Post written for the BPS Research Digest by guest host Jennifer Bazar, who is a Postdoctoral Fellow at the University of Toronto/Waypoint Centre for Mental Health Care and an Occasional Contributor to the Advances in the History of Psychology blog.

Monday, 14 April 2014

Does Psychology have its own vocabulary?

If you were to pick up the flagship journal from a discipline that is foreign to you and flip to an article at random, how much do you think you would understand? Put a different way: how much of the vocabulary employed in that article might you misinterpret?

The vocabulary used by any given discipline overlaps with those of many other disciplines, although the specific meaning attached to a given term may differ from discipline to discipline. Anglophone psychology, for instance, has previously been shown to share much of its vocabulary with other disciplines, especially biology, chemistry, computing, electricity, law, linguistics, mathematics, medicine, music, pathology, philosophy, and physics. But how much of psychology’s vocabulary can be said to be unique to itself?

In a recent article in History of Psychology, John G. Benjafield of the Department of Psychology at Brock University (Canada) compared the histories of the vocabularies of psychology and the 12 disciplines listed above. Constructing databases for each discipline from entries in the Oxford English Dictionary, Benjafield examined the rate of primary vs secondary words (i.e. how often a word was used for the first time by a discipline vs how often it was appropriated from the vocabulary of another discipline), along with the dates of first usage of these terms and the polysemy of the vocabularies (i.e. the number of different meanings held by a given word).
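To make those measures concrete, here is a toy sketch in Python of the three quantities involved – the proportion of secondary (appropriated) words, the average date of first use, and the average polysemy. The mini ‘database’, its field names, and the sense counts are hypothetical assumptions for illustration; Benjafield’s actual databases were built from full Oxford English Dictionary entries.

```python
# Toy illustration of vocabulary measures; the entries below are made up.
toy_psychology_vocab = [
    # word, year of first psychological use, coined by psychology?, number of senses
    {"word": "repression",   "first_use": 1893, "primary": False, "senses": 4},
    {"word": "behaviourism", "first_use": 1913, "primary": True,  "senses": 1},
    {"word": "attention",    "first_use": 1698, "primary": False, "senses": 6},
    {"word": "introvert",    "first_use": 1918, "primary": False, "senses": 3},
]

n = len(toy_psychology_vocab)
secondary_rate = sum(not e["primary"] for e in toy_psychology_vocab) / n   # appropriated words
mean_first_use = sum(e["first_use"] for e in toy_psychology_vocab) / n     # how old the vocabulary is
mean_polysemy = sum(e["senses"] for e in toy_psychology_vocab) / n         # meanings per word

print(f"secondary words: {secondary_rate:.0%}")
print(f"mean year of first use: {mean_first_use:.0f}")
print(f"mean number of senses (polysemy): {mean_polysemy:.1f}")
```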


So does psychology have its own vocabulary? The answer seems to be: somewhat. The majority of the vocabulary of all 13 disciplines was made up of secondary words; that is, the bulk of each vocabulary consists of words that were first used in the English language by another discipline (often with another meaning). But psychology’s vocabulary was nonetheless found to have some characteristics you may not have expected.

First, Benjafield found that computing and linguistics had the highest percentage of secondary words in their vocabularies (97 per cent and 94 per cent respectively), while psychology and chemistry had the lowest rates of the disciplines examined (65 per cent and 62 per cent). In light of these results, psychology’s vocabulary may be described as less metaphorical in nature than previously assumed (especially when compared with computing and linguistics).

Moreover, whereas the other subjects in this study showed a collective tendency over time to increasingly assign new meanings to existing words, psychology has been following the opposite pattern – that is to say, over time psychology has tended more and more to invent new words for its purposes, to a greater extent than the other disciplines.

Finally – and perhaps the most surprising conclusion to come out of Benjafield’s study – the history of psychology’s vocabulary turns out to be most similar to that of chemistry. Personally, this one caught me by surprise: I would have expected closer connections to philosophy and physics, based on the way the discipline of psychology developed over time. But Benjafield’s vocabulary analysis paints a different picture, one in which psychology has been strongly influenced by the naming practices of chemistry.

_________________________________ ResearchBlogging.org
Benjafield, J. G. (2014). Patterns of similarity and difference between the vocabularies of psychology and other subjects. History of Psychology, 17(1), 19-35. PMID: 24548069

Post written for the BPS Research Digest by guest host Jennifer Bazar, who is a Postdoctoral Fellow at the University of Toronto/Waypoint Centre for Mental Health Care and an Occasional Contributor to the Advances in the History of Psychology blog.

Friday, 11 April 2014

Facial expressions as social camouflage

Can making faces mask your personality?

According to a group of University of Glasgow psychologists, Daniel Gill and colleagues, it can. Writing in the journal Psychological Science, these researchers say that human facial expressions can signal how dominant, trustworthy, or attractive we are – and that these ‘dynamic’ signals can mask or override the impression given off by the ‘static’ structure of the face.

In other words, someone might have a face that ‘seems untrustworthy’, but if they make the right face, they’ll still look like someone you’d trust with your housekeys.

To reach this conclusion, the researchers made use of software that allows them to generate realistic animated face images. These ‘faces’ are programmed with 42 different sets of muscles – called ‘action units’ – each of which can be switched on or off independently of the others, creating billions of possible animated facial expressions, only a tiny proportion of which are likely to be seen in real life. This software has been used before in studies of emotional expressions.

Gill and colleagues generated thousands of random expressions and got volunteers to rate each one for how dominant, trustworthy, and attractive it appeared. From all of these ratings they were able to determine the essence – or prototype – of, for example, a highly trustworthy look, which, it turns out, involves activating the ‘Dimpler’, ‘Lip corner and cheek raiser’, and ‘Sharp lip puller’.
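For readers curious how a prototype can be extracted from thousands of random expressions and their ratings, here is a minimal sketch of one common approach, a reverse-correlation-style analysis. The binary action-unit coding, the simulated ratings, and the simple weighting scheme are assumptions for illustration only, not the authors’ exact pipeline.

```python
# Minimal reverse-correlation-style sketch: weight each action unit (AU) by how
# strongly its presence co-varies with high trustworthiness ratings.
import numpy as np

rng = np.random.default_rng(1)
n_trials, n_aus = 5000, 42

# Each trial: a random animation coded as which of the 42 AUs are switched on.
au_patterns = rng.integers(0, 2, size=(n_trials, n_aus))

# Hypothetical observer ratings of trustworthiness (1-9); simulated here.
ratings = rng.integers(1, 10, size=n_trials)

# AUs with the largest positive weights form the 'trustworthy' prototype.
centred_ratings = ratings - ratings.mean()
prototype_weights = centred_ratings @ au_patterns / n_trials

top_aus = np.argsort(prototype_weights)[-3:][::-1]
print("AUs most associated with trustworthy ratings:", top_aus)
```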

Can a facial expression tell you whether somebody is a good egg or not?

Armed with these dynamic prototypes of dominance, trustworthiness and attractiveness, Gill et al then tested whether they could counteract the effects of static impressions of the same traits. They used the same software to generate thousands of static faces, got volunteers to rate them, and worked out what made someone just look trustworthy, for example.

Then, they overlaid the dynamic expressions on top of the static ones. This revealed that, in general, the dynamic expressions were more powerful than the static traits. Mathematically speaking, the effect of static structure was linear while the dynamic effect was nonlinear and larger in magnitude.

They dub this social camouflaging: ‘Even the most submissive face [was] transformed into a dominant face by social camouflaging and reaches the same level of dominance as the most dominant static facial morphology.’

The same effect worked for dominance and attractiveness as well as trustworthiness, although it wasn’t quite as effective in the case of attractiveness, suggesting that ‘facial attractiveness is more difficult to mask than are facial dominance and trustworthiness’.

This, they say, is no big surprise: ‘Casting directors are probably aware that not all social traits are equal. An attractive character will require an actor with attractive morphology; however, social camouflage can help an actor fake a dominant or trustworthy character.’

However, all of this research was based on computer-generated faces. This gave Gill and colleagues the ability to examine a wider range of expressions than would have been possible using actual models, but it does mean that the relationships between dynamic and static signals will need to be confirmed with real faces.

_________________________________ ResearchBlogging.org

Gill, D., Garrod, O., Jack, R., & Schyns, P. (2014). Facial movements strategically camouflage involuntary social signals of face morphology. Psychological Science. DOI: 10.1177/0956797614522274

Post written for the BPS Research Digest by guest host Neuroskeptic, a British neuroscientist who blogs for Discover Magazine.

Wednesday, 9 April 2014

You don't have to be well-educated to be an ‘aversive racist’, but it helps

Are you a racist?

Most likely, your answer is no – and perhaps you find the very notion offensive. But according to two Cardiff University psychologists, Kuppens and Spears, many educated people harbour prejudiced attitudes even though they deny it. Their research was published recently in Social Science Research.

Kuppens and Spears analysed data from a large survey of the general US population, the American National Election Studies (ANES) 2008-2009. They focused on over 2,600 individuals of white ethnicity, and investigated the relationship between their level of education and their attitudes towards African-Americans.

In common with many previous studies, Kuppens and Spears found that more educated people were less likely to endorse anti-black views on questionnaires – for example, they were less likely to answer ‘yes’ to questions like: “Why do you think it is that in America today blacks tend to have worse jobs and lower income than whites do? Is it… because whites have more in-born ability to learn?”

However, while the educated participants reported less explicit prejudice, they did not show a corresponding tendency towards less implicit prejudice, as measured using the Implicit Association Test (IAT).

This method originated in cognitive psychology experiments and it has become widely used as a tool for probing people’s ‘unconscious’ attitudes.

Beyond education, Kuppens and Spears also explored IAT performance and explicit racial attitudes across other demographics. They found that older white Americans reported less explicit prejudice than younger ones, yet displayed more implicit bias. Women likewise reported less explicit prejudice, but were no different to men on the implicit measures.

Psychologists have long known that our ability to accurately perceive and self-report on our own behaviour is imperfect. If these results are anything to go by, being highly educated might not mean that we’re fully informed about our own implicit prejudices. Kuppens and Spears suggest that educated people were more likely to be ‘aversive racists’ – people who reject racism and consider themselves free of prejudice, yet still harbour implicit bias.


The researchers do note, however, that implicit measures like the IAT are open to several interpretations. In particular, they say, just because someone automatically associates a racial group with negative concepts doesn’t mean that they agree with that association. By itself, it only shows that they are familiar with it: ‘Because the nature of these measures prevents the influence of deliberative considerations on the measurement outcome, it is not clear to what extent they reflect attitudes that are endorsed by individuals, or result from information that individuals have been exposed to, but do not necessarily endorse.’

These concerns stem from the nature of the IAT procedure. In this test, the volunteers had to quickly press either a left button or a right button to categorise a target. In some cases the target was a word, and the object was to categorise its meaning as either good (e.g. ‘love’, ‘friend’) or bad (‘hate’, ‘enemy’). In other cases, the target was a picture of either a black or a white person’s face, and the task was to categorise their race.

The principle behind the IAT is that if someone mentally associates two concepts – say ‘black’ and ‘bad’ – they will find the task easier when they’re asked to use the same button to indicate both of those concepts. Someone for whom these concepts are linked will tend to respond faster when the button assignments match their associations than when they’re asked to use the opposite arrangement (e.g. the same key for ‘black’ and ‘good’).
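As a rough illustration of how that speed difference can be turned into a single bias score, here is a minimal sketch of a D-style measure (the mean latency difference between block types, divided by the pooled standard deviation). The toy reaction times and the simplified scoring are assumptions for illustration; this is not the exact algorithm applied to the ANES data.

```python
# Minimal, illustrative IAT-style scoring sketch; data and scoring are simplified.
from statistics import mean, stdev

def iat_d_score(compatible_rts, incompatible_rts):
    """D-like score: mean latency in the 'incompatible' block (e.g. black + good
    share a key) minus mean latency in the 'compatible' block (e.g. black + bad
    share a key), divided by the pooled standard deviation of all trials."""
    pooled_sd = stdev(compatible_rts + incompatible_rts)
    return (mean(incompatible_rts) - mean(compatible_rts)) / pooled_sd

# Hypothetical reaction times in milliseconds:
compatible = [620, 580, 640, 610, 590, 630]
incompatible = [720, 760, 700, 740, 690, 750]

print(round(iat_d_score(compatible, incompatible), 2))  # positive = faster when concepts 'match'
```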

See also: a recently published study reported on an international contest to develop the best way to eliminate implicit racial bias on the IAT (paper, covered by the Neurocritic blog).

_________________________________ ResearchBlogging.org

Kuppens, T., & Spears, R. (2014). You don't have to be well-educated to be an aversive racist, but it helps. Social Science Research, 45, 211-223. PMID: 24576637

Post written for the BPS Research Digest by Neuroskeptic, a British neuroscientist who blogs for Discover Magazine.

Monday, 7 April 2014

Around the world, things look better in hindsight

Human memory has a pervasive emotional bias – and it’s probably a good thing. That’s according to psychologists Timothy Ritchie and colleagues.

In a new study published in the journal Memory, the researchers say that people from diverse cultures experience the ‘fading affect bias’ (FAB), the tendency for negative emotions to fade away more quickly than positive ones in our memories.

The FAB has been studied previously, but most previous research looked at the memories of American college students. Therefore, it wasn’t clear whether the FAB was a universal phenomenon or just a peculiarity of that group.

In the new study, the authors pooled together 10 samples from different groups of people around the world, ranging from Ghanaian students to older German citizens (who were asked to recollect the fall of the Berlin Wall). In total, 562 people were included.

The participants were asked to recall a number of events in their lives, both positive and negative. For each incident, they rated the emotions that they felt at the time it happened, and then the emotions that they felt in the present when remembering that event.

Ritchie and colleagues found that every cultural group included in the study experienced the FAB. In all of these samples, negative emotions associated with remembered events faded to a greater degree than positive emotions did. Importantly, there was no evidence that this effect changed with people’s age: it seems to be a lifelong phenomenon.


The authors conclude that our ability to look back on events with rose-tinted spectacles might be important for our mental health, as it could help us to adapt and move on from adversity: ‘We believe that this phenomenon is part of a set of cognitive processes that foster emotion regulation and enable psychological resilience.’

However, the authors admit that their study had some limitations. While the participants were diverse geographically and culturally, they all had to speak fluent English, because all of the testing was carried out in that language. In order to confirm that the FAB is truly universal, it will be important to examine it in other languages. Ritchie and colleagues also note that despite this apparent universality of the phenomenon, ‘We do not intend to imply that the FAB occurs for the same reasons around the world.’

_________________________________ ResearchBlogging.org
Ritchie, T. D., Batteson, T. J., Bohn, A., Crawford, M. T., Ferguson, G. V., Schrauf, R. W., Vogl, R. J., & Walker, W. R. (2014). A pancultural perspective on the fading affect bias in autobiographical memory. Memory. PMID: 24524255

Post written for the BPS Research Digest by guest host Neuroskeptic, a British neuroscientist who blogs for Discover Magazine.

Friday, 4 April 2014

Do television and video games impact on the wellbeing of younger children?

We’re often bombarded with panicky stories in the news about the dangers of letting children watch too much television or play too many video games. The scientific reality is that we still know very little about how the use of electronic media affects childhood behaviour and development. A new study from a team of international researchers led by Trina Hinkley at Deakin University might help to provide us with new insights.

The study used data from 3,600 children across Europe, collected as part of a larger study looking into the causes and potential prevention of childhood obesity. Parents filled out questionnaires about their children’s electronic media habits, along with various wellbeing measures – for example, whether the children had any emotional problems, issues with peers or problems with self-esteem – and details about how well the family functioned. Hinkley and colleagues looked at the associations between television and computer/video game use at around the age of four, and these measures of wellbeing some two years later.

The results are nuanced. The researchers set up a model that controlled for various factors that might have an effect – things like the family’s socioeconomic status, parental income, unemployment levels and baseline measures of the wellbeing indicators. On the whole, after accounting for all of these factors, there were very few associations between electronic media use and wellbeing indicators. For girls, every additional hour they spent playing electronic games (either on consoles or on a computer) on weekdays was associated with a two-fold increase in the likelihood of being at risk for emotional problems – for example being unhappy or depressed, or worrying often. For both boys and girls, every extra hour of television watched on weekdays was associated with a small (1.2- to 1.3-fold) increase in the risk of having family problems – for example, not getting on well with parents, or being unhappy at home. A similar association was found for girls between weekend television viewing and being at risk of family problems. However, no associations were found between watching television or playing games and problems with peers, self-esteem or social functioning.
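As a rough sketch of what a ‘two-fold increase in the likelihood’ of being at risk corresponds to statistically, here is an illustrative logistic-regression example. The simulated data, variable names, and choice of covariates are hypothetical assumptions, not Hinkley and colleagues’ actual model or dataset.

```python
# Illustrative only: estimating an odds ratio per extra hour of weekday gaming,
# controlling for covariates, on simulated data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "weekday_game_hours": rng.uniform(0, 3, n),   # hypothetical exposure
    "ses": rng.normal(0, 1, n),                   # socioeconomic status (covariate)
    "baseline_wellbeing": rng.normal(0, 1, n),    # covariate
})
# Simulate an outcome in which each extra hour roughly doubles the odds (log OR = ln 2).
logit_p = -2 + np.log(2) * df["weekday_game_hours"] - 0.3 * df["baseline_wellbeing"]
df["at_risk_emotional"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

model = smf.logit(
    "at_risk_emotional ~ weekday_game_hours + ses + baseline_wellbeing", data=df
).fit(disp=False)
print(np.exp(model.params["weekday_game_hours"]))  # odds ratio per extra hour, ~2
```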


So it seems as if these types of media can impact on childhood development by negatively affecting mental wellbeing. However, what we can’t tell from these data is whether watching television or playing games causes these sorts of problems. It may well be the case that families who watch lots of television are not providing as much support for young children’s wellbeing from an early stage – so the association with television or game use is more to do with poor family functioning than the media themselves. Furthermore, the results don’t tell us anything about what types of television or genres of games might have the strongest effects – presumably the content of such media is important, in that watching an hour of Postman Pat will have very different effects on a four-year-old’s wellbeing than watching an episode of Breaking Bad. And relying on subjective reports from parents alone might introduce some unknown biases in the data – “an objective measure of electronic media use or inclusion of teacher or child report of wellbeing may lead to different findings”, the authors note. So the results should be treated with a certain amount of caution, as they don’t tell us the whole story. Nevertheless, it’s a useful addition to a growing body of studies trying to provide a balanced, data-driven understanding of how modern technologies might affect childhood development.

- Post written by guest host Dr Pete Etchells, Lecturer in Psychology at Bath Spa University and Science Blog Co-ordinator for The Guardian. 

_________________________________ ResearchBlogging.org
Hinkley, T., Verbestel, V., Ahrens, W., Lissner, L., Molnár, D., Moreno, L., Pigeot, I., Pohlabeln, H., Reisch, L., Russo, P., Veidebaum, T., Tornaritis, M., Williams, G., De Henauw, S., & De Bourdeaudhuij, I. (2014). Early childhood electronic media use as a predictor of poorer well-being. JAMA Pediatrics. DOI: 10.1001/jamapediatrics.2014.94

Thursday, 3 April 2014

How does stress affect your public speaking skills?

Having to give a talk or a speech in front of a large group of people is one of the scarier things we might find ourselves doing at some point in our lives. In those situations, ideally we want to give a flawless, well-rehearsed delivery, and getting too stressed is often linked to becoming – literally – lost for words. But is there any actual evidence for this link?

Tony Buchanan and colleagues recently investigated which aspects of speech and language are affected in stressful versus non-stressful situations. They asked 91 people to participate in a social stress test, in which they had five minutes to prepare a speech and then immediately deliver it. In the stressful condition, participants had to imagine that they had been accused of shoplifting, and had to prepare a defence that they would deliver to the ‘store manager’ (an experimenter). Immediately after the speech, they were given a difficult mental arithmetic task. In the non-stressful condition, people spent five minutes preparing a summary of a travel article, which they then had to read aloud to a video camera. Immediately afterwards, they completed a much simpler arithmetic task.

Buchanan’s team measured levels of the stress hormone cortisol in samples of saliva taken before the test, plus 10 and 30 minutes afterwards. They also measured heart rate, the speed at which people spoke during their speech, as well as the number of pauses and the number of ‘nonfluencies’ – words like um, er, and hmm.


The stressful speech condition did seem to raise stress levels – heart rate and cortisol both increased in this condition compared with the non-stressful situation. However, some of the speech variables the researchers looked at weren’t affected in the way you might expect. The speed at which people talked during their speech didn’t differ between conditions. Strangely, the number of nonfluencies was higher in the non-stressful speeches than in the stressful ones. The only thing that stress seemed to have a detrimental effect on was pause time – as they progressed through their speech, people in the stressful condition paused increasingly often compared with those in the non-stressful condition.

Buchanan and his colleagues acknowledge the limitations of their study – as it’s correlational in nature, we can’t say for sure whether increases in cortisol levels cause a greater number of pauses in speech production, or whether noticing that you’re pausing more often in the task causes your cortisol levels to increase.

That being said, it seems that pause time is important, because it is thought to be an indication of lexical retrieval processes – if more thought is required for a certain part of a speech, or harder words need to be used, you’re more likely to stop for a moment before saying them. In stressful situations, these retrieval processes take longer, and so you’re more prone to pausing. So this study seems like an interesting step forward in understanding specifically how stress affects different aspects of speech production – you might even say it gives us pause for thought.

Post written by guest host Dr Pete Etchells, Lecturer in Psychology at Bath Spa University and Science Blog Co-ordinator for The Guardian.


_________________________________ ResearchBlogging.org
Buchanan, T., Laures-Gore, J., & Duff, M. (2014). Acute stress reduces speech fluency. Biological Psychology, 97, 60-66. DOI: 10.1016/j.biopsycho.2014.02.005

Wednesday, 2 April 2014

Inflated praise for your children: an 'incredibly' bad idea?


When you’ve done something good, or performed a task well, it feels great to get some praise for it. And parents and teachers, especially in Western cultures, are encouraged to dole out praise to children in an increasingly generous manner. A drawing might not just be 'good', it might be 'incredible'. That song wasn’t just 'beautiful', it was 'epic'. Such praise is often given with the best intentions, particularly in the belief that positive feedback, especially for children who don’t have much faith in themselves, might help to raise their self-esteem. But does it work?

Recent research by Eddie Brummelman and colleagues has tried to shed light on this question. In three studies, they looked at how adults dish out praise to children in both experimental and naturalistic settings, and how children with varying levels of self-esteem take it. Their results suggest that overly positive praise might not have the intended effect for children who have low self-esteem.

In the first experiment, Brummelman’s team asked a group of adults to read short descriptions of hypothetical children, described as having either high or low self-esteem. People were told about something that the child had done – say, solving a maths problem, or performing a song. After reading through the description, they were asked to write down any praise they might give the child. Brummelman’s team found that about a quarter of the praise was overly positive (e.g. "that sounded magnificent!"), and that people were more likely to give inflated praise to the children who had low self-esteem.

The researchers then tried to replicate these findings in a more naturalistic setting, by observing how parents interacted with their children when giving them a series of maths exercises at home. Brummelman and colleagues found a similar result to their laboratory experiment – about a quarter of the time, praise was overly inflated, and children who had lower self-esteem were given more inflated praise than those who had higher self-esteem.

'You made an incredibly beautiful painting!'

In order to figure out whether this actually mattered or not, in the final experiment Brummelman’s team looked at how being given praise impacted on one particular aspect of children’s behaviour – challenge seeking. Two hundred and forty children first completed a questionnaire to assess their level of self-esteem, and then were asked to draw a copy of van Gogh’s Wild Roses. The children were told that a professional painter would then assess their drawing, and tell them what he thought of it. In reality, the painter didn’t exist, and children were simply given inflated praise, non-inflated praise, or no praise at all. Afterwards, the children were shown four complex and four easy pictures, and asked to have a go at reproducing some of them. Critically, they were told that if they picked the difficult picture, they might make a lot of mistakes, but they might also learn lots. In other words, the number of difficult pictures the children chose to draw was taken as a measure of challenge seeking.

Brummelman’s team found that if children with lower self-esteem were given overly inflated praise, they were less inclined to seek a challenge in the second task – they would go for easy drawings over the harder ones, and therefore miss out on the chance for a new learning experience. On the other hand, children with high self-esteem were more likely to seek a challenge after being given inflated praise. Interestingly, the only difference between the inflated and non-inflated praise was a single word – ‘incredibly’ (“you made an incredibly beautiful drawing!” versus “you made a beautiful drawing!”).

What the study doesn’t tell us is why children with low self-esteem might avoid challenges in these circumstances. The authors suggest that inflated praise might set the bar very high for children in the future, and so inadvertently activate a self-protection mechanism in those with low self-esteem – although they acknowledge that they didn’t actually measure this in the study.

At any rate, the finding builds on a number of experiments conducted in recent years showing that positive praise isn’t necessarily good for all children in all circumstances. For children with low self-esteem, although we might feel the need to shower them with adulation, this might end up having precisely the opposite effect. Even a word like ‘incredibly’ can end up having a huge unintended impact – so when you’re telling children they’ve done a great job, choose your words wisely.

- Post written by guest host Dr Pete Etchells, Lecturer in Psychology at Bath Spa University and Science Blog Co-ordinator for The Guardian.

_________________________________ ResearchBlogging.org
Brummelman, E., Thomaes, S., Orobio de Castro, B., Overbeek, G., & Bushman, B. (2014). “That’s not just beautiful – that’s incredibly beautiful!”: The adverse impact of inflated praise on children with low self-esteem. Psychological Science, 25(3), 728-735. DOI: 10.1177/0956797613514251

Tuesday, 1 April 2014

It's official: Psychologists DO know what you are thinking

It’s an accusation often fired at psychologists at parties: ‘I bet you can tell what I’m thinking’. Now psychologists, much to their own surprise, have found scientific evidence that this might actually be the case.

In a series of studies reported today in the Journal of Metacognition, researchers found that qualified psychologists significantly outperformed matched controls on experimental tasks measuring the ability to guess a target selected by others from a random stimulus array.

The original aim of the study was to assess whether there was any validity to parapsychology claims of ‘remote viewing’ abilities in the normal population. A participant selects one of five ‘target’ pictures – of former politician Lembit Opik, a duck, a map of Seattle, a weasel with a chainsaw, and some wool. Will a ‘viewer’ in another room – completely blind to the selection process – be able to tell which image the participant has in their mind?

We would expect the viewers to be right 20 per cent of the time, purely by chance. But the experimenters discovered something quite unexpected. Their colleagues – postgraduate researchers, lecturers and professors in psychology – appeared to be much more successful at the task than were people from other disciplines. ‘Initially, we were skeptical about the whole thing,’ lead researcher Professor Chris Turner told the Research Digest. ‘But on performing the statistical analysis, I spilled my latte all down my white lab coat. When we considered the results from the “trained psychologists” as one group, we found a hugely significant difference, with the psychologists outperforming the “controls” by more than two to one.’


Psychologists outperformed controls by more than two to one.

Professor Turner replicated the group’s own results before running a new experiment. Would the influence work the other way? Could psychologists actually be more successful at implanting a phrase into the mind of someone in another room? Using the script from a 1989 episode of British sitcom ‘Only Fools and Horses’ – the one where Del Boy falls through the bar – ‘transmitters’ had to attempt to ‘send’ a snippet (‘We’re on a winner here, Trig’, ‘play it nice and cool’, etc) to the ‘receiver’. Incredibly, the ‘trained psychologists’ group was significantly better at transmitting one of the quotes to an isolated individual.

Professor Chris French, Head of the Anomalistic Psychology Research Unit at Goldsmiths, University of London, expressed doubt about the validity of the claims. He told the Research Digest that we should always be wary of dramatic claims until they have been reliably replicated by independent researchers, adding that ‘no such effects have been found in studies carried out by members of the APRU’.

As for the researchers themselves, how do they feel about their newfound ‘abilities’? ‘I knew you were going to ask that!’, Professor Turner said. ‘Seriously though, it has had the effect of bringing us closer together as researchers. Before the study, we were spread across different universities, but as we speak I’m in the process of bringing the whole group together into a newly formed research unit, on the site of an old abandoned TV transmitter. There’s some resistance from our ethics committee, but we’re confident we can overcome this.’

Turner, C., Nilsson, R., King, P., Harkes, J., Shirtliff, P., Pearson, N., Wilson, D., Sheridan, J., Hirst, D., Williams, P. & Worthington, N. (2014). Brief report: Evidence of superior mindreading and control in professional psychologists? Journal of Metacognition, 4 (1), 91-92.

Update: Please note the date of this post… many thanks to Professor Chris French for being a good sport, as ever, and to the 1991 League Cup winning Sheffield Wednesday FC side for conducting the research.

_________________________________ ResearchBlogging.org
Post written by Dr Jon Sutton, Managing Editor of The Psychologist, for the BPS Research Digest.

Thursday, 27 March 2014

We were promised jetpacks! …


… and sofas you could hose down! It’s always entertaining to look back at yesteryear’s visions of the future with 20/20 hindsight. So as we await our ‘guest hosts’, who are going to usher in our own new era, we thought we would peer back into the archives of the Digest and The Psychologist to see how our consideration of technological advances has stood the test of time.

Bypassing articles with quaint titles such as 'The internet: A possible research tool?', our focus is virtual reality (VR), in the days after Facebook spent $2 billion on the technology.

Winding the clock back to 1999, we find this piece from The Psychologist claiming that although VR had been described as ‘a solution looking for a problem’, there were in fact several important research questions in psychology that it was poised to answer. And it does indeed seem that it is in research that virtual reality has proven most prolific, with everything from replications of Milgram’s famous study to Digested gems such as ‘athletes more skilled at crossing the road than non-athletes’. Immersive virtual environments were also used in a (failed) attempt to reduce racial bias.


But a decade later, it could be argued that the early promise of virtual reality had faded and the focus had shifted to online ‘virtual worlds’ rather than fully immersive experiences. We considered the issue in both The Psychologist and the Digest. ‘If you remain undaunted,’ Christian Jarrett wrote in 'Get a Second Life', ‘[psychologist Simon] Bignell says the first place to start is to download the free SL software from the internet. “Get yourself an avatar, customise it and then just take the plunge.”’ But did many psychologists heed that advice? Despite marking its 10th anniversary last June, Second Life has seen a considerable drop in its user base and workforce in recent years. Perhaps such environments will remain a very distant second best.

At the risk of falling into the ‘Alan Sugar iPod’ trap, I would like to make a prediction of my own: that 3D television will prove to be a passing fad. I’m not convinced the predictions in our 2001 article on immersive television have really been borne out, and there is research to suggest that 3D films are neither more enjoyable nor more psychologically arousing than their 2D equivalents. Life happens in 3D anyway, and perhaps most people manage to immerse themselves in the experience psychologically just fine without the help of expensive and cumbersome technology. Now those sofas would be much more useful…


_________________________________ ResearchBlogging.org
Post written by Dr Jon Sutton, Managing Editor of The Psychologist, for the BPS Research Digest. Our 'guest hosts' begin posting next week.