It is easy to think that many men have a habit of invalidating women’s experiences.
But something else also happens sometimes: someone takes another person’s experience, wants to own it and define it, and completely bulldozes over the other person’s experience.
A few days ago, I saw a post online about confusion over children who are merely young being “diagnosed” with ADHD.
Another one of these “FFS!” moments.
Those naughty naughty kids! How dare they and oh, how incredibly SMART of the researchers in question to see that young children were being put in boxes and being accused of having Attention Deficit Hyperactivity Disorder.</sarcasm>
In the same discussion, I saw an educational psychologist moan about the difficulty of properly diagnosing children in a way that really annoyed me, though mostly in hindsight (as was also true of the discussion’s main topic).
I have meanwhile identified why the comment annoyed me so much.
The comment paid zero attention to the children, or even to a single child. It was all about this psychologist’s need to be able to put children in the appropriate boxes.
If you do that, you steal a child’s experience and make it your own, to serve you. It focuses on what you need, not on what the child needs.
In doing so, you deny the child its own experiences, don’t you? You tell the child that his or her experiences don’t matter, that all that matters is that people get to put the child into a box.
To some degree, you are robbing the child of its own childhood.
The fact that children are being misdiagnosed as having “ADHD” because they are young almost seems to indicate that there is no such thing as ADHD and that people (psychologists?) are looking for problems that aren’t there.
Maybe they do that so that they can ignore more difficult problems that do exist. Why else would adults, and notably psychologists, do something as bonkers as this?
Or maybe it means that psychology is an utterly useless profession.
Maybe it means that standardized designer babies really are around the corner, with the option of creating children who never fidget and never run around and never dance or jump or scream.
Racial bias can seem like an intractable problem. Psychologists and other social scientists have had difficulty finding effective ways to counter it – even among people who say they support a fairer, more egalitarian society. One likely reason for the difficulty is that most efforts have been directed toward adults, whose biases and prejudices are often firmly entrenched.
My colleagues and I are starting to take a new look at the problem of racial bias by investigating its origins in early childhood. As we learn more about how biases take hold, will we eventually be able to intervene before any biases become permanent?
Measuring racial bias
When psychology researchers first began studying racial biases, they simply asked individuals to describe their thoughts and feelings about particular groups of people. A well-known problem with these measures of explicit bias is that people often try to respond to researchers in ways they think are socially appropriate.
Starting in the 1990s, researchers began to develop methods to assess implicit bias, which is less conscious and less controllable than explicit bias. The most widely used test is the Implicit Association Test, which lets researchers measure whether individuals have more positive associations with some racial groups than others. However, an important limitation of this test is that it only works well with individuals who are at least six years old – the instructions are too complex for younger children to remember.
Recently, my colleagues and I developed a new way to measure bias, which we call the Implicit Racial Bias Test. It can be used with children as young as age three, as well as with older children and adults, and it assesses bias in a manner similar to the IAT but with different instructions.
Here’s how a version of the test to detect an implicit bias that favors white people over black people would work: We show participants a series of black and white faces on a touchscreen device. Each photo is accompanied by a cartoon smile on one side of the screen and a cartoon frown on the other.
In one part of the test, we ask participants to touch the cartoon smile as quickly as possible whenever a black face appears, and the cartoon frown as quickly as possible whenever a white face appears. In another part of the test, the instructions are reversed.
The difference in the amount of time it takes to follow one set of instructions versus the other is used to compute the individual’s level of implicit bias. The reasoning is that it takes more time and effort to respond in ways that go against our intuitions.
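As a rough illustration, the reaction-time arithmetic described above can be sketched in a few lines of Python. The function name and timing values below are hypothetical, for illustration only, and are not taken from the researchers’ actual materials:

```python
# Hypothetical sketch of the scoring idea described above: bias is
# estimated from the difference in mean response time between the
# two instruction blocks. Names and numbers are illustrative.

def implicit_bias_score(block_a_times, block_b_times):
    """Return the difference in mean response time (seconds) between
    instruction block A and instruction block B. Larger positive
    values mean slower responses in block A than in block B."""
    mean_a = sum(block_a_times) / len(block_a_times)
    mean_b = sum(block_b_times) / len(block_b_times)
    return mean_a - mean_b

# Example with made-up timings: responses in the first block are
# slower on average, so the score comes out positive.
score = implicit_bias_score([0.93, 1.10, 0.98], [0.71, 0.80, 0.76])
```

The sign convention here is arbitrary; what matters, as the text explains, is that responding against one’s intuitions takes measurably longer.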
Some studies suggest that precursors of racial bias can be detected in infancy. In one such study, researchers measured how long infants looked at faces of their own race or another race that were paired with happy or sad music. They found that 9-month-olds looked longer when the faces of their own race were paired with the happy music, which was different from the pattern of looking times for the other-race faces. This result suggests that the tendency to prefer faces that match one’s own race begins in infancy.
These early patterns of response arise from a basic psychological tendency to like and approach things that seem familiar, and dislike and avoid things that seem unfamiliar. Some researchers think that these tendencies have roots in our evolutionary history because they help people to build alliances within their social groups.
However, these biases can change over time. For example, young black children in Cameroon show an implicit bias in favor of black people versus white people as part of a general tendency to prefer in-group members, who are people who share characteristics with you. But this pattern reverses in adulthood, as individuals are repeatedly exposed to cultural messages indicating that white people have higher social status than black people.
A new approach to tackling bias
Researchers have long recognized that racial bias is associated with dehumanization. When people are biased against individuals of other races, they tend to view them as part of an undifferentiated group rather than as specific individuals. Giving adults practice at distinguishing among individuals of other races leads to a reduction in implicit bias, but these effects tend to be quite short-lived.
In our new research, we adapted this individuation approach for use with young children. Using a custom-built training app, young children learn to identify five individuals of another race during a 20-minute session. We found that 5-year-olds who participated showed no implicit racial bias immediately after the training.
Although the effects of a single session were short-lived, an additional 20-minute booster session one week later allowed children to maintain about half of their initial bias reduction for two months. We are currently working on a game-like version of the app for further testing.
Only a starting point
Although our approach suggests a promising new direction for reducing racial bias, it is important to note that this is not a magic bullet. Other aspects of the tendency to dehumanize individuals of different races also need to be investigated, such as people’s diminished level of interest in the mental life of individuals who are outside of their social group. Because well-intended efforts to reduce racial bias can sometimes be ineffective or produce unintended consequences, any new approaches that are developed will need to be rigorously evaluated.
And of course the problem of racial bias is not one that can be solved by addressing the beliefs of individuals alone. Tackling the problem also requires addressing the broader social and economic factors that promote and maintain biased beliefs and behaviors.
The 2017 general election was highly unusual as far as the youth vote was concerned. The Labour party won 65% – the lion’s share – of the youth vote. The nearest comparisons are with 1964 and 1997. In both those years, Labour took 53% of the youth vote. In the 2015 election, just two years earlier, the party had won just 38% of the youth vote.
How the under-30s vote
The contrast between the youth vote in the 2010 and 2017 elections shows how radically youth voting patterns have changed. During this period, their turnout rose by 19%. This change in youth participation, combined with a massive swing to Labour, has unsurprisingly led some to talk of a “youthquake”.
What could have brought this about? Political and cultural drivers are clearly at work. That includes youth support for remaining in the EU and their preference for Jeremy Corbyn over Theresa May. Only a quarter of 18-to-25s voted to leave in the EU referendum compared with two-thirds of those over 65.
But economic drivers also played a crucial role. Young people, put simply, have lost out both in the economy and government policy making. Since 2010 the British government has been preoccupied with shoring up its political support among middle aged and retired voters. It has largely ignored the concerns of the young, very often dismissing them because, in the past, most young people did not vote. That all changed in 2017.
Paying for education
One obvious driver of youth voting is the rapid increase in student debt imposed by a government which sought to privatise higher education during the austerity years. Tuition fees were originally introduced in 1998 and had reached £3,000 per year by 2006-7. At the time, it was widely accepted that the considerable graduate premium which existed in lifetime earnings justified a contribution to the costs of higher education by the beneficiaries.
But things radically changed in 2010 when the coalition government raised the fees cap to £9,000. Ironically, this increased privatisation of the costs of higher education was accompanied by ever-increasing regulation, so that the less the state supports higher education the more it wants to control it. This trend culminated in a 2016 proposal to scrap maintenance grants and raise fees to £9,250 while at the same time charging interest rates of 6.1% on student loans at a time when the Bank of England base rate was 0.25%.
Such a reckless disregard for the interests of more than 40% of the under-25s is quite hard to understand, particularly in light of the fate of the Liberal Democrats following their u-turn on tuition fees after they joined the coalition in 2010.
The bias against youth was not confined to university students. In April 2016, the minimum wage was raised to £7.50 an hour, but this change only applied to employed workers over the age of 25. The minimum wage for apprentices under the age of 19 was a meagre £3.50 an hour and this did not change. Young people were essentially ignored.
Another aspect of the same issue relates to the self-employed, none of whom receive the minimum wage. Historically, self-employed workers have been older than the workforce average age – but, in recent years, self-employment has grown faster among the under 25s than any other group with the exception of 40-year-olds. Between 2008 and 2015 the number of self-employed people in the UK increased from 3.8 million to 4.6 million people with part-time self-employment, often synonymous with under-employment, increasing by 88%. Thus young people have lost out on the increases in minimum wages, with many of them being underemployed and working part-time for wages that are well below average.
Are you even listening?
It was, therefore, no surprise that when the pollsters YouGov recently asked citizens to rank their priorities for the country, 46% of 18-24 year olds selected increasing the minimum wage to approximately £9 per hour. That compared to a national figure of 28% (and 19% among pensioners).
In our panel survey of the electorate conducted immediately before the 2017 general election, we asked respondents if they agreed or disagreed with the following statement: “The government treats people like yourself fairly”. We found that 18% of the under-25s agreed with this statement compared with 28% of the over-65s. In contrast, 49% of the under-25s disagreed with it compared with 32% of the over-65s. Youth have not only been left behind but many of them are aware of this fact and have a sense of grievance arising from it. The stark difference in the responses of youth and pensioners to this statement is related to the differences in the government’s treatment of them.
The so-called “triple lock” on pensions was introduced by the coalition government in 2010. It was a guarantee to increase the state pension every year by the rate of inflation, the growth in average earnings, or a minimum of 2.5%, whichever was highest. By 2016 it had produced a situation in which retired people had average incomes £2,500 higher than in 2007/8, while those who were not retired earned an average of £300 less over the same period. The latter reflects the fact that real wages have been flat-lining for more than a decade.
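The uprating rule described above is simply the maximum of three numbers. A minimal sketch, using illustrative figures rather than official data:

```python
# Minimal sketch of the "triple lock" uprating rule: the state pension
# rises each year by the highest of inflation, average earnings growth,
# or 2.5%. All figures here are illustrative, not official statistics.

def triple_lock_increase(inflation_pct, earnings_growth_pct):
    """Annual percentage increase under the triple lock."""
    return max(inflation_pct, earnings_growth_pct, 2.5)

def uprated_pension(current_pension, inflation_pct, earnings_growth_pct):
    """Pension after one year of triple-lock uprating."""
    rise = triple_lock_increase(inflation_pct, earnings_growth_pct) / 100
    return current_pension * (1 + rise)

# Even with 1% inflation and 0.5% earnings growth, the 2.5% floor applies:
print(round(triple_lock_increase(1.0, 0.5), 1))  # prints 2.5
```

The 2.5% floor is what guarantees pensions outpace wages whenever both inflation and earnings growth are low, which is the mechanism behind the divergence the text describes.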
Given all this it is no surprise that the 2017 election was a case of youth striking back.
This article is based on research by Paul Whiteley, Harold Clarke, Matthew Goodwin and Marianne Stewart. Paul Whiteley is speaking at Youthquake 2017! Can young voters transform the UK’s political landscape? a joint event between The Conversation and The British Academy on October 9, 2017.