Give your Brain the Gift of Christmas

Author: FREYA STORER | 09 DEC 2021

It’s that time of year again… The heating finally goes on and our once-empty hallways are decorated with scarves, coats and muddied boots. A warm cloud of breath escapes the mouth as we stroll down avenues of naked trees, occasionally catching a glimpse of that familiar, glittery fern in a bay window. Whether you celebrate Christmas, Hanukkah, Kwanzaa or your own version of a seasonal ritual, the festive season is punctuated by wholesome activities: consuming masses of food, cosying up with a Christmas film, and getting merry with family and friends. But before you start dreading that post-holiday guilt over the imminent extravagance, here are a few ways in which partaking in the festivities will help at least one of your organs: your brain!


Returning home to the smell of your mum’s perfume, the sight of haphazardly placed photographs, and the (almost nagging) sound of “Driving Home for Christmas” will undoubtedly drum up a powerful sense of nostalgia – a bittersweet yet whimsical experience triggered by memory retrieval in the presence of familiar objects, music, smells and even people. Regarded as an overwhelmingly positive experience, nostalgia can have a significant impact on neuronal health (Wildschut et al., 2006). For one, re-encountering these memory cues can cause changes in our brain chemistry. Some studies have found a connection between nostalgia and a decrease in pro-inflammatory cytokines – molecules associated with local inflammation (Matsunaga et al., 2013).

So, when you’re embarrassed to put on “Love Actually” for the gazillionth time, remember that you’re doing everyone’s brain a favour.

Equally, revisiting memories can stimulate metabolic activity and blood flow, bringing more oxygen and nutrients to many parts of the brain. This was demonstrated in a 2016 study, in which researchers showed participants images relating to childhood and examined their brain activity using functional magnetic resonance imaging (fMRI) (Oba et al., 2016). This technology relies on visualising increases in blood flow to areas of the brain that are ‘active’. The study found that “nostalgia-related activity” was localised to the memory (hippocampus) and reward (ventral striatum) systems of the brain. When these reward centres are activated, they are flooded with dopamine – the ‘feel-good’ neurotransmitter. In the long term, this can increase psychological resilience and our sense of worth, and help decrease feelings of loneliness and depression (Dutcher & Creswell, 2018). However, this is not an invitation to go hunting for those embarrassing photos from your tweens, as the experience may not be as positive!


It won’t come as a shock to hear that socialising with friends and family can have many perks for your physical and mental health. Recent studies have unveiled the potential benefits of connecting with others for boosting cognitive skills and lowering the risk of dementia (Fratiglioni et al., 2004; Age UK). We all vary in our desire for company but share a fundamental need to build positive relationships. Socialising can strengthen our sense of worth, increase our happiness and well-being, and may even help us to live longer (Harvard T.H. Chan School of Public Health). Research shows that people who engage in regular leisure activities, both mental and social, better retain functional abilities during the ageing process (Wang et al., 2012). Studies have also found that sociable participants often perform better in tests of memory and other cognitive skills (Smith et al., 2018). Hot new data even suggests that socialising can slow down cognitive decline – an exciting prospect, even though the research is still in its infancy (Floud et al., 2021; Fratiglioni et al., 2004; Marioni et al., 2015). People with active social lives may also lead more physically active lives, which carry a host of benefits, both mental and physical – so make sure to go on the post-banquet walk, as it will double the impact. For optimal results, why not pull out a dusty game of Trivial Pursuit and put your cognitive abilities to the test whilst enjoying the company of others? Also, if you feel a carol coming on, make sure to belt it out, as singing with company does wonders by engaging and exercising multiple areas of your brain at the same time (Osman et al., 2016; Shakespeare & Whieldon, 2018).


Arguably the best part of the festive season is tucking into a hearty meal (with the intermittent tipple). This culinary highlight provides a celebration of many traditional ingredients and allows us to treat ourselves after a hard year’s work. In fact, the concept of “treating ourselves” is very important for our mental health and is encouraged in individuals who suffer from clinical depression (Cancino-Montecinos et al., 2020; Cooper, 2019). In addition, there are hidden health benefits in your Christmas dinner. Compared to the average British meal, the dinner is well balanced – so make sure to pack your plate with variety over quantity. A traditional turkey is stuffed with nutrients such as protein and B vitamins, and it is also rich in tryptophan, a precursor to serotonin, which can aid sleep and relaxation (Jenkins et al., 2016). For the vegetarian or vegan reader, a nut roast could have even more benefits: rich in essential fatty acids, it can support the vascular health of the brain. Nuts are excellent brain food, able to enhance cognition, memory, learning and other key brain functions. The best nut to include in your recipe is the humble walnut, which contains a high concentration of docosahexaenoic acid (DHA), a fatty acid that boosts cognition and helps prevent age-related cognitive decline (Chauhan & Chauhan, 2020). Let us not forget the much-maligned sprout, which in my opinion does not deserve the hate it gets, as it can be neuroprotective. Like other cruciferous vegetables, such as cabbage, watercress and kale, sprouts contain kaempferol, a type of flavonoid. Recent studies have linked flavonoids to a decreased risk of developing dementia and to the protection of neurons against injury (Shishtar et al., 2020; Spencer, 2009). I guess this means we should reach for those delightful greens more often – but don’t forget the carrots for night-vision when you’re sneaking down the stairs for round three of the cheese board.

Overall, it’s easy to find yourself on the other side of New Year’s Eve, drenched in spirit-scented guilt, wondering to yourself why you always “overdo it”. But this year you can rest assured that a little jaunt down memory lane, washed down with a guzzle of eggnog and a mince pie, is nothing to be ashamed of and could in fact benefit your brain and mental health. So, when you’re embarrassed to put on “Love Actually” for the gazillionth time, remember that you’re doing everyone’s brain a favour.

Reviewed by: Matt and Uroosa


An active social life may help you live longer | News | Harvard T.H. Chan School of Public Health. (n.d.). Retrieved November 29, 2021, from

Cancino-Montecinos, S., Björklund, F., & Lindholm, T. (2020). A General Model of Dissonance Reduction: Unifying Past Accounts via an Emotion Regulation Perspective. Frontiers in Psychology, 11, 3184.

Chauhan, A., & Chauhan, V. (2020). Beneficial Effects of Walnuts on Cognition and Brain Health. Nutrients, 12(2).

Cooper, J. (2019). Cognitive dissonance: Where we’ve been and where we’re going. International Review of Social Psychology, 32(1).

Dutcher, J. M., & Creswell, J. D. (2018). The role of brain reward pathways in stress resilience and health. Neuroscience and Biobehavioral Reviews, 95, 559–567.

Floud, S., Balkwill, A., Sweetland, S., Brown, A., Reus, E. M., Hofman, A., Blacker, D., Kivimaki, M., Green, J., Peto, R., Reeves, G. K., & Beral, V. (2021). Cognitive and social activities and long-term dementia risk: the prospective UK Million Women Study. The Lancet. Public health, 6(2), e116–e123.

Fratiglioni, L., Paillard-Borg, S., & Winblad, B. (2004). An active and socially integrated lifestyle in late life might protect against dementia. The Lancet. Neurology, 3(6), 343–353.

Jenkins, T. A., Nguyen, J. C. D., Polglaze, K. E., & Bertrand, P. P. (2016). Influence of Tryptophan and Serotonin on Mood and Cognition with a Possible Role of the Gut-Brain Axis. Nutrients, 8(1).

Marioni, R. E., Proust-Lima, C., Amieva, H., Brayne, C., Matthews, F. E., Dartigues, J. F., & Jacqmin-Gadda, H. (2015). Social activity, cognitive decline and dementia risk: A 20-year prospective cohort study Chronic Disease epidemiology. BMC Public Health, 15(1), 1–8.

Matsunaga, M., Bai, Y., Yamakawa, K., Toyama, A., Kashiwagi, M., Fukuda, K., Oshida, A., Sanada, K., Fukuyama, S., Shinoda, J., Yamada, J., Sadato, N., & Ohira, H. (2013). Brain–Immune Interaction Accompanying Odor-Evoked Autobiographic Memory. PLoS ONE, 8(8), 72523.

Oba, K., Noriuchi, M., Atomi, T., Moriguchi, Y., & Kikuchi, Y. (2016). Memory and reward systems coproduce ‘nostalgic’ experiences in the brain. Social Cognitive and Affective Neuroscience, 11(7), 1069.

Osman, S. E., Tischler, V., & Schneider, J. (2016). ‘Singing for the Brain’: A qualitative study exploring the health and well-being benefits of singing for people with dementia and their carers. Dementia (London, England), 15(6), 1326.

Shakespeare, T., & Whieldon, A. (2018). Sing Your Heart Out: community singing as part of mental health recovery. Medical Humanities, 44(3), 153–157.

Shishtar, E., Rogers, G. T., Blumberg, J. B., Au, R., & Jacques, P. F. (2020). Long-term dietary flavonoid intake and risk of Alzheimer disease and related dementias in the Framingham Offspring Cohort. The American Journal of Clinical Nutrition, 112(2), 343–353.

Smith, B. M., Yao, X., Chen, K. S., & Kirby, E. D. (2018). A Larger Social Network Enhances Novel Object Location Memory and Reduces Hippocampal Microgliosis in Aged Mice. Frontiers in Aging Neuroscience, 10.

Spencer, J. P. E. (2009). Flavonoids and brain health: multiple effects underpinned by common mechanisms. Genes & Nutrition, 4(4), 243.

The benefits of social connections | Age UK. (n.d.). Retrieved November 29, 2021, from

Wang, H. X., Xu, W., & Pei, J. J. (2012). Leisure activities, cognition and dementia. Biochimica et Biophysica Acta (BBA) – Molecular Basis of Disease, 1822(3), 482–491.

Wildschut, T., Sedikides, C., Arndt, J., & Routledge, C. (2006). Nostalgia: content, triggers, functions. Journal of personality and social psychology, 91(5), 975–993.

Image credit: “★☆★ christmas ★☆★” by Puno3000 is licensed under CC BY-NC-ND 2.0

Getting a better understanding of life with Alzheimer’s Disease

Gabriella Nelligan | 26 October 2021

Alzheimer’s disease is a progressive, neurodegenerative condition and the most common type of dementia, accounting for up to 70% of cases worldwide (Fish et al., 2019). Contrary to popular belief, dementia is not a disease but an umbrella term for the set of symptoms caused by pathological changes in the brain, resulting in problems with memory, mental sharpness, language, understanding and judgement (NHS 2021).

Whilst Alzheimer’s disease is not a normal part of ageing, age is undoubtedly its biggest risk factor, although genetic and environmental factors also contribute. With a global increase in life expectancy and, thus, an increasingly ageing world population, the prevalence of Alzheimer’s disease is on the rise. For instance, the World Health Organisation estimates that global cases will escalate in increments of 10 million per year, resulting in an annual cost of £55 billion to the UK economy by 2040 (WHO 2020, Prince et al., 2013).

The cost of Alzheimer’s disease is not only economic but also personally devastating – stripping patients of quality of life through memory loss and cognitive deficits in reasoning, judgement and spatial awareness (Karantzoulis and Galvin 2011). Brain atrophy (i.e. ‘a loss of brain cells and/or a loss in the number of connections between brain cells’ (Kandola, A. 2020)) caused by Alzheimer’s disease also leads to difficulty in swallowing, walking and engaging in day-to-day activities (WHO 2020). In addition, because Alzheimer’s disease mainly affects older generations, there is a persistent belief that the disease is just a ‘normal part of ageing’. However, many researchers have challenged this view (Alzheimer’s Research UK, 2021).

Changes in the brain in Alzheimer’s disease occur alongside the overproduction and dysregulation of two proteins, amyloid-beta and tau. In health, these proteins help maintain the normal function of neurons, but when dysregulated they exert a toxic effect on the brain, leading to synaptic and neuronal loss (Fish et al., 2019). Risk factors for Alzheimer’s disease include inherited genetic mutations, gender, age, and a generally unhealthy lifestyle. Smoking, being overweight, and underlying health conditions such as diabetes are also significant risk factors (Alzheimer’s Society, 2021).

Altogether, life with Alzheimer’s disease paints a distressing picture, and the worst thing about it? There is still no effective treatment. Ever since Alois Alzheimer identified neurofibrillary tangles and amyloid plaques as hallmarks of the disease in 1906, the search has been on to combat these neuropathological events via pharmacological or gene therapies (Fish et al., 2019). Promising studies have included anti-amyloid and anti-tau therapies, as well as drugs targeting the associated neuroinflammation; however, their efficacy has been inconsistent (Panza et al., 2019). Nevertheless, the search pushes on, and government funding for dementia research doubled between 2012 and 2015 – so there is hope (Department of Health 2016).

So, what do we do? An effective cure for Alzheimer’s disease is becoming increasingly urgent in today’s ageing population, but there is still a long road ahead. In the meantime, we must understand and manage the disease: by developing new technologies such as GPS trackers to prevent dangerous wandering behaviours, and by raising awareness of the psychological distress the disease causes patients (Megges et al., 2018). By acknowledging the thought processes underlying Alzheimer’s disease, society can begin to understand behaviour from patients that might otherwise seem baffling.

As a child it was scary to watch my grandma succumb to the disease: scary to see her forget our names, and scary to see her get cross at times because she couldn’t understand where she was or what was going on. Whenever my grandma had visitors, she would tell them that her mother was coming to collect her shortly, even though her mother had been dead for decades. Although towards the end of her life she couldn’t hold a conversation, and if she laughed she wouldn’t remember what she’d laughed about, the feeling of being happy still remained. The disease can be isolating for patients and devastatingly sad for family members, but memory isn’t everything that makes a person; they can still feel loved without remembering who is loving them.

My own way of making sense of my grandma’s Alzheimer’s was to educate myself. This eventually led me to an undergraduate degree in Neuroscience and a research master’s focusing on Alzheimer’s disease at Cardiff University. Throughout my education I have been incredibly proud to present, research and learn about Alzheimer’s disease, and I hope to do the same for years to come. However, it wasn’t until my third year of university that I felt I fully understood my grandma’s experience. During a lecture on schizophrenia, the lecturer explained that people who suffer schizophrenic episodes experience a form of psychosis, unable to separate thoughts from reality. One example the lecturer used was a man with schizophrenia who regularly believed that the clouds in the sky were tracking him as technology from the CIA. If you tried to dissuade him from this perceived reality through reasoning, by explaining that the clouds had nothing to do with the CIA, the man would become paranoid and upset. However, if you told him something logical that made the clouds less threatening, such as that the CIA were not monitoring him through the clouds today, he would accept it. From this, I appreciated that my grandma could in fact have been experiencing a form of psychosis: she genuinely believed she was back at school, and therefore that her mother would be coming to collect her. This is supported by a 2000 study of 329 Alzheimer’s disease patients by researchers at the University of California, who found a cumulative incidence of hallucinations and delusions (psychosis) of 51.3% by four years post-diagnosis (Paulsen et al., 2000). There was therefore no sense in telling my grandma that she was not at school and that her mother had been dead for decades, because that did not fit her own version of events.
Instead, I learned to be patient and to go along with her version of events. By avoiding contradicting or correcting her, I could also avoid causing her distress.

Public understanding of the thought processes underlying conditions such as Alzheimer’s disease could really help society adapt, and make spaces more dementia-friendly. With schemes such as the Alzheimer’s Society’s Dementia Friends doing their utmost to implement this, things are looking positive for Alzheimer’s patients and their friends and families. One example of how we can educate ourselves and become more understanding towards Alzheimer’s patients is an initiative being pushed by the Alzheimer’s Society to replace black mats at the front of high-street shops with blue ones. This is because a large black area can look like a sizeable pit to an Alzheimer’s patient. Although not obvious to you or me, this can be quite distressing for patients, who sometimes refuse to enter or leave shops for what looks like no apparent reason. Other visual pitfalls include patterned carpets and plates, which may look like snakes or insects, meaning the patient is unlikely to walk over patterned flooring or eat off patterned china (AlzLive 2021). By educating society about the experiences of Alzheimer’s patients, changes can be made to accommodate the nuances of the Alzheimer’s brain whilst we wait for that all-important cure.

To de-stigmatise the experiences of people with Alzheimer’s disease, we must firstly appreciate that Alzheimer’s disease is not a normal product of ageing but a disease – one that researchers hope will become treatable and preventable – and secondly understand that Alzheimer’s disease patients undergo genuinely different thought processes. Therefore, although we wait eagerly for a cure for Alzheimer’s disease, in the meantime we must educate ourselves and, more importantly, understand.

Editors: Matthew Higgs, Ian Fox, Uroosa Chughtai.


AlzLive. 2021. How to Safeguard against Visual Pitfalls. Available at: [Accessed: 11/03/2021]

Alzheimer’s Research UK. 2021. Types of Dementia. Available at: [Accessed: 22/04/2021]

Alzheimer’s Society. 2021. Who gets Alzheimer’s disease? Available at: [Accessed: 27/04/2021]

Fish, P.V., Steadman, D., Bayle, E.D., Whiting, P. 2019. New Approaches for the treatment of Alzheimer’s disease. Bioorganic and Medicinal Chemistry Letters 29:2, pp. 125-133. doi: 10.1016/j.bmcl.2018.11.034

Department of Health. 2016. 

Kandola, A. 2020. What to know about brain atrophy. Available at: [Accessed: 19.05.21]

Karantzoulis, S. and Galvin, J.E. 2011. Distinguishing Alzheimer’s disease from other major forms of dementia. Expert Review of Neurotherapeutics 11, pp. 1579-91. doi: 10.1586/ern.11.155

Megges, H., Freiesleben, S., Jankowski, N., Haas, B., Peters, O. 2017. Technology for Home Dementia Care: A Prototype Locating System Put to the Test. Alzheimer’s and Dementia: Translational Research and Clinical Interventions 3, pp. 332-338. doi: 10.1016/j.trci.2017.04.004 

NHS. 2021. About Dementia. Available at: [Accessed: 21/05/2021]

Panza, F., Luzupone, M., Logroscino, G., Imbimbo, B.P. 2019. A critical appraisal of amyloid-B-targeting therapies for Alzheimer disease. Nature Reviews Neurology 15, pp. 73-88. doi:10.1038/s41582-018-0116-6 

Paulsen, J.S., Salmon, D.P., Thal, L.J., Romero, R., Weisstein-Jenkins, C., Galasko, D., Hofstetter, C.R… 2000. Incidence of and risk factors for hallucinations and delusions in patients with probable AD. Neurology 54:10, pp. 1965-71 doi: 10.1212/wnl.54.10.1965.

Prince, M., Bruce, R., Albanese, E., Wimo, A., Ribeiro, W., Ferri, C.P. 2013. The global prevalence of dementia: A systematic review and meta-analysis. Alzheimer’s & Dementia 9:1, pp. 63-75 doi: 10.1016/j.jalz.2012.11.007

WHO. 2021. Dementia. Available at: [Accessed: 10/03/2021]

Catastrophic Thinking: Making a Mountain Out of a Molehill

Anonymous | 17 May 2021

“You have an anxiety disorder.” The first time I heard those words I felt relief wash over me. This wasn’t normal. The panic attacks, the low self-esteem, the wish to disappear from existence. Now it had a name, and I started to believe I could manage it.

The first time I experienced a panic attack, I didn’t know that’s what it was. I was in my late teens and was upset about a relatively minor problem. Dad had two tickets to an event, but Mam was struggling to get time off work to go. Dad’s solution was to give the tickets back, not once thinking of asking if I wanted to go instead. I lay awake that night, unable to fall asleep, trying to work out why Dad would rather not go at all than ask me to join him. Very quickly this spiralled out of control and I’d convinced myself that Dad didn’t love me. I’d done something wrong. I wasn’t good enough. I didn’t deserve to be loved. My chest was tight. I was short of breath. Silently I sobbed apologies for not being enough, whilst at the same time not understanding what I had done wrong.

Several years later, I now know that this was a classic case of catastrophic thinking. Catastrophising was defined in 1962 by Albert Ellis as a “tendency to magnify a perceived threat and overestimate the seriousness of its potential consequences” (Gellatly & Beck, 2016). To use a more common saying: making a mountain out of a molehill. In the mid-1960s, Aaron Beck developed Ellis’ ideas into a theory of depression, in which patients hold exaggerated beliefs that lead them to interpret situations as more negative than they actually are (Beck, 1963, 1964). Beck also applied this framework to anxiety disorders, and David Clark expanded it in the 1980s (Beck, 1986; Clark, 1986).

Sufferers of anxiety misinterpret both internal and external sensations as far more serious than they are in reality. A common example, within panic disorder, is misinterpreting non-lethal pain as an imminent threat to life. Now, it would be very difficult to find anyone who has experienced pain in the chest or head without at least a passing thought that they were having a heart attack or brain aneurysm (especially after googling their symptoms). Usually, however, these thoughts pass, and you realise you had jumped to a worst-case-scenario conclusion (and need to stop believing everything you read on the internet). You then take a painkiller and move on with your life. For a person with panic disorder, this scenario can play out very differently. The belief that the pain in your head is due to an aneurysm becomes exaggerated. You cannot see that this is probably not the case and that you have jumped to an unlikely conclusion. Fear sets in, and the only thing you can think about is the pain and how you are convinced a blood vessel has burst. You will die. You are unable to stop the escalating anxiety and experience a full-blown panic attack.

How catastrophic beliefs result in panic and other forms of intense anxiety can be thought of as a positive feedback loop: a cycle that, once triggered, keeps triggering itself (Gellatly & Beck, 2016). An initial event (pain in your head) induces a catastrophic belief (the brain aneurysm). This leads to interpretive bias, where the situation can only be terrible (“this pain means I will die”), reaffirming the catastrophic belief. Attentional bias also occurs, where only threatening information is taken in (the fact that the headache is painful, ignoring that it is not the “thunderclap headache” characteristic of a burst aneurysm (NHS, 2018)), and this reinforces the interpretive bias. Attentional fixation supports the attentional bias by forcing focus solely onto the threat (the headache), so the situation cannot be re-examined. Physical symptoms of anxiety, such as chest pain or shortness of breath, then occur, from which new catastrophic beliefs form, kicking off the entire cycle again. Fixation also occurs on these physical symptoms, drawing the attentional and interpretive biases towards them. In this way, a single event inducing a catastrophic belief triggers a cycle of anxiety and panic that is difficult to break.

Why some people, but not others, fall victim to this cycle of catastrophic beliefs is not clear. Why can some realise that the initial thought that they may be in a life-or-death situation is ridiculous and move on with their lives, whilst others spiral into panic? As is often the case with psychiatric disorders, the answer appears to be a combination of nature and nurture (Meier & Deckert, 2019). Various genes and mutations have been linked with anxiety disorders, but how they manifest is also influenced by the environment and individual experiences.

To help those prone to catastrophic beliefs, cognitive tools have been developed to promote ‘de-catastrophising’ (Knaus & Carlson, 2014; Whalley, 2017). It’s well known that acceptance is the first step towards recovery, and the same is true for catastrophic beliefs (Knaus, 2012). Accepting that you experience catastrophic thoughts, and not placing blame on yourself, is a crucial step towards breaking the cycle. Stopping and reflecting on these thoughts can help you see the bigger picture. Ask yourself: what is actually happening here? Is the leap you’ve made logical? Or are you fixating on something awful yet unlikely? Breaking out of this cycle of blame and judgement is not easy and may require professional help to guide you through de-catastrophising techniques.

When I think back to my first encounters with catastrophic beliefs, I still feel shame and embarrassment that something so small and inconsequential caused such drastic feelings of inadequacy, and I don’t expect those feelings to ever go away. Even back then, I knew this incident was not worth the pain I was experiencing and even recognised the situation itself was a privileged one to be in (which just added to the guilt). However, in that moment I felt it represented a much larger defect. I was unlovable for reasons I couldn’t fathom. Over the following years, relatively minor issues continued to spark panic attacks and only reinforced my low self-esteem and low confidence. I came to believe that this was just how I was. It became my defining personality trait. I experienced sustained periods of anxiety, believed it would never get better, and maybe the only thing to do was end it all. It was at this point, years after my first attack, I finally realised this wasn’t normal and I needed help.

Understanding what is going on in my head has helped me to recognise when I am catastrophising. I can remind myself that the situation is not as bad as I think. I can take the first step towards shedding (at least some of) the blame and judgement. It’s not an easy path and I often fall off it. I still sometimes believe this is all I could ever be. But knowing there’s a reason behind it, knowing these theories came from observations of others, helps remind me I am not alone. I don’t judge others on their mental health, so why should I judge myself? I will not be defined by anxiety. I will not let others define me by it. I refuse to be only this. I am so much more.

Editors: Matt Higgs and Steliana Yanakieva

Beck, A. T. (1963). Thinking and Depression. I. Idiosyncratic Content and Cognitive Distortions. Archives of General Psychiatry, 9, 324-333.

Beck, A. T. (1964). Thinking and Depression. II. Theory and Therapy. Archives of General Psychiatry, 10, 561-571.

Beck, A. T. (1986). Cognitive Approaches to Anxiety Disorders. In B. F. Shaw, Z. V. Segal, T. M. Vallis, & F. E. Cashman (Eds.), Anxiety Disorders: Psychological and Biological Perspectives (pp. 115-135). Springer US.

Clark, D. M. (1986). A cognitive approach to panic. Behaviour Research and Therapy, 24(4), 461-470.

Gellatly, R., & Beck, A. T. (2016). Catastrophic thinking: A transdiagnostic process across psychiatric disorders. Cognitive Therapy and Research, 40(4), 441-452.

Knaus, W. J. (2012). Anxiety and Exaggerations: Get relief from amplifying possibilities into catastrophes. Psychology Today. Retrieved 18/06 from

Knaus, W. J., & Carlson, J. (2014). The Cognitive Behavioral Workbook for Anxiety : A Step-By-Step Program. New Harbinger Publications.

Meier, S. M., & Deckert, J. (2019). Genetics of Anxiety Disorders. Curr Psychiatry Rep, 21(3), 16.

NHS. (2018). Overview Brain Aneurysm.

Whalley, M. (2017). Psychology Tools For Overcoming Panic (1 ed.). Psychology Tools.



Author: FRANCESCA KEEFE | 14 APR 2021


Is the public perception of illicit drug users wrong? The media often promotes the stereotype of illicit drug users as violent thugs, always on the hunt for their next “high”. In reality, 3.2 million adults across England and Wales (nearly 10% of adults) reported illicit drug use in 2019 alone (Office for National Statistics, 2020). Crucially, these individuals were largely law-abiding, successful and respectable citizens (Harrison, 1994). So why does the social stigmatisation continue? Here, I challenge an assortment of neuromyths in drug addiction, which have arguably tainted public perception and had severe repercussions for clinical research and innovation (Nutt et al. 2013; Ross, 2020).


Drugs that artificially alter mood, perception and behaviour are classed as psychoactive drugs (Nutt et al. 2013). Depending on the substance, the effect can be energising, calming and/or hallucinogenic. This is because different substances have different modes of action – targeting different brain pathways and different neurotransmitter signals (Nutt et al. 2015; De Gregorio et al. 2021). Of significance is the ability of some (but not all) psychoactive drugs to activate the reward pathway, an innate learning mechanism that positively reinforces specific behaviours (in this case, drug use). Unfortunately, this mode of action has the potential to drive drug misuse and drug dependency (i.e. drug addiction) in vulnerable individuals.


Drug addiction is a state in which an individual craves the drug to the extent of sacrificing other rewards and making irrational (often detrimental) trade-offs to obtain the substance, leading to the neglect of work, hobbies and relationships (Wakefield, 2020; Diagnostic and Statistical Manual of Mental Disorders, 5th Edition, 2013). When an addict attempts to become drug-free (abstinence), they almost inevitably suffer withdrawal symptoms. These can be physical (e.g. headaches, vomiting, shakes) or psychological (e.g. intrusive thoughts), and the common outcome is relapse back into drug taking. People struggling with drug addiction are prone to repeated relapses, even after long periods of abstinence (Pang et al. 2019).


The neural mechanisms underpinning this clinical profile have been described in a number of theories, the reward pathway being central to them all (Wakefield 2020). Addictive drugs manipulate neurotransmitter signals, be it GABA, serotonin, glutamate, or dopamine, to the same end: activation of the reward pathway (Nutt et al. 2015; De Gregorio et al. 2021). Significantly, addictive drugs overload the system in a uniquely detrimental way, not observed for non-addictive rewards. According to the “hijacking theory” of drug addiction, this unnaturally high activation of the reward pathway triggers maladaptive learning, which ultimately marks the point of drug dependency: when an individual’s ability to control drug consumption deteriorates.



The hijacking theory of drug addiction offers a more hopeful outlook for an addict’s recovery. Instead of treating addiction as an incurable chronic disease, the hijacking theory suggests that addictive behaviour has the potential to be unlearnt. This outlook has led to rehabilitation strategies focused on retraining the reward pathway to associate rewards with being drug-free. The practice of these so-called contingency management schemes has transformed the treatment of addiction, with an efficacy far surpassing that achieved with pharmacological interventions (i.e. substance substitutes) (Petry 2011; Ross 2020). 



The illegal status of specific psychedelics has been justified by the reported harm associated with each drug’s use. At the centre of this is the claim that illicit drugs are more addictive than their legal counterparts (such as tobacco and alcohol). However, this remains a matter of debate (Nutt et al. 2007; Hart, 2020). The ranking of a drug’s addictive potential combines whether (and how strongly) the drug activates the reward pathway with practical considerations, such as how easy and how cheap the substance is to obtain. Unsurprisingly, at the top of the ranking are the illicit drugs: opioids (heroin, morphine, opium) and cocaine (Nutt et al. 2007). Closely following, however, are tobacco and alcohol (Nutt et al. 2007). Despite their high addictive potential, tobacco and alcohol retain their legal status, whereas LSD and psilocybin (aka magic mushrooms) have no physical addictive properties, yet are illegal.


How addictive is a “highly addictive” drug? Despite the propaganda, the path from illicit drug use to addiction is not a simple one: 70–90% of illicit drug users never develop drug dependency (Grifell & Hart, 2018). As expected, the type of drug and the frequency of use significantly affect an individual’s risk, with a higher proportion of users transitioning to addiction among those using higher-ranked addictive drugs (i.e. heroin and cocaine). Nonetheless, this fact awakens us to the truth: illicit drug use alone is not sufficient to cause addiction (Ross, 2020). Logically, it follows that additional factors mark an individual’s risk of drug addiction (Ersche et al. 2020). Seemingly unrelated factors have been robustly implicated in marking an individual as vulnerable, including genetics, gender, and social (i.e. life adversities) and economic status (Redonnet et al. 2012; Ersche et al. 2020; Oliverio et al. 2020; Munn‐Chernoff et al. 2021). 


Despite the above, the ban on illicit drugs remains, with repercussions for society, the economy and the health sector (Nutt et al. 2013). In societal terms, illicit drug users and addicts are often treated as outcasts – a stigma that can fuel continued drug use and hinder the rehabilitation and acceptance of recovering addicts back into society. Moreover, our economy is drained by law enforcement against the illicit drug trade, at an estimated cost of £780 million annually (Fell et al. 2019); despite this staggering figure, the impact of enforcement is marginal at best. In terms of health, the transition of banned psychedelics into the clinic has been an uphill battle, despite evidence of their efficacy and safety (Krediet et al. 2020). Fortunately, the last decade has seen a boom in the clinical application of illicit drugs (including LSD, ecstasy and ketamine) in the treatment of neuropsychiatric disorders (Nutt et al. 2013; De Gregorio et al. 2021). By breaking free of the narrow view that all illicit drugs are bad, our research community has demonstrated their potential for good.

Edited by: Steliana Yanakieva and Peter Richardson


  • De Gregorio, D., Aguilar-Valles, A., Preller, K. H., Heifets, B. D., Hibicke, M., Mitchell, J., & Gobbi, G. (2021). Hallucinogens in Mental Health: Preclinical and Clinical Studies on LSD, Psilocybin, MDMA, and Ketamine. The Journal of Neuroscience, 41(5), 891–900.
  • Diagnostic and Statistical Manual of Mental Disorders, 5th Edition, 2013
  • Ersche, K. D., Meng, C., Ziauddeen, H., Stochl, J., Williams, G. B., Bullmore, E. T., & Robbins, T. W. (2020). Brain networks underlying vulnerability and resilience to drug addiction. Proceedings of the National Academy of Sciences, 117(26), 15253–15261. 
  • Fell, E., James, O., Dienes, H., Shah, N., & Grimshaw, J. (2019). Home Office Research Report 103. Understanding organised crime 2015/16: Estimating the scale and the social and economic costs. Second edition. 
  • Grifell, M., & Hart, C. (2018). Is Drug Addiction a Brain Disease? American Scientist, 106(3), 160. 
  • Krediet, E., Bostoen, T., Breeksema, J., van Schagen, A., Passie, T., & Vermetten, E. (2020). Reviewing the Potential of Psychedelics for the Treatment of PTSD. International Journal of Neuropsychopharmacology, 23(6), 385–400. 
  • Munn‐Chernoff, M. A., Johnson, E. C., Chou, Y., Coleman, J. R. I., Thornton, L. M., Walters, R. K., … Agrawal, A. (2021). Shared genetic risk between eating disorder‐ and substance‐use‐related phenotypes: Evidence from genome‐wide association studies. Addiction Biology, 26(1). 
  • Nutt, D., King, L. A., Saulsbury, W., & Blakemore, C. (2007). Development of a rational scale to assess the harm of drugs of potential misuse. The Lancet, 369(9566), 1047–1053. 
  • Nutt, D. J., King, L. A., & Nichols, D. E. (2013). Effects of Schedule I drug laws on neuroscience research and treatment innovation. Nature Reviews Neuroscience, 14(8), 577–585. 
  • Nutt, D. J., Lingford-Hughes, A., Erritzoe, D., & Stokes, P. R. A. (2015). The dopamine theory of addiction: 40 years of highs and lows. Nature Reviews Neuroscience, 16(5), 305–312.
  • Office for National Statistics (2020).
  • Oliverio, R., Karelina, K., & Weil, Z. M. (2020). Sex, Drugs, and TBI: The Role of Sex in Substance Abuse Related to Traumatic Brain Injuries. Frontiers in Neurology, 11.
  • Pang, T. Y., Hannan, A. J., & Lawrence, A. J. (2019). Novel approaches to alcohol rehabilitation: Modification of stress-responsive brain regions through environmental enrichment. Neuropharmacology, 145, 25–36. 
  • Redonnet, B., Chollet, A., Fombonne, E., Bowes, L., & Melchior, M. (2012). Tobacco, alcohol, cannabis and other illegal drug use among young adults: The socioeconomic context. Drug and Alcohol Dependence, 121(3), 231–239.

How being sick could make you sicker: The role of peripheral inflammation in depressive disorders.

Valentina Bart | 01 April 2021

“Mens sana in corpore sano” – a healthy mind in a healthy body

Mental health is a topic that is becoming increasingly important in everyday life. Presently, 1 in 6 children between the ages of 5 and 16 struggle with mental health issues, with the NHS reporting mental health problems to be the biggest cause of disability in the UK. It is often said that physical exercise is important for mental well-being. Taking this idea further and looking at current research, it becomes clear that a sick body could severely harm your mental health. 

But how does physical sickness link to our mental wellbeing? 

Inflammation is the natural reaction of the organism to an insult, such as exposure to pathogens, severe trauma, stress, obesity, or normal aging. In an initial response, fast-responding immune cells, which are your body’s first line of protective soldiers, recognise pathogens by their specific proteins. Your “soldier cells” respond to these proteins by producing inflammatory mediators, which is the equivalent of sending out an emergency response team to do some damage control. This reaction causes the typical symptoms of inflammation: redness, swelling, pain, heat, and possibly loss of function. In an otherwise healthy organism this immune response is balanced, so that after the damage caused by inflammation, anti-inflammatory processes are induced to tidy up the mess – eventually restoring homeostasis and bringing you back to your normal healthy self.

While these processes occur in most areas of the body, the brain was long considered an immune privileged site, meaning it was thought not to be affected by any of these immune processes occurring in the blood. However, over the last decades the idea that the central nervous system (CNS) is protected from inflammation has been challenged from two directions. On the one hand, there are CNS diseases such as Alzheimer’s disease and multiple sclerosis which are characterised and directly driven by immune responses and inflammation in the brain. On the other hand, it has been noted that peripheral immune responses, happening elsewhere in the body, also influence brain health. One example is “sickness behaviour”, a state of depressed mood, fatigue, and disrupted appetite associated with diseases. It drives sick individuals to rest and thus allows energy to be redirected to the immune system to combat pathogens. This is why you may feel lethargic and yucky when you get the flu. Interestingly, there is an overlap between the symptoms of “sickness behaviour” and major depressive disorder (MDD) which has led researchers to investigate the role of the immune system in neuropsychiatric disorders. 

MDD is a severe form of depression with symptoms that include loss of appetite, sleep disruption, fatigue and feelings of worthlessness. With a greatly increased risk of suicide (Chesney, Goodwin & Fazel, 2014) and around 30% of patients not responding to standard treatments (Rush et al., 2006), it is troubling how little we know about this disease. Although there is a genetic component, it seems to interact with environmental factors, such as stress and trauma, to produce full-blown MDD (Caspi et al., 2006). 

Additionally, several studies have described a direct link between inflammation and MDD. For example, the incidence of MDD is increased in patients suffering from inflammatory diseases such as rheumatoid arthritis (Dickens et al., 2002) or cancer (Linden et al., 2012) and vice versa. Also, approximately one-third of people struggling with MDD show increased inflammatory biomarkers in the absence of medical disease (Liu, Ho and Mak, 2012). Lastly, variants in some genes associated with inflammation are also associated with increased risk of MDD (Gałecki et al., 2012). With this growing body of evidence, it is becoming clear that peripheral inflammation matters greatly when it comes to understanding the biological mechanisms driving depression.

To appreciate how peripheral inflammation can affect the brain, we will follow two such pro-inflammatory mediators on their journey: IL-1 and TNF.

Naturally, we start at the beginning – the moment when our soldier immune cells are activated by the intrusion of pathogens. As a first line of defence, they produce and release a variety of pro-inflammatory substances, the job of which is to attract more specialised immune cells and direct the immune response. This is crucial, since different types of these specialised cells are trained to respond to different types of pathogens more efficiently than the brute force offered by the soldier cells. You wouldn’t want an air force unit to deal with a marine attack. This is where IL-1 and TNF come in. 

During a regulated immune response, these cytokines alert your specialised forces, before being removed by anti-inflammatory signals to resolve inflammation and restore homeostasis. Although these inflammatory processes are great at dealing with pathogens, they also inflict a lot of self-damage which can easily be repaired after the inflammation has died down… assuming it does die down. If it doesn’t, this is known as chronic inflammation and can cause a lot of issues since the body is unable to repair itself. Indeed, studies consistently report high levels of pro-inflammatory mediators in patients struggling with depression (Dahl et al., 2014) and inflammation can even predict symptoms of depression later in life (Khandaker et al., 2014).

So, as now we know their role in an immune response, how could IL-1 and TNF affect mental health?

Crucially, they need to affect brain-resident cells. If they are present in the body in very high quantities, they can directly cross into the CNS. Considering the brain is the master regulator of the whole body and mediators are typically kept in a delicate balance, this can be dangerous. This is illustrated by the fact that while low doses of IL-1 are needed for memory formation, high levels as observed during inflammation actually impair the same process (Kelly et al., 2003), which is reflected in memory disturbances experienced by individuals with MDD (Lam et al., 2014).

Within the brain, IL-1 also activates the hypothalamic-pituitary-adrenal (HPA) axis, a complex system of glands in the brain and in the abdomen that influence each other and regulate how the body responds to stress. This axis is thought to be overactivated in MDD, causing an overproduction of stress-related hormones. Stress hormones once again have been linked to impaired memory formation in animal models (Alfarez, Joëls and Krugers, 2003).

Stress hormones also affect the release of neurotransmitters (brain hormones) and how they act on their receptors. One example is the serotonin 1A receptor, the expression of which in the brain is decreased by stress hormones (Meijer et al., 2001). This receptor normally binds to serotonin, the “happy chemical” that is known to play a role in depression.

When two neurons interact with each other in what is called a synapse, the one sending out a signal is called presynaptic, and the receiving neuron is called postsynaptic. Both of these neurons can express the serotonin 1A receptor, but it plays different roles in each. When the receptor is expressed on the firing (presynaptic) neuron, it senses serotonin released by that very same neuron and dampens further release, preventing serotonin from reaching the receiving neuron. This is bad, as serotonin can only regulate your mood by actually reaching the postsynaptic neuron. This presynaptic receptor is acted on by selective serotonin reuptake inhibitors (SSRIs, drugs prescribed for MDD), which cause its desensitisation, making serotonin available to the receiving neuron and alleviating symptoms of depression.

In contrast, the same receptor on the receiving neuron does not seem to be affected by SSRIs, which is good, as this receptor needs to bind serotonin to transmit the “happy signal”. This receptor however is affected by stress hormones, and a decrease of the postsynaptic receptor numbers has been linked to suicide by depression (Cheetham et al., 1990). 

IL-1 and TNF can also affect the brain without even gaining access. That is because they can interact with cells lining the barrier to the brain and cause them to produce other mediators that then act on brain-resident cells. In experiments with mice, researchers found that the injection of TNF triggers these brain-bordering cells to produce a lipid that has also been detected in the fluid surrounding the brain of depressed patients (Linnoila et al., 1983). Within the CNS this lipid stimulates the production of other pro-inflammatory cytokines that will further disturb the balance and has also been implicated in the activation of the HPA axis that we discussed before (García-Bueno, Serrats and Sawchenko, 2009).

In one last trick, IL-1 and TNF do not even have to be in close proximity to the brain to affect mental health: they can stimulate the vagus nerve which runs between the brain and the abdomen and relays information from the periphery into the brain. This is the major component of the parasympathetic nervous system involved in the control of heart rate, digestion, mood, and immune responses. In rats, researchers showed that peripheral IL-1 activates the afferent vagus nerve (Hansen et al., 2001), thereby informing the brain of inflammation detected in the body. By disrupting signalling through this nerve, the researchers demonstrated that they could prevent sickness behaviour in rats without affecting the amount of circulating IL-1 (Bluthé et al., 1994).

So, since IL-1 and TNF are thought to be key players involved in the symptoms of MDD, could they be a potential therapeutic target for MDD? Indeed, various anti-inflammatory agents have demonstrated anti-depressive effects: data from clinical trials in which people received medications that block inflammatory cytokines to treat medical diseases showed significant improvement of depressive symptoms (Kappelmann et al., 2018). Similar results were reported from another study, in which 5 out of 6 anti-inflammatory drugs improved depressive symptoms compared with placebo (Köhler‐Forsberg et al., 2019). While these are important findings, studies have failed to show consistent associations between inflammatory cytokines and disease severity, suggesting that only certain subtypes of depression are based on or exacerbated by inflammatory processes. Indeed, only a fraction of patients struggling with mental diseases present with markers of peripheral inflammation such as IL-1 and TNF. 

A study comparing cancer patients receiving pro-inflammatory cytokine therapy with MDD patients with no physical illness showed an overlap in depressive symptoms. However, this study also showed that psychomotor retardation and weight loss were stronger in depressed cancer patients while increased feelings of guilt were stronger in (otherwise healthy) MDD patients (Capuron et al., 2009). While this study investigated a very small number of people, the results fit with the idea of a high heterogeneity of mood disorders that would respond to different types of treatments. It is unclear at the moment whether the group of patients that benefits from anti-inflammatory drugs overlaps with the group of people that does not respond to traditional MDD medication.

To date, most studies of the interaction between immune activation and psychiatric diseases provide correlational evidence rather than causal relationships. While animal studies can show the direct production of depressive symptoms following cytokine exposure, such results cannot directly be translated into the complex reality of human mood disorders. Nevertheless, the link between peripheral inflammation and mental health can help explain mental side effects of immune activating therapeutics for the treatment of cancer. The mental health of patients receiving such medication should carefully be monitored.

In the future, continuous research into how exactly inflammatory mediators could trigger or exacerbate mental conditions will not only help us understand mental disease heterogeneity, but potentially also result in the development of immune-based strategies that might help improve the lives of mental health patients that do not respond to current therapeutics.

Editors: Steliana Yanakieva and Katie Sedgewick


  • Alfarez, D. N., Joëls, M. and Krugers, H. J. (2003) ‘Chronic unpredictable stress impairs long-term potentiation in rat hippocampal CA1 area and dentate gyrus in vitro’, European Journal of Neuroscience. John Wiley & Sons, Ltd, 17(9), pp. 1928–1934. doi: 10.1046/j.1460-9568.2003.02622.x.
  • Bluthé, R. M. et al. (1994) ‘Lipopolysaccharide induces sickness behaviour in rats by a vagal mediated mechanism’, Comptes Rendus de l’Academie des Sciences – Series III, 317(6), pp. 499–503.
  • Capuron, L. et al. (2009) ‘Does cytokine-induced depression differ from idiopathic major depression in medically healthy individuals?’, Journal of Affective Disorders. NIH Public Access, 119(1–3), pp. 181–185. doi: 10.1016/j.jad.2009.02.017.
  • Cheetham, S. C. et al. (1990) ‘Brain 5-HT1 binding sites in depressed suicides’, Psychopharmacology. Springer-Verlag, 102(4), pp. 544–548. doi: 10.1007/BF02247138.
  • Dahl, J. et al. (2014) ‘The plasma levels of various cytokines are increased during ongoing depression and are reduced to normal levels after recovery’, Psychoneuroendocrinology. Elsevier Ltd, 45, pp. 77–86. doi: 10.1016/j.psyneuen.2014.03.019.
  • Dickens, C. et al. (2002) ‘Depression in rheumatoid arthritis: A systematic review of the literature with meta-analysis’, Psychosomatic Medicine. Lippincott Williams and Wilkins, pp. 52–60. doi: 10.1097/00006842-200201000-00008.
  • Gałecki, P. et al. (2012) ‘The expression of genes encoding for COX-2, MPO, iNOS, and sPLA2-IIA in patients with recurrent depressive disorder’, Journal of Affective Disorders. Elsevier, 138(3), pp. 360–366. doi: 10.1016/j.jad.2012.01.016.
  • García-Bueno, B., Serrats, J. and Sawchenko, P. E. (2009) ‘Cerebrovascular cyclooxygenase-1 expression, regulation, and role in hypothalamic-pituitary-adrenal axis activation by inflammatory stimuli’, Journal of Neuroscience. Society for Neuroscience, 29(41), pp. 12970–12981. doi: 10.1523/JNEUROSCI.2373-09.2009.
  • Hansen, M. K. et al. (2001) ‘The contribution of the vagus nerve in interleukin-1β-induced fever is dependent on dose’, American Journal of Physiology – Regulatory Integrative and Comparative Physiology. American Physiological Society, 280(4 49-4). doi: 10.1152/ajpregu.2001.280.4.r929.
  • Kappelmann, N. et al. (2018) ‘Antidepressant activity of anti-cytokine treatment: A systematic review and meta-analysis of clinical trials of chronic inflammatory conditions’, Molecular Psychiatry. Nature Publishing Group, 23(2), pp. 335–343. doi: 10.1038/mp.2016.167.
  • Kelly, Á. et al. (2003) ‘Activation of p38 plays a pivotal role in the inhibitory effect of lipopolysaccharide and interleukin-1β on long term potentiation in rat dentate gyrus’, Journal of Biological Chemistry. American Society for Biochemistry and Molecular Biology, 278(21), pp. 19453–19462. doi: 10.1074/jbc.M301938200.
  • Khandaker, G. M. et al. (2014) ‘Association of serum interleukin 6 and C-reactive protein in childhood with depression and psychosis in young adult life a population-based longitudinal study’, JAMA Psychiatry. American Medical Association, 71(10), pp. 1121–1128. doi: 10.1001/jamapsychiatry.2014.1332.
  • Köhler‐Forsberg, O. et al. (2019) ‘Efficacy of anti‐inflammatory treatment on major depressive disorder or depressive symptoms: meta‐analysis of clinical trials’, Acta Psychiatrica Scandinavica. Blackwell Publishing Ltd, 139(5), pp. 404–419. doi: 10.1111/acps.13016.
  • Lam, R. W. et al. (2014) ‘Cognitive dysfunction in major depressive disorder: Effects on psychosocial functioning and implications for treatment’, Canadian Journal of Psychiatry. Canadian Psychiatric Association, pp. 649–654. doi: 10.1177/070674371405901206.
  • Linden, W. et al. (2012) ‘Anxiety and depression after cancer diagnosis: Prevalence rates by cancer type, gender, and age’, in Journal of Affective Disorders. Elsevier, pp. 343–351. doi: 10.1016/j.jad.2012.03.025.
  • Linnoila, M. et al. (1983) ‘CSF Prostaglandin Levels in Depressed and Schizophrenic Patients’, Archives of General Psychiatry. American Medical Association, 40(4), pp. 405–406. doi: 10.1001/archpsyc.1983.01790040059008.
  • Liu, Y., Ho, R. C. M. and Mak, A. (2012) ‘Interleukin (IL)-6, tumour necrosis factor alpha (TNF-α) and soluble interleukin-2 receptors (sIL-2R) are elevated in patients with major depressive disorder: A meta-analysis and meta-regression’, Journal of Affective Disorders. Elsevier, pp. 230–239. doi: 10.1016/j.jad.2011.08.003.
  • Meijer et al. (2001) ‘Transcriptional Repression of the 5-HT1A Receptor Promoter by Corticosterone Via Mineralocorticoid Receptors Depends on the Cellular Context’, Journal of Neuroendocrinology, 12(3), pp. 245–254. doi: 10.1046/j.1365-2826.2000.00445.x.

The Mad King of England: Neuroscience behind the Royal Malady

Ian Fox | 17 MAR 2021

George III was King of Great Britain and Ireland from 1760 to 1820. He ascended the throne of Britain when he was only 22 years old and he reigned for almost 60 years – making him one of Britain’s longest-ruling monarchs. His reign was marked by great national unrest, including the loss of the American War of Independence and then – only a few years later – the constant threat of invasion by Napoleonic France. Under his leadership, Britain navigated through the storm of war, eventually triumphing over France in 1815, and this brought about a 100-year-long peace in Europe – known as ‘Pax Britannica’.

Now, I know what you are thinking – what does this have to do with neuroscience and the brain? Well, despite George III’s great political achievements, he is most commonly remembered in history as the ‘Mad King of England’ (Rohl, Warren, & Hunt, 1998). Although he was one of Britain’s longest reigning monarchs, George III was recorded to be relatively weak, both physically and mentally, especially during the latter half of his life. He suffered from a series of ailments, including ‘flying gout’, colic, insomnia, delirium, and acute mania (Rohl et al., 1998). Interestingly, he was also afflicted with hallucinations and delusions, often believing that two of his children – who had died in childhood – were still alive. It was also reported that he would talk rapidly and incoherently, apparently chattering endlessly for over 70 hours before he died, but this is disputed (Brooke, 1972). Altogether, he suffered from four, possibly five, manic attacks, the last of which occurred between 1810-1820 and eventually claimed his life.

It is clear then that the King was gripped by a terrible illness that overwhelmed his mind. But what exactly caused George III to go ‘mad’? This question has interested many historians and scientists for centuries, but to this day no one has solved the case of George III and ‘the Royal Malady’. For many years, the official diagnosis of the King was ‘madness’, a term widely used by 18th-century medical professionals to describe any patient suffering from any form of mental disturbance causing a drastic change in character and personality (Rohl et al., 1998). Psychiatrists in the 19th century would eventually change this diagnosis to ‘mania’ which, as you can imagine, is still not very helpful, since ‘mania’ could be applied to any mental disorder causing rapid mood changes (Ray, 1855).

Whenever God of his infinite goodness shall call me out of this world, the tongue of malice may not paint my intentions in those colours she admires, nor the sycophant extol me beyond what I deserve.

King George III


It would not be until 1966 that the case of George III and the Royal Malady would suddenly be reopened, when two British psychiatrists, Ida Macalpine and Richard Hunter, proposed an intriguing and controversial hypothesis. They suggested that George III suffered from an acute and rare genetic disorder known as ‘porphyria’ (Macalpine & Hunter, 1966). Porphyria itself is not considered a disorder of the brain in the classical sense, such as Alzheimer’s disease and Parkinson’s disease. Instead, porphyria is actually a blood disorder that, in some instances, can cause detrimental effects on the brain. Macalpine and Hunter supported their theory with a retrospective diagnosis, based on the fact that the King suffered from sporadic attacks with unusual symptoms that can also be found in modern-day porphyria patients. For instance, the King would experience periods of severe and rapid mood changes, such as going from a state of depression to intense manic episodes. These derangements would also coincide with other health problems, such as gall stones, ‘bilious attacks’, chest infections, and most importantly – the dark discolouration of his urine, which is a hallmark feature of porphyria (Macalpine & Hunter, 1966). Like porphyria patients, the King would often recover from these attacks; his mental health would improve, and the colour of his urine would return to normal. After his first major attack in 1788, he did not relapse until 1795 – almost 7 years later. However, from then on, the relapse-free periods would become shorter and the symptoms would worsen; the King’s psychotic episodes would become more intense and he even gradually lost his sight (Macalpine & Hunter, 1966).

As previously stated, porphyria is a genetic disorder and therefore it is often inherited. With this in mind, and to further support their claim that George III suffered from porphyria, Macalpine and Hunter traced signs of the disease in some of the King’s ancestors, notably Mary Queen of Scots. Additionally, they also traced porphyria in some of George III’s descendants, including possibly Queen Victoria and two members of the current royal family, although their identities remain undisclosed (Rohl, Warren, & Hunt, 1998). Ida Macalpine and Richard Hunter were simultaneously praised and despised upon the publication of this new theory. Many claimed it was a meaningful breakthrough, whilst others contested their diagnosis, suggesting that their interpretation was misleading and, in some cases, fraudulent (Peters, 2011). Nevertheless, there is no doubt that the publication of Macalpine and Hunter’s theory sparked huge public interest in porphyria, as demonstrated by the publication of the best-selling books ‘George III and The Mad Business’ and ‘The Purple Secret’. However, not much is actually known about porphyria, especially at the neuroscientific level. How, for example, does porphyria affect the brain, and how does it cause ‘madness’?

As stated before, porphyria is a blood disorder, but more specifically it is a disease that affects how blood is made. Porphyria itself is not a single disorder but refers to a group of eight disorders, each of which is characterised by the unique effect it has on how blood is produced (Meyer, Schuurmans, & Lindberg, 1998). Blood, obviously, is very important, and without it our tissues and cells would not be able to receive oxygen. Oxygen is transported in the blood by haemoglobin, whose iron-containing heme groups bind oxygen and give blood its characteristic red colour. Heme is made through the heme biosynthetic pathway – a multistep process in which several intermediate molecules (heme precursors) are processed by specific enzymes and eventually converted into heme. The proper function of these enzymes is imperative to the production of heme – if one enzyme is faulty, then the heme precursors cannot be properly processed. Porphyria is caused when one of these enzymes becomes faulty, either through genetic changes (i.e. mutations) or sometimes through the use of specific drugs (Elder, Gray, & Nicholson, 1972). The heme precursors then begin to accumulate in the body and eventually become toxic, causing problems such as motor neuropathy (difficulty moving), gastrointestinal distress, skin lesions, and of course – neuropsychiatric problems (Meyer et al., 1998). It is important to note that not all of the porphyrias cause neuropsychiatric problems, but the ones that do are called the ‘neuroporphyrias’ (Lin et al., 2008).

So, what exactly are the neuropsychiatric symptoms of the neuroporphyrias, and what causes them? As mentioned previously, porphyria is not traditionally seen as a neurological disorder. This, together with the large variation in neuropsychiatric symptoms between patients, means that the brain-based symptoms of the disorder have not been well characterised. Kuo et al. (2011) were among the first to clearly define several neurological manifestations in patients with acute intermittent porphyria (AIP): out of 12 patients, 8 had ‘conscious disturbances’, 4 had seizures, and 7 had motor paresis (weakness or inability to move the arms or legs) (Kuo et al., 2011). Other researchers have also reported aggression, psychosis, and hallucinations, as well as a high comorbidity with other disorders such as depression and schizophrenia (Suh et al., 2019). Nerve atrophy (i.e. the degeneration of brain tissue over time) has also been reported in AIP patients, especially in the parieto-occipital lobes – the brain areas that control vision and where visual information is processed (Suh et al., 2019).

Some AIP patients also display a condition called posterior reversible encephalopathy syndrome (PRES) (Zhao, Wei, Wang, Chen, & Shang, 2014). PRES also affects the parieto-occipital lobes, degenerating the white matter in the region. White matter allows nerves in the brain to relay signals at a much faster speed; in other words, it acts as an insulator for nerve signalling. MRI scans from AIP patients with PRES show bright white lesions in the parieto-occipital lobes, signifying the loss of white matter, which causes patients to have headaches, seizures, and visual abnormalities (Suh et al., 2019; Zhao et al., 2014). Blindness is not unusual in cases of porphyria, and so it was with George III – could PRES explain why the King lost his sight, and does this further support the ‘porphyria hypothesis’? White matter atrophy has also been observed in the nerves that control the movement of the arms and legs, which may explain why many porphyria patients experience motor paresis.

It is quite clear, then, that porphyria is more than just a blood disorder: it can have a detrimental impact on the functioning of the brain. But how exactly do the heme precursors exert their toxic effect on the brain? What are the underlying mechanisms behind the neuroporphyrias? The answers have remained elusive for many decades, and to this day the exact mechanisms are poorly understood. One of the leading theories is that one of the heme precursors behind porphyria, called aminolevulinic acid (or ‘ALA’), looks and acts very similarly to the neurotransmitter γ-aminobutyric acid (or ‘GABA’). GABA is vital for regulating nerve signalling activity, and it has been theorised that ALA might interfere with normal GABA function (Windebank & Bonkovsky, 2005). ALA may therefore impair how nerves send their signals around the brain, which could result in headaches, seizures, and possibly psychiatric problems such as hallucinations and delusions.

Experiments have also been conducted on cells grown in petri dishes. These have shown that elevated ALA levels cause other harmful chemicals to build up and damage the cells (Kazamel, Desnick, & Quigley, 2020). ALA also damages the cells’ DNA and reduces their energy production – very harmful indeed! Researchers have also tried to model porphyria in mice, with some success: when the ALA enzyme is genetically engineered to have reduced functionality, the mice show motor symptoms similar to those of porphyria patients, as well as damage to the brain (Kazamel et al., 2020). Although we still do not understand much about the mechanisms of porphyria, perhaps with this animal model researchers can begin to describe the exact neuroscientific underpinnings of the disorder.

There are still many questions that neuroscientists are asking about porphyria. For example, why is the white matter targeted? Why does porphyria seem to affect the posterior regions of the brain that control visual processing? And above all, why are some patients more affected than others? Although the porphyria theory about George III is still, well – a theory, and many researchers believe that George III instead suffered from bipolar disorder (Peters, 2011), there is no doubt that ever since the publication of Macalpine and Hunter’s paper, interest in porphyria has only grown. From the perspective of a neuroscientist, porphyria becomes more and more interesting the more we unveil about it, and perhaps soon we will acquire a full picture of the disorder that may (or may not!) have driven one of Britain’s greatest kings to madness.

Editors: Matt Higgs and Uroosa Chughtai


Why do we Parent? Ancient Brain Circuits for Parental Care

Matt Higgs | 03 MAR 2021

I for one predicted that all the extra time couples have spent inside last year would result in a 2021 baby boom. In fact, PwC predicts that we might actually be facing a baby bust, since couples are postponing their pregnancy plans. While the factors causing couples to delay or bring forward pregnancy are interesting, the question that interests me as a neuroscientist is perhaps more fundamental – I want to understand what happens in our brain to motivate us to care for our children. Essentially – what is parenting and why do we do it?

Now this may seem a silly question, but bear with me. It is currently estimated that raising a human child will cost £152,747 – £185,413 (Hirsch, 2020), consume an estimated 13 million calories, require a lot of attention, and cost you many hours of sleep over an 18-year period (Kohl, 2018). By the numbers, parenting is tough. It is obviously not without upsides, since – quite predictably – sociological research shows that being a parent “has a substantial and enduring positive effect on life satisfaction” (Pollmann‐Schult, 2014). But for many people the work of being a parent outweighs the positives, and, on average, parents tend to be similarly or less happy than their childless peers (Glass et al., 2016). Yet many still willingly choose the burden of parenthood. What in our brain motivates us to do this?

When we look across the animal kingdom, we see that many animals have a set of behaviours directed at their offspring to support their survival. This is particularly true of the class of animals that humans belong to – mammals. Mammalian offspring are particularly helpless, and mammals are the only animals that feed their young directly from the teat. This means that parent and offspring are tied into an intimate relationship from birth, and seeing offspring through to adulthood requires a lot of parental motivation. Since this intensive parenting behaviour is crucial to the continuation of a species, yet is poorly rewarding and distinctly sacrificial for the caregiver, parenting is seen as an innate behaviour in most mammals (i.e. a neurally hardwired behaviour that an animal is able to perform, at least partially, in advance of experience). Tellingly, most mammals are able to take up parenting with little training or experience: they suddenly become motivated to care for their young and know how to feed and protect them. This combination of motivation for and instinctual knowledge of parenting suggests to neuroscientists that the basis of this behaviour across mammals is likely to be evolutionarily shaped neural circuits, ready to motivate us when we first become parents.

And this is exactly what has been found.

By focusing on mice and rats, and utilising the abundance of behavioural neuroscience technology available to them, scientists have been able to identify what is happening in rodent brains during parenting behaviour. Like humans and most other mammals, mice are heavily motivated to care for their offspring post-birth. This involves a more modest 3-5 weeks of feeding, grooming and protection, but they perform these behaviours diligently in spite of the costs. So, what exactly is happening in their brains?

Several decades of research on rodents has shown that a structure within the hypothalamus (a region commonly associated with regulating core functions such as body temperature, sleep and appetite) called the medial preoptic area (MPOA) is of central importance for some of the most fundamental motivated behaviours, such as mating and parenting (Numan & Insel, 2003). When researchers specifically destroy this area of the brain, parenting behaviour is abolished whilst other behaviours are left intact (Lee et al., 2000). This is all well and good, but the scientists Johannes Kohl and Catherine Dulac recently went one step further and identified the specific neuronal populations within the MPOA in mice that are active during parenting behaviour – aka the parenting neurons.

These neurons were found to express the protein galanin, which has become a useful marker for identifying them. They make up a comparatively small population (around 10,000 neurons), especially when compared to the roughly 100 million neurons of the whole mouse brain (Wu et al., 2014). Are these neurons really the hub of such a crucial behaviour? Specifically inactivating them in mice disrupted parenting behaviour much as destroying the whole MPOA does, which was a good start. Taking it further, Kohl et al. (2018) used viruses to infect the galanin neurons in the MPOA, allowing them to label and visualise the neurons providing input to, and receiving output from, the MPOA. This revealed the brain regions connected to the MPOA, which together form the basis of a parenting circuit in the brain. The galanin neurons were the hub of this circuit and, when tested, were active during all forms of parenting behaviour, while other parts of the circuit were only active for distinct parts of the parenting repertoire. For example, MPOA neurons projecting to the ventral tegmental area and nucleus accumbens (key dopamine regions of the brain) are responsible for ‘motivating’ the parent to care for their offspring, while those projecting to the periaqueductal gray (a region associated with motor control) are involved in mechanical behaviours such as grooming pups.

This parenting circuit appears to be present in both males and females, but in females the MPOA is heavily influenced by the rising hormone concentrations of late pregnancy (e.g. estrogen, prolactin). These hormones travel through the maternal bloodstream and are detected by receptors in the MPOA. This hormonal signal primes the MPOA, and the mother, for caregiving behaviour just prior to birth (Rilling & Young, 2014). Life experience and physiological state also go a long way towards changing these behaviours and circuits, despite their evolutionary origin (Kohl, 2018). Yet this core motivational circuit is a powerful driver and likely does a lot of the heavy lifting to convince a mouse to neglect its own wellbeing in favour of its offspring.

One of the core goals of behavioural neuroscience is to understand the neural mechanisms and evolution of complex social behaviours. The discovery of this neural network orchestrating parenting behaviour, with a central coordinating hub (the MPOA) and pools of neurons responsible for producing distinct aspects of this complex behaviour, gives an insight into how neural circuits for complex behaviours can be organised. This finding can therefore be drawn upon when investigating other motivated behaviours such as mating and feeding (Kohl, 2020).

However, to return to our question – why do we parent? Humans are not mice, and the existence of this parenting hub in the human hypothalamus is not confirmed. But since the hypothalamus is deeply conserved across mammals, and is of central importance to all non-human mammals studied (Numan, 2017), it is highly likely that this circuit exists in humans and performs a similar function.


For humans, parenting behaviour was never going to be as simple as the output of a highly conserved hypothalamic circuit. For example, we know there is a strong sense of intentional, chosen effort in rising to the task of parenting. We also know from brain imaging studies that parenting behaviour in humans additionally relies on cortical areas of the brain, likely infusing our parenting with feelings of love, devotion and care (Numan, 2017; Rilling, 2013). Nevertheless, this work suggests that an instinctual urge to care for children is driven by an evolutionarily conserved brain structure, fundamental not only to most mammals but perhaps to humans as well.

This research is still a way off having clinical applications, but considering the impact that aberrant parenting can have on both parent and offspring (Joseph & John, 2008; Letourneau et al., 2012), understanding this behaviour is a crucial step in the right direction. As to why we parent: naturally, we have the capability to decide whether to be a parent or not, and many people choose to skip the ordeal completely, but it is highly likely that deep in your brain lies an ancient circuit that will contribute to the great reward and meaning that being a parent will likely bring you, if you choose that path in life.

Editors: Ian Fox and Uroosa Chughtai


  • Glass, J., Simon, R. W., & Andersson, M. A. (2016). Parenthood and happiness: Effects of work-family reconciliation policies in 22 OECD countries. American Journal of Sociology, 122(3), 886-929.
  • Hirsch, D. (2020). The Cost of a Child in 2020. Child Poverty Action Group.
  • Joseph, M. V., & John, J. (2008). Impact of parenting styles on child development. Global Academic Society Journal: Social Science Insight, 1(5), 16-25.
  • Kohl, J. (2018). Circuits for care. Science, 362(6411), 168-169.
  • Kohl, J., Babayan, B. M., Rubinstein, N. D., Autry, A. E., Marin-Rodriguez, B., Kapoor, V., Miyamichi, K., Zweifel, L. S., Luo, L., & Dulac, C. (2018). Functional circuit architecture underlying parental behaviour. Nature, 556(7701), 326-331.
  • Kohl, J. (2020). Parenting – a paradigm for investigating the neural circuit basis of behavior. Current Opinion in Neurobiology, 60, 84-91.
  • Lee, A., Clancy, S., & Fleming, A. S. (2000). Mother rats bar-press for pups: effects of lesions of the MPOA and limbic sites on maternal behavior and operant responding for pup-reinforcement. Behavioural Brain Research, 108(2), 215-231.
  • Letourneau, N. L., Dennis, C. L., Benzies, K., Duffett-Leger, L., Stewart, M., Tryphonopoulos, P. D., … & Watson, W. (2012). Postpartum depression is a family affair: addressing the impact on mothers, fathers, and children. Issues in Mental Health Nursing, 33(7), 445-457.
  • Numan, M. (2017). Parental behavior. In Reference Module in Neuroscience and Biobehavioral Psychology.
  • Numan, M., & Insel, T. R. (2003). The Neurobiology of Parental Behavior (Vol. 1). Springer Science & Business Media.
  • Pollmann‐Schult, M. (2014). Parenthood and life satisfaction: Why don’t children make people happy? Journal of Marriage and Family, 76(2), 319-336.
  • Rilling, J. K. (2013). The neural and hormonal bases of human parental care. Neuropsychologia, 51(4), 731-747.
  • Rilling, J. K., & Young, L. J. (2014). The biology of mammalian parenting and its effect on offspring social development. Science, 345(6198), 771-776.
  • Wu, Z., Autry, A. E., Bergan, J. F., Watabe-Uchida, M., & Dulac, C. G. (2014). Galanin neurons in the medial preoptic area govern parental behaviour. Nature, 509(7500), 325-330.

The Sixth Sense: How Your Brain Tells Time

Steliana Yanakieva | 17 FEB 2021

Every day we experience the world through our senses – we see colours, hear sounds, taste, and smell food, and feel the sun or the rain on our skin. But how do we sense time? It is certain that we do experience a sense of time, both consciously, for example when we look at a clock, and subconsciously (e.g. in the order in which we do things), and that our sense of time is highly integrated with our other senses. However, even if we lost our ability to see, smell or hear, we would still have a sense of time passing.

Sensing time (time perception) seems to be a product of evolution. As far as we know, humans are the only species consciously aware of the passage of time and our own mortality. Despite how it might seem, we do not perceive time itself, but rather we perceive changes in events occurring in time. Hence, unlike our other senses, time perception does not have a dedicated sensory system. Instead, time is a construction of the brain that enables us to perceive a unified sensory picture of the world, which underlies our conscious experience. This idiosyncratic sixth sense is fundamental to our understanding of sequential events, allowing us to perceive our lives as an uninterrupted stream of events.

We are only truly aware of a few seconds of time at any one moment, a phenomenon termed the “specious present” by E. R. Clay and later elaborated by William James (James, 1890). For example, whilst we can plan for events that have not yet occurred, we are incapable of perceiving durations in the future. In fact, durations of events (intervals) can only be perceived after they have ended, so technically the “specious present” moment you are aware of has already happened. David Eagleman (2009) explains this in his famous essay “Brain Time”. He argues that different types of information are not only processed by distinct neural pathways, but also at different speeds. In order to perceive a continuous, unified picture, our brain has to overcome this difference by waiting for the slowest sensory information to arrive before making us aware of what is happening ‘now’. This delay of around 100 milliseconds allows us to watch TV unaware that our brain processes auditory stimuli faster than visual stimuli. So, if you have ever experienced the frustration of unsynchronised TV audio and video, the mismatch has exceeded the roughly 100 milliseconds your brain is able to smooth over.

In a cognitive sense, attention is a mental process that allows you to selectively attend to information relevant to completing a task. Hence, if you are in a boring class, thinking about how long is left until the end of it, you will be more aware of the passage of time and therefore overestimate its duration (i.e. time appears to pass more slowly). On the other hand, time will appear to pass much faster when you are having fun. This goes to show that intact perception of small intervals of time is essential to our day-to-day functioning.

Mechanisms of Time Perception

In psychology, the timing of durations in the milliseconds-to-seconds range is referred to as interval timing. Impairments of interval timing have been observed in psychiatric disorders marked by disruptions of consciousness, such as schizophrenia (Allman & Meck, 2012) and the dissociative disorders (Simeon et al., 2007; Spiegel et al., 2013), as well as in Parkinson’s disease (te Woerd et al., 2014; Gulberti et al., 2015) and Huntington’s disease (Beste et al., 2007). Specifically, impairments in time perception are associated with symptoms such as tremor and hallucinations. Understanding the neural mechanisms of interval timing would therefore allow scientists to develop new therapies for symptoms underlain by timing deficits. Over the years, there have been several theories about the neural mechanism of time perception (Gibbon, 1977; Matell & Meck, 2004), and even though scientists cannot agree on a unified model of interval timing, one thing we know for sure is that time perception is a multifaceted process, dependent on other cognitive processes, particularly attention.

Both schizophrenia and Parkinson’s disease are associated with aberrant dopamine concentrations in the brain (Brisch et al., 2014; Davie, 2008), which, interestingly, have been linked to the speed of our “internal clock” (Cheng et al., 2007). Excessive dopamine, as seen in schizophrenia, appears to lead to overestimation of time intervals, whilst dopamine depletion, as seen in Parkinson’s disease, appears to lead to underestimation (Hass & Durstewitz, 2016; Meck, 1996). We can see these effects without relying on neurological conditions: stimulant drugs such as caffeine, cocaine and amphetamines increase brain dopamine levels and can lead to overestimation of time intervals, while depressant drugs such as ketamine have the opposite effect, likely through the influence such psychoactive substances have on attention.

One way of understanding this phenomenon is that psychoactive drugs either excite or inhibit the firing of dopaminergic neurons in the brain. Whilst stimulants increase the rate of neuronal firing, allowing the brain to register more events within a given time interval and leading to the perception of time speeding up, inhibitory drugs decrease the firing rate of neurons, resulting in a slowing down of perceived time. However, since such drugs also affect attention, it is difficult to disentangle whether the observed effects on timing are due to the dopaminergic manipulation per se, or are caused indirectly by increased or decreased attention to time.
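The “internal clock” reasoning above is often formalised as a pacemaker-accumulator model, in the spirit of scalar expectancy theory (Gibbon, 1977): a pacemaker emits pulses at some rate, an accumulator counts them during an interval, and the count is read out against a reference learned under normal conditions. A minimal sketch, with all rates invented purely for illustration:

```python
# Toy pacemaker-accumulator sketch (not a validated model). The judged
# duration is the pulse count divided by a reference rate learned under
# baseline conditions; all numbers here are invented for illustration.

BASELINE_RATE = 10.0  # pulses per second under normal dopamine levels

def judged_duration(true_seconds, rate, reference_rate=BASELINE_RATE):
    pulses = rate * true_seconds    # accumulator: count pulses during the interval
    return pulses / reference_rate  # read out against the learned reference

interval = 5.0  # a real 5-second interval

normal = judged_duration(interval, rate=10.0)     # baseline clock speed
stimulant = judged_duration(interval, rate=13.0)  # faster clock (more dopamine)
depleted = judged_duration(interval, rate=7.0)    # slower clock (less dopamine)

print(normal, stimulant, depleted)  # 5.0 6.5 3.5
```

A faster pacemaker accumulates more pulses in the same real interval, so the interval is judged longer (overestimation); a slower pacemaker yields the opposite, matching the dopamine effects described above.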

A promising solution to this problem appears to be the microdosing of hallucinogens, such as LSD, which appear to alter interval time perception without marked disturbances to attention, concentration, and memory (Yanakieva et al., 2018).


Overall, our perception of time is one of the most fascinating sensory experiences. Despite a research literature on timing dating back 150 years, how our brains process time is still a mystery. Can understanding the neural basis of time perception unravel the hard problem of consciousness? Can it explain the altered states of consciousness observed in schizophrenia and the dissociative disorders? Are animals aware of the passage of time, and does this make them conscious? These are just a few of the questions that remain to be answered. Each scientific experiment raises more questions than answers, all highly intriguing and deserving of attention.

Editors: Matt Higgs and Uroosa Chughtai


  • Allman, M. J., & Meck, W. H. (2012). Pathophysiological distortions in time perception and timed performance. Brain, 135(3), 656–677. doi:10.1093/brain/awr210
  • Beste, C., Saft, C., Andrich, J., Müller, T., Gold, R., & Falkenstein, M. (2007). Time processing in Huntington’s disease: a group-control study. PLoS One, 2, e1263.
  • Brisch, R., Saniotis, A., Wolf, R., Bielau, H., Bernstein, H., Steiner, J., et al. (2014). The role of dopamine in schizophrenia from a neurobiological and evolutionary perspective: Old fashioned, but still in vogue. Frontiers in Psychiatry, 5, 47. doi:10.3389/fpsyt.2014.00047
  • Cheng, R. K., Ali, Y. M., & Meck, W. H. (2007). Ketamine “unlocks” the reduced clock-speed effect of cocaine following extended training: evidence for dopamine-glutamate interactions in timing and time perception. Neurobiology of Learning and Memory, 88, 149-159.
  • Davie, C. A. (2008). A review of Parkinson’s disease. British Medical Bulletin, 86(1), 109-127. doi:10.1093/bmb/ldn013
  • Dormal, V., Javadi, A. H., Pesenti, M., Walsh, V., & Cappelletti, M. (2016). Enhancing duration processing with parietal brain stimulation. Neuropsychologia, 85, 272-277.
  • Eagleman, D. M. (2009). Brain time. In M. Brockman (Ed.), What’s Next? Dispatches on the Future of Science. New York: Vintage.
  • Gibbon, J. (1977). Scalar expectancy theory and Weber’s law in animal timing. Psychological Review, 84, 279–325.
  • Gulberti, A., Moll, C. K. E., Hamel, W., Buhmann, C., Koeppen, J. A., Boelmans, K., Zittel, S., Gerloff, C., Westphal, M., Schneider, T. R., & Engel, A. K. (2015). Predictive timing functions of cortical beta oscillations are impaired in Parkinson’s disease and influenced by L-DOPA and deep brain stimulation of the subthalamic nucleus. NeuroImage: Clinical, 9, 436-449.
  • Hass, J., & Durstewitz, D. (2016). Time at the center, or time at the side? Assessing current models of time perception. Current Opinion in Behavioral Sciences, 8, 238–244.
  • James, W. (1890). The Principles of Psychology. New York: Henry Holt.
  • Matell, M. S., & Meck, W. H. (2004). Cortico-striatal circuits and interval timing: coincident detection of oscillatory processes. Brain Research. Cognitive Brain Research, 21(2), 139-170.
  • Simeon, D., Hwu, R., & Knutelska, M. (2007). Temporal disintegration in depersonalization disorder. Journal of Trauma & Dissociation, 8(1), 11-24. doi:10.1300/J229v08n01_02
  • Spiegel, D., Lewis-Fernandez, R., Lanius, R., Vermetten, E., Simeon, D., & Friedman, M. (2013). Dissociative disorders in DSM-5. Annual Review of Clinical Psychology, 9, 299-326. doi:10.1146/annurev-clinpsy-050212-185531
  • te Woerd, E. S., Oostenveld, R., de Lange, F. P., & Praamstra, P. (2014). A shift from prospective to reactive modulation of beta-band oscillations in Parkinson’s disease. NeuroImage, 100, 507-519.
  • Yanakieva, S., Polychroni, N., Family, N., Williams, L. T. J., Luke, D. P., & Terhune, D. B. (2018). The effects of microdose LSD on time perception: a randomised, double-blind, placebo-controlled trial. Psychopharmacology, 236(4), 1159–1170. doi:10.1007/s00213-018-5119-x

Do you really like it? Social media on the brain.

Lauren Revie | 3 MAR 2020

Social media is something that has become commonplace in most of our lives – we wake up, we scroll the feed, we post throughout the day, like, comment, tweet and share. Most of us are familiar with the concept, despite social media sites such as Instagram and Facebook only coming into popular mainstream use in the last 15 years.

The concept of social media is simple – create an account which allows you to share and connect with friends, family and colleagues across the globe. Many modern-day relationships would cease to exist if it weren’t for the advent of social media, with around 74% of adults connecting on a daily (if not hourly) basis (Meshi, Tamir & Heekeren, 2015). Social media allows us to feel connected, less lonely and can even lead us to feel happier (Mauri et al., 2011).

According to one study, the number of friends or followers we acquire on social media is reflected in the size of brain structures involved in emotional regulation, such as the amygdala, whose volume correlates with the size of both our online and offline social networks (Kanai, Bahrami, Roylance & Rees, 2012). This could mean that interaction on social media is linked to our social perception – that if we are more social online, we may also be more socially aware offline.


However, with the power to connect and reach thousands, if not millions, of people also comes a darker side to social media. Numerous instances of online bullying, fake news, and negative impacts on mental health have been reported in recent years. Despite this, we continue to use these platforms – so what is it about these sites that make them so hard to resist?

Social media provides our brains with positive reinforcement in the form of social approval, which can trigger the same kind of neural reaction your brain would experience through behaviours such as smoking or gambling. The pathway involved – the dopamine reward pathway – is associated with behaviours that make us feel good, such as eating or exercise, and leaves us seeking further reinforcement or reward. So, in the same way that eating chocolate may release dopamine and lead you to seek more of it, so does social media. Neuroscientists have reported that social media related ‘addictions’ share similar neural activity with substance and gambling addictions (Turel et al., 2014). However, individuals who used social media sites heavily also showed differences in their brain’s inhibitory control system, which could result in lower focus and attentional abilities (Turel et al., 2014).

Cognitive neuroscientists have also shown that the rewarding things we do online, such as sharing images or receiving likes, stimulate activity in an area of the brain called the ventral striatum, which is responsible for reward processing. Notably, activity in this area in response to positive social media feedback may reflect the processing of gains in our own reputation (Meshi, Morawetz & Heekeren, 2013). This could mean that we use social media less as a means to communicate and share with one another, and more to gain social reputation in an attempt to boost our egos.

With around 5% of adolescents considered to have significant levels of addiction-like symptoms (Bányai et al., 2017), it is clear that social media use may be detrimental to our well-being as well as socially beneficial. Moving forward, users can only stay mindful of how powerful connecting online can be, for there is a dark, addictive side to the likes and shares we interact with every day.


  • Bányai, F., Zsila, Á., Király, O., Maraz, A., Elekes, Z., Griffiths, M. D., … & Demetrovics, Z. (2017). Problematic social media use: Results from a large-scale nationally representative adolescent sample. PLoS One, 12(1).
  • Kanai, R., Bahrami, B., Roylance, R., & Rees, G. (2012). Online social network size is reflected in human brain structure. Proceedings of the Royal Society B: Biological Sciences, 279(1732), 1327-1334.
  • Mauri, M., Cipresso, P., Balgera, A., Villamira, M., & Riva, G. (2011). Why is Facebook so successful? Psychophysiological measures describe a core flow state while using Facebook. Cyberpsychology, Behavior, and Social Networking, 14(12), 723-731.
  • Meshi, D., Morawetz, C., & Heekeren, H. R. (2013). Nucleus accumbens response to gains in reputation for the self relative to gains for others predicts social media use. Frontiers in Human Neuroscience, 7, 439.
  • Meshi, D., Tamir, D. I., & Heekeren, H. R. (2015). The emerging neuroscience of social media. Trends in Cognitive Sciences, 19(12), 771-782.
  • Turel, O., He, Q., Xue, G., Xiao, L., & Bechara, A. (2014). Examination of neural systems sub-serving Facebook “addiction”. Psychological Reports, 115(3), 675-695.

Frauds, fear of failure and finances: The mental health problem in academia

Lauren Revie | 10 OCT 2019

Mental health is a hot topic at the moment – and about time too. Around 1 in 6 adults will experience anxiety or depression (Mental Health Foundation, 2016), and the number of people reporting suicidal thoughts is increasing drastically (McManus et al., 2016). In response to this growing problem, we have also seen a rise in mental health charities, in support for them, and in research into mental health and psychiatric conditions. More and more people are beginning to talk openly about their mental health: how we feel, what is affecting us, and how to seek support for problems we might be experiencing. Mental health awareness and advocacy are gaining momentum, and everything *seems* to be heading the right way towards normalising the sharing of our feelings, emotions and mental state.

But what about the researchers behind the mental health statistics and breakthroughs? There is growing evidence of a mental health epidemic often hidden behind academic success, with almost half of PhD students and graduates in academia struggling with their mental health. Approximately 41% of PhD students demonstrate moderate to severe symptoms of anxiety and depression – almost threefold the rate in the general public – meaning mental health issues are rife among researchers (Evans et al., 2018).

Perhaps, then, we may attribute this to the ‘type’ of person attracted to a career in academia – highly motivated, a perfectionist, and maybe a little hard on themselves. However, research by Levecque et al. (2017) compared the incidence of these problems in PhD students with that of their highly educated counterparts in industry. The findings indicate that one in two PhD students experiences psychological distress, and one in three is at risk of developing a common psychiatric disorder – rates significantly higher than those in the comparison group.

But why might this be? Why would seemingly driven, motivated and highly successful young people be battling such staggeringly high rates of mental health problems? Levecque and colleagues (2017) attribute these statistics to the impact of research on work-family life, and found that the strongest predictors of poor mental health were job demands, lack of job control, and a supervisor’s leadership style. Others have pointed to workplace ‘bullying’ of doctoral students (English, Flaherty & English, 2018) and to a feeling of disconnection from the research community caused by unfamiliar topics or long periods of isolated work (Reeve & Partridge, 2017).

Academics and postgraduate students alike attribute mental health problems and feelings of being overwhelmed to a lack of support and to isolation. Further research by Belkhir et al. (2018) followed a group of young academics and early career researchers over four years. Participants reported that feelings of loneliness stemmed from social isolation driven by workplace culture, which prevented them from forming meaningful relationships with those in their immediate groups. They also reported feeling unable to participate in conversations with their peers and others in their field, believing they lacked both the cultural and the technical knowledge to do so.

This leads us on to an issue that many postgraduates and early career researchers can relate to: ‘imposter syndrome’. Clance and Imes (1978) first coined the term in a bid to collectively define the traits of high achievers who struggle to accept and internalise their own success. Often, someone struggling with imposter syndrome will claim to be a fraud, or will underestimate their own knowledge, attributing their success to luck or circumstance. ‘Imposters’ will often compare themselves to others and reject praise, leading to anxiety, stress and, in some cases, depression. Positive correlations have been observed between imposter syndrome and academic success, neuroticism and perfectionism – all strong traits of a postgraduate student or early career researcher. And it isn’t just them! Many senior faculty members wake up believing they will one day be ‘found out’.

Whilst the syndrome is not exclusive to academics, it is rife amongst university staff and students, and is a huge contributor to declining mental health in postgraduate education. Watson and Betts (2010) attribute feelings of imposter syndrome to three main themes in an early career researcher’s experience: fear, family and fellowship. The researchers analysed email conversations between graduate researchers, in which a fear of being discovered as a fraud appeared to be one of the main drivers of imposter syndrome. This was further exacerbated by feelings of being drawn away from family responsibilities, and by a lack of peer support or fellowship during study.

There are a number of reasons why researchers and students may feel like imposters. Firstly, academia is a competitive world: postgraduate study attracts the best of the best, so those around you are, more often than not, intelligent and over-achieving too. Combined with the constant pressure to ‘publish or perish’ and the need to justify your project and area of expertise, this can result in stress, anxiety and, often, burnout (Bothello & Roulet, 2018).

Other factors that may contribute to poor mental health in academia include difficulty with time management, organisational freedom (van Rijsingen, 2018), and perceptions of career prospects, funding opportunities and financial problems. The struggle to manage your own work and produce innovative research whilst being largely self-taught can often come at the price of mental health. Stress is suggested to stem from insecurity within this sphere – be it financial insecurity, or insecurity concerning the ‘unwritten rules’ of the lab or school – as well as from frequent evaluation and a seemingly unmanageable workload (Pyhältö et al., 2012).

All in all, the consensus seems to be that postgraduate researchers and academics alike are struggling in the university environment. The issue is beginning to be addressed more readily, but the phenomenon is not new. McAlpine and Norton (2006) note that calls for action to rectify this growing problem have generally been ad hoc rather than theory-driven (ironically!). As a result, the research that has been conducted has not been broad enough to integrate the factors that could influence outcomes in a university context. And so the cycle continues.

If you have been affected by anything in this article, please talk to a trusted friend or family member, or access help on


  • Bothello, J., & Roulet, T. J. (2018). The imposter syndrome, or the mis-representation of self in academic life. Journal of Management Studies, 56(4), 854-861.
  • Clance, P.R., & Imes, S. A. (1978). The impostor phenomenon in high achieving women: Dynamics and therapeutic intervention. Psychotherapy: Theory, Research, and Practice, 15(3), 241-247. 
  • English, S., Flaherty, A., & English, A. (2018). Gaslit! An examination of bullying on doctoral students. Perspectives on Social Work, 20.
  • Evans, T. M., Bira, L., Gastelum, J. B., Weiss, L. T., & Vanderford, N. L. (2018). Evidence for a mental health crisis in graduate education. Nature Biotechnology, 36(3), 282.
  • Levecque, K., Anseel, F., De Beuckelaer, A., Van der Heyden, J., & Gisle, L. (2017). Work organization and mental health problems in PhD students. Research Policy, 46(4), 868-879.
  • McAlpine, L., & Norton, J. (2006). Reframing our approach to doctoral programs: An integrative framework for action and research. Higher Education Research & Development, 25(1), 3-17.
  • McManus, S., Bebbington, P., Jenkins, R., & Brugha, T. (2016). Mental Health and Wellbeing in England: Adult Psychiatric Morbidity Survey 2014: a Survey Carried Out for NHS Digital by NatCen Social Research and the Department of Health Sciences, University of Leicester. NHS Digital.
  • Mental Health Foundation. (2016). Fundamental Facts about Mental Health 2015. Mental Health Foundation.
  • Pyhältö, K., Toom, A., Stubb, J., & Lonka, K. (2012). Challenges of becoming a scholar: A study of doctoral students’ problems and well-being. ISRN Education, 2012.
  • Reeve, M. A., & Partridge, M. (2017). The use of social media to combat research-isolation. Annals of the Entomological Society of America, 110(5), 449-456.
  • van Rijsingen, E. (2018). Mind Your Head #1: Let’s talk about mental health in academia.
  • Watson, G., & Betts, A. S. (2010). Confronting otherness: An e-conversation between doctoral students living with the Imposter Syndrome. Canadian Journal for New Scholars in Education/Revue canadienne des jeunes chercheures et chercheurs en éducation, 3(1).