Author: Azeem Majeed

I am Professor of Primary Care and Public Health, and Head of the Department of Primary Care & Public Health at Imperial College London. I am also involved in postgraduate education and training in both general practice and public health, and I am the Course Director of the Imperial College Master of Public Health (MPH) programme.

The Future of the Quality and Outcomes Framework (QOF) in England’s NHS

The Quality and Outcomes Framework (QOF) was introduced in 2004 as part of a new NHS GP contract, with the aim of financially rewarding general practices for delivering evidence-based standards of care. Although internationally unique when it was introduced, the QOF in the UK is now facing an uncertain future, with calls to cut it back or abolish it because of the wider challenges facing the NHS. In an article published in the journal BJGP Open, Mariam Molokhia and I discuss the role of the QOF in England’s NHS and argue for its importance in improving health outcomes and addressing public health challenges.

The Importance of Comprehensive Health Services

Primary care plays a vital role in providing comprehensive health services, covering both acute and long-term conditions. Beyond immediate patient needs, the focus should be on prevention, early diagnosis, and management of chronic diseases that contribute significantly to ill health, reduced quality of life, and increased NHS workload. Amid the COVID-19 pandemic, urgent care rightfully took precedence, but it is now crucial to restore high-quality care for long-term conditions.

The Role of QOF in Addressing Public Health Challenges

Public health challenges have underscored the importance of the QOF, especially in areas focused on secondary prevention and long-term condition management. Meeting QOF targets for conditions like type 2 diabetes leads to lower mortality rates, reduced emergency hospital admissions, and improved health outcomes. By using the QOF effectively, the NHS can alleviate pressures on other healthcare sectors and improve patient well-being.

Data Measurement and Research 

The QOF also facilitates data collection and measurement of healthcare quality, essential for planning health services, addressing health inequalities, and ensuring efficient use of public investments. The structured data entry required for QOF enables its use for clinical research, as shown during the COVID-19 pandemic. Abolishing or significantly cutting back the QOF would have far-reaching negative consequences, undermining these benefits.

Supporting Primary Care Teams and Addressing Challenges

Rather than discarding the QOF, it is crucial to support primary care teams in delivering structured care while addressing urgent patient needs. Adequate funding, including a review of funding allocation mechanisms, is necessary. Additionally, workforce issues should be addressed, promoting staff retention and expanding recruitment into new primary care roles. Integration of pharmacy and general practice services can also enhance primary care capabilities. Leveraging information technology and the wider primary care team can enable the delivery of QOF elements at scale, streamlining care processes and improving the efficiency of the QOF.

Retaining Essential Elements of QOF

While criticisms exist regarding the QOF’s reporting domains and its evaluation of important dimensions of care quality, it is essential to retain its best elements. This includes focusing on early detection and management of long-term conditions while improving support through information technology and the wider primary care team. Recent research from Scotland demonstrates that the elimination of financial incentives can lead to reductions in recorded quality of care, emphasising the importance of maintaining an effective QOF programme.

Conclusion

The Quality and Outcomes Framework (QOF) remains an integral part of England’s NHS. Despite challenges faced by the healthcare system, the QOF’s role in improving health outcomes, addressing public health challenges, and promoting comprehensive care cannot be overlooked. By adequately supporting primary care teams, addressing workforce issues, and using technology and the wider primary care team, the QOF can continue to play a crucial role in reducing health inequalities and improving health outcomes in England.

Tools for measuring individual self-care capability

Our ability to engage in self-care practices plays a crucial role in promoting overall well-being and in preventing and managing non-communicable diseases. To support individuals in assessing their self-care capabilities, many measurement tools have been developed. However, a comprehensive review specifically focusing on non-mono-disease specific self-care measurement tools for adults has been lacking. Our scoping review in the journal BMC Public Health aims to identify and characterise such tools, including their content, structure, and psychometric properties.

Shifting Emphasis and Methodology

The review encompassed a thorough search of Embase, PubMed, PsycINFO, and CINAHL databases, covering a wide range of MeSH terms and keywords from January 1950 to November 2022. The inclusion criteria involved tools that assess health literacy, capability, and performance of general health self-care practices, targeting adults. Tools exclusive to disease management or specific medical settings were excluded. A total of 38 relevant tools, described in 42 primary reference studies, were identified from a pool of 26,304 reports.

A key observation from the descriptive analysis was the temporal shift in emphasis among the identified tools. Initially, there was a stronger focus on rehabilitation-oriented tools, while more recent tools have shown a shift towards prevention-oriented approaches. This reflects a growing recognition of the importance of proactive self-care practices to maintain optimal health and prevent the onset or progression of diseases.

Additionally, the method of administering these tools has evolved over time. Traditional observe-and-interview style methods have given way to self-reporting tools, which empower individuals to actively participate in assessing their own self-care capabilities. This shift in methods recognizes the value of self-awareness and self-reflection as integral components of self-care.

Content Assessment and Limitations

To provide a qualitative assessment of each tool, the review used the Seven Pillars of Self-Care framework. This framework encompasses seven domains of self-care: health literacy, self-awareness of physical and mental well-being, self-management of health conditions, physical activity, healthy eating, risk avoidance or mitigation, and good hygiene practices. Surprisingly, only five of the 38 identified tools incorporated questions that covered all seven pillars of self-care. This finding highlights the need for a comprehensive, validated, and easily accessible tool capable of assessing a wide range of self-care practices.

While this review makes significant strides in identifying and characterising non-mono-disease specific self-care measurement tools, it does have limitations. For example, the search was limited to specific databases and only included English-language studies. Some relevant tools and studies in other languages may therefore have been overlooked.

Implications and Future Directions

The findings of this review underscore the importance of enhancing our understanding and assessment of self-care capabilities. By incorporating the Seven Pillars of Self-Care, a comprehensive tool can provide a holistic assessment, allowing for targeted health and social care interventions. Such interventions can empower individuals to improve their self-care practices, thereby promoting better health outcomes and reducing the burden of chronic diseases.

Moving forwards, future research should focus on developing a comprehensive, validated tool that encompasses a broader range of self-care practices. Additionally, efforts should be made to ensure the accessibility and usability of such a tool, considering diverse populations and their unique needs. Collaborative efforts between researchers, healthcare professionals, and technology experts can facilitate the creation of an effective and widely applicable self-care measurement tool.

Conclusion

Self-care is a fundamental aspect of promoting health and well-being across diverse populations. While several disease-specific self-care measurement tools exist, this review highlights the need for a comprehensive, validated, and easily accessible tool that assesses a wide range of self-care practices. By embracing the Seven Pillars of Self-Care framework, we can effectively evaluate individual self-care capabilities, inform targeted interventions, and empower individuals to take an active role in their health and well-being. With continued research and collaboration, we can develop tools that facilitate and support the practice of self-care, ultimately leading to improved health outcomes for individuals and communities alike.

Reducing the number of people not working due to ill health

Data published today by the Office for National Statistics shows that the number of people who are not working due to long term sickness has risen to a record high.

From previous research, we know that the most common medical problems that can prevent individuals from working include mental health issues such as depression, stress, and anxiety, as well as musculoskeletal disorders like back pain. In addition, cardiovascular diseases, respiratory conditions, cancer, and neurological disorders are among the many other medical conditions that can significantly impact a person’s ability to work.

We have also seen higher death rates than usual in recent years in the UK, which probably reflects an increase in the number of people with long-term medical problems and in the severity of those problems. A parallel challenge is demographic change, as the population ages and the disability employment gap increases with age. Reversing the trend in the number of people in the UK who are not working due to poor health will not be easy, and it requires a much more active approach and collaboration from employers, the NHS and government.

The recent increase in the number of people in the UK who are not working due to poor health is likely to be due to short- to medium-term changes rather than to longer-term problems such as rising rates of obesity. There was a rise in the number of people with mental health problems during the COVID-19 pandemic, and this will account for some of the increase. We also saw some reductions in physical activity, which would affect the number of people with musculoskeletal problems. There are also many people waiting for specialist NHS care (e.g. joint replacement surgery) who may not be able to work until they receive their treatment. Some people are also suffering from post-COVID problems (i.e. Long COVID).

Reducing the number of people in the UK who are not working due to ill-health requires a multi-faceted approach involving various stakeholders and this is a priority for government, the NHS and employers.

Improving healthcare accessibility should be a priority. Many individuals are unable to work because they are waiting for NHS treatment, such as patients waiting for joint replacement surgery. To address this issue, we must ensure that healthcare services are readily available and accessible to all. This means reducing waiting times for consultations, diagnostics, and treatments, as well as increasing the number of healthcare professionals.

Mental health support is another crucial aspect that needs enhancement. The prevalence of mental health conditions is on the rise, and it is essential to provide better access to counselling, therapy, and mental health resources. Additionally, we must work towards reducing the stigma associated with mental health issues, allowing individuals to seek the help they need without fear or judgment.

To promote overall health and prevent illnesses, comprehensive public health campaigns are necessary. These campaigns should focus on encouraging healthy lifestyles, raising awareness about common health issues, and providing information on preventive measures. Initiatives targeting physical activity, healthy eating, smoking cessation, and mental health can make a significant impact on improving public health.

It is also important to empower individuals with chronic health conditions to self-manage their health. By providing education, resources, and support networks, we can help them better understand their conditions, make informed decisions, and actively engage in their own care. For instance, individuals can monitor their blood pressure at home to self-manage hypertension.

Employers play a crucial role in supporting individuals with health conditions. Encouraging them to provide reasonable workplace adjustments is essential for enabling these individuals to continue working or return to work. This may involve implementing flexible work arrangements, modifying duties, making ergonomic adjustments, and ensuring access to necessary support services.

Occupational rehabilitation programmes can be instrumental in helping individuals with health conditions transition back into the workforce. These programmes should provide training, education, job placement assistance, and ongoing support to improve employability and facilitate a smooth return to work.

Creating awareness about the benefits of work is essential. Evidence shows that a healthy and safe work environment has positive impacts on physical and psychological health. By raising awareness through campaigns and information leaflets, we can highlight the advantages of work, such as increased social contact and a sense of purpose, motivating individuals to return to work after an illness.

Increasing access to occupational health services is crucial for supporting individuals in the workplace. Occupational health teams can assess and support individuals with health conditions to work, and they can devise and implement policies to maintain a healthy workforce. Investing in occupational health services for those who currently lack access can reduce inequalities, support individuals working with ill health, and contribute to healthier workplaces overall.

Improving workplace health and safety is another area which should be a priority for employers. By encouraging a positive workplace culture and management practices, providing access to healthy food and drink, encouraging exercise, and offering support for smoking cessation, employers can enhance the wellbeing and safety of their employees, creating a productive work environment.

Strengthening social safety nets is vital for individuals who are unable to work due to severe health conditions. This can include providing income support, disability benefits, and ensuring access to appropriate healthcare services. These measures can help individuals maintain financial stability and access the necessary resources for their wellbeing.

Investing in research and data collection efforts is crucial for understanding the causes and impact of ill-health-related work absence. By gathering comprehensive data and conducting research, we can inform evidence-based policies, interventions, and programmes aimed at reducing the number of individuals affected and improving overall health outcomes.

Fostering collaboration among government agencies, healthcare providers, employers, and community organizations is key to developing comprehensive strategies for reducing ill-health-related work absence. By sharing best practices, data, and resources, we can create a coordinated and effective response that addresses the various factors contributing to this issue.

Finally, the National Institute for Health and Care Excellence (NICE) has developed a guideline called “Workplace Health: Long-Term Sickness Absence and Capability to Work.” This guideline provides evidence-based recommendations and tools, including a Cost Calculator, to assist employers in managing sickness absences and determining the cost effectiveness of workplace health interventions.

By implementing these strategies effectively, we can improve overall workforce participation and wellbeing. This will benefit individuals, businesses, the NHS and the economy.

Azeem Majeed, Professor of Primary Care and Public Health, Imperial College London

Lara Shemtob, Academic Clinical Fellow in General Practice and Occupational Health Physician, Imperial College London

Kaveh Asanati, Professor of Occupational Health, Imperial College London

This blog was first published by the Society of Occupational Medicine.

How to successfully supervise your student’s research project

Postgraduate students in universities across the UK will currently be undertaking their summer research projects. How can academics support their students, ensure they have a good learning experience, and help them to complete their research projects successfully?

The first meeting with the student sets the foundation for a successful supervisory relationship. It’s essential for academics to establish clear expectations, foster effective communication, and provide the necessary guidance to support the student during their research project.

1. Introduction & Background: Begin the meeting by introducing yourself and providing an overview of your research expertise and experience. Ask the student to introduce themselves and their background, including their research interests and motivations for pursuing the project.

2. Research Project Overview: Provide a detailed overview of the research project, including its objectives, scope, and any specific research questions that need to be addressed. Ensure that the student understands the broader context of the project and its significance in the field.

3. Project Timeline & Deliverables: Discuss the expected timeline for the project, including key milestones and deadlines. Establish a clear understanding of the deliverables expected at each stage, such as literature review, research proposal, data collection, analysis, and thesis writing.

4. Roles & Responsibilities: Clarify the roles and responsibilities of both the student and yourself as the supervisor. Discuss how you will provide guidance, support, and feedback throughout the project. Establish a regular meeting schedule and preferred communication channels.

5. Research Methods: Discuss the proposed research methods and any specific techniques or tools that will be used. Provide guidance on the selection of appropriate research methods and data collection techniques. Address any concerns or questions the student may have.

6. Resources & Support: Inform the student about the resources available to them, such as research materials, databases, software, and equipment. Discuss any potential collaborations, access to lab facilities or data, and funding opportunities that may be relevant to the project.

7. Ethical Considerations: Discuss the importance of ethical conduct in research and ensure that the student is aware of the ethical guidelines and regulations that apply to their project. If applicable, provide guidance on obtaining necessary ethics approvals or permissions.

8. Literature Review: Emphasize the importance of conducting a thorough literature review to understand the existing knowledge in the field. Provide guidance on how to search for relevant literature, critically evaluate papers, and organise the findings.

9. Expectations for the first stage: Discuss the specific tasks or goals that the student should focus on initially. This may include conducting a literature review, refining the research questions, or drafting a research proposal. Set clear expectations for what should be achieved by the next meeting.

10. Questions & Concerns: Encourage the student to ask any questions or express any concerns they may have. Create an open and supportive environment where they feel comfortable discussing their research project and seeking guidance.

11. The evaluation process: Discuss how the student’s work will be evaluated and how they will be graded. Explain what is needed to achieve a good outcome from the assessment by the dissertation markers.

12. Positive & Supportive Environment: Create a positive and supportive environment for the student. Let them know that you are there to help them succeed and that you are interested in their work. Be respectful, listen to the student’s ideas, and be open to their suggestions.

Wastewater Surveillance for Covid-19

Wastewater surveillance is a technique that can be used to detect and track the spread of infectious diseases, including Covid-19. Wastewater is a rich source of genetic material from the people who use facilities in locations such as schools. By testing wastewater for the presence of viruses, public health officials can get an early warning of an outbreak before it becomes widespread.

Our recent study published in the journal PLOS ONE found that wastewater surveillance can be used to detect Covid-19 with high accuracy. The study, which was conducted in England, collected wastewater samples over a period of six months. We found that wastewater samples from areas with high rates of Covid-19 infection had significantly higher levels of SARS-CoV-2 genetic material than samples from areas with low rates of infection.

We also found that wastewater surveillance can be used to track the spread of new variants of SARS-CoV-2. We were able to identify the Alpha and Delta variants in wastewater samples before these variants were detected in clinical samples.

Wastewater surveillance is a valuable tool for public health officials who are working to prevent the spread of Covid-19. It is a cost-effective and efficient way to identify outbreaks early and take steps to mitigate them. In addition to detecting Covid-19, wastewater surveillance can also be used to detect other infectious diseases, such as influenza and norovirus, further strengthening its role in public health surveillance and outbreak response.

Wastewater surveillance will become increasingly important for protecting public health. It is a valuable tool that can be used to identify outbreaks early, track the spread of new variants, and monitor the effectiveness of public health interventions.

Strategies and Interventions to Improve Well-Being and Reduce Burnout in Healthcare Professionals

Our recent article in the Journal of Primary Care & Community Health discusses burnout, a psychological response to chronic workplace stress that is particularly common in healthcare workers and which has been made worse by the impact of the Covid-19 pandemic. Burnout is caused by factors such as increasing workload, inadequate support from employers and colleagues, and a stressful work environment. It has negative effects on both patients and healthcare professionals, including reduced patient satisfaction, an increase in medical errors, and decreased quality of care. Addressing burnout requires a multi-pronged approach involving individual and organisational-level strategies.

Managing people’s workload, providing individual-focused interventions like stress management, and offering professional development opportunities can help reduce burnout. Supportive leadership, peer support, and a healthy work-life balance are also important. Organisational culture and leadership play a crucial role in fostering these kinds of supportive work environments. A culture of openness and support without stigma is also essential, as is providing appropriate support programmes rather than relying solely on individual resilience. Ultimately, preventing burnout, and managing it when it does occur, requires collaborative efforts between healthcare systems and individual healthcare professionals.

Electronic health records: Don’t under-estimate the importance of implementation and staff training

One of the most significant changes I have witnessed during my medical career is the introduction of electronic health records (EHRs). While they have brought many benefits to the NHS, patients and clinicians, they have also posed some challenges.

On the positive side, EHRs have made medical records more legible, accessible and secure. Many doctors and patients will remember the era when a patient’s medical record was often “missing” when they attended for an outpatient appointment. This made managing the patient more difficult, as the clinician seeing them did not have all the information they needed, and the patient usually had to return at a later date, by which time their medical records would hopefully have been found.

With EHRs, in contrast, clinicians can access patient records from anywhere at any time, which has made it easier to provide care to patients in different locations. EHRs have also made it easier to conduct medical research, as they allow researchers to access large volumes of data in a more streamlined manner. Quality improvement has also been enhanced as EHRs make it much easier to measure the quality of healthcare and the impact of any interventions and change to the provision of health services.

However, EHRs have also forced clinicians to modify how they work, which is not always a positive change. The increased use of technology in healthcare, for example, can sometimes result in decreased interaction between clinicians and patients, as the clinician is often focused on reading the EHR and entering new data. In addition, the use of EHRs can be time-consuming, as clinicians have to enter information into the system, which can increase their workload.

Another potential issue with EHRs is the risk of data breaches, which can compromise patient privacy and confidentiality. Cybersecurity is a major concern for healthcare providers, and it is important that they take appropriate measures to protect patient data. We have seen examples in the NHS of significant data breaches that have disrupted the delivery of health services and compromised sensitive patient information. We have also seen examples of major IT failures (for example, during the heatwave in the summer of 2022).

Despite the challenges associated with EHRs, they are here to stay. It is crucial that healthcare providers adapt to this new way of working, but also that the systems are designed in a way that minimises the burden on clinicians while maximising the benefits to healthcare providers and patients. The ongoing development of EHRs and other technological advancements must always prioritise patient care and safety. This means designing IT systems with adequate input from staff and patients; and ensuring that sufficient time and resources are devoted to areas such as implementation and training.

Why the NHS needs to put the joy back into being a doctor

A complaint I often hear from colleagues is that “the NHS has taken the joy out of medicine”. Modern healthcare delivery is increasingly seen by NHS staff and by patients as an industrial-type activity with strict performance targets. This has resulted in many healthcare professionals feeling that they have lost much of the flexibility and autonomy that was once a defining characteristic of their professions.

These concerns extend to patients, who may feel that they are not receiving the personalised care and attention they need. The focus on targets, metrics and finances can create an environment where patients feel they are being treated as numbers rather than as individuals with unique needs and circumstances.

It is important for politicians, NHS managers and clinicians to acknowledge these concerns and work to address them. While performance targets, metrics and financial monitoring are important tools for measuring the effectiveness of healthcare delivery, they should not be the only focus of the NHS. Healthcare professionals must be given the freedom and flexibility to exercise their judgement and provide personalised care to their patients.

The NHS should also work to ensure that patients are seen as individuals with unique needs and circumstances, rather than simply as numbers on a spreadsheet. This can be achieved through providing adequate resources (both financial and personnel) for the NHS, better training for healthcare professionals, improved communication with patients, and a greater emphasis on patient-centred care.

Ultimately, the goal of the NHS should be to provide high-quality, personalised care to all patients. This requires a shift in mindset away from the purely target-driven approach we often see in today’s NHS towards a more holistic approach that prioritises the needs and well-being of patients and healthcare professionals alike.

Uncertainty in public health and clinical medicine

I joined Twitter 10 years ago, in May 2013. One of the lessons I’ve learned from social media is that too many people want “certainty”. But in public health and medicine, there often aren’t certainties, just probabilities of certain outcomes, or unknowns due to a lack of evidence. This can be frustrating for people who are looking for clear answers, but science is a process of discovery, and there is always more to learn, either from new research or from summarising and synthesising evidence from current and past research. By looking at the existing evidence, we can make informed decisions about our health and the health of our communities.

Uncertainty is a critical aspect of scientific inquiry and helps researchers refine their understanding of health-related issues over time. Uncertainty can arise due to factors such as incomplete data, limitations in research, or the complexity of the systems being studied. Another way to deal with uncertainty is to be open to new information. As new research is conducted, we may learn more about the risks and benefits of different interventions. It is important to be willing to change our minds in light of new evidence.

Uncertainty doesn’t necessarily mean that nothing can be done to address health issues. Rather, it means that we need to rely on the best available evidence and make informed decisions based on that evidence, while recognising that there may still be unknowns and potential risks. Communicating clearly and transparently about the state of evidence, the limitations of that evidence, and the potential implications for health can help build trust and ensure that people have the information they need to make informed decisions about their health.

Finally, we are all in this together. Public health and medicine are complex areas, and we need to work together to find solutions. By working together and gaining public support, we can have a positive effect on the health of our communities.

The academic publication process: how it works

I am sometimes asked by junior researchers or by the public how the publication process for academic articles works. The academic peer review timeline varies depending on the journal, but it typically takes several months (sometimes even longer) from submission to publication.

1. Submission: You submit your paper to the journal. Make sure your paper is well-written, checked for spelling and grammatical errors, follows the journal’s style and formatting requirements, and that you submit your paper to a journal that is a good fit for your work.

2. Initial screening: An editor at the journal reviews your paper to make sure it is within the scope of the journal and meets its style and formatting requirements. Some articles are rejected at this stage without external peer review (this is particularly common at larger journals).

3. Peer review: The editor sends your paper to one or more external experts in your field for review. Reviewers are asked to assess the originality and significance of your work, the rigour of your research methods, and the validity of your findings. They may suggest revisions to your paper or recommend rejection.

4. Initial decision: The editor reviews the reviewers’ comments and decides whether to accept, reject, or ask you to revise your paper. Acceptance without any revisions is unusual; generally, the authors have to respond to the comments from the referees and editor, and revise the paper.

5. Revisions: If your paper is accepted with revisions, you will usually be given a deadline to make the necessary changes. When sending back your revised paper, it is also normal practice to send a letter explaining how you have changed the paper in response to the comments.

6. Your response: Respond promptly to reviewer comments. Make sure your revisions are comprehensive and address all of the reviewers’ concerns and any comments from the editor. Be respectful and cooperative with the editor and reviewers.

7. Final decision: Once your paper has been revised, it may be accepted without further changes; you may be asked to revise it again; or it may be rejected. If accepted, the editor will send you a copy of the proofs for your final approval. This is your last chance to make changes.

8. Publication: Once you have approved the proofs, your paper will be published in the journal. Some journals (such as the BMJ) offer readers the opportunity to comment on a paper. It’s important to respond to these comments, which may sometimes highlight problems with your paper.

9. Responding to comments: When responding to comments, aim to be polite and respectful in your reply. Some comments can be constructive and others can be very critical of your paper. This post-publication review of a paper is an important part of the academic publication process.

10. The total time it takes to go through this process can vary from a few months to a year or more. It is important to be patient and to follow the instructions of the editor and reviewers. By doing so, you can increase the chances of your paper being published in a high-quality journal.