Contact Sue Gardner via the Society’s Leicester office,
or e-mail: [email protected]
Psychologists are often accused of sitting on fences rather than making confident assertions. We understand about weighing the evidence, balancing probabilities and living with uncertainty. Given the uncertainties about life after the general election, these skills are currently of even greater value. One piece of good news recently was confirmation from the Higher Education Funding Council for England (HEFCE) of the funding for universities and colleges. As expected, the plan is to reduce funding for next year in real terms by just under 2 per cent. The good news is that this is less than anticipated, and that spending on science, technology, engineering and mathematics (the STEM subjects, which include psychology) will be prioritised, as will research. This will increase the expected research funding for psychology by about 50 per cent. Higher education in the UK is said to generate nearly sixty billion pounds for the economy, with a multiplier effect of 3:1 from public investment, and the government is taking account of this in setting priorities for the effective use of public money.
There are still concerns about the situation in Scotland as we await the official outcome of the consultation on the proposed changes to
the methods for calculating higher education teaching funding from the Scottish Funding Council. Questions were asked about the validity of the data used to draw up proposals, which were based on figures from just one academic year. The plans as originally conceived would have disadvantaged the newest universities, and there was a strong reaction from several institutions across a range of subjects, not just psychology. We understand that the feedback has been accepted and that new proposals are on the way. The outlook now seems more hopeful.
Educational psychology research is receiving a boost this month due to a wealth of important studies published by the Society. The latest addition to the British Journal of Educational Psychology (BJEP) Monograph Series, entitled ‘Understanding Number Development and Difficulties’, explores differences in children’s number attainment and effective interventions for those requiring support. The BJEP features an annual review paper by Professor Neil Mercer on methods for analysing classroom talk, as well as a special section on developmental outcomes in resource-poor settings, featuring studies undertaken in different parts of the developing world. The British Journal of Developmental Psychology has a special issue on ‘Developmental Disorders of Language and Literacy’ that brings together the best current research in this field. Our journals continue to enrich the knowledge base, showcase excellent research and stimulate further applications of psychology to improve people’s lives.
The Society certainly wants to reach out to members to offer support, especially where people have specific issues that are not addressed through the current member networks. Several such groups are under discussion at the moment including those who are independent practitioners. A day event was recently held for these members who are providing services to the public. Many, but not all, are members of Divisions. Some are also registered by the Health Professions Council (HPC). All have businesses of different sizes and often feel that their specific interests are not catered for. We know of about five hundred members in this situation. The event was an opportunity for the 50 or so participants to receive some information, to network as a group and, hopefully, to form a member network of their own.
In the last column I mentioned that there was to be a review meeting between the Society and the HPC as part of the ongoing collaboration between professional bodies and the regulator. We had a useful and friendly discussion, which will be reported more fully once notes have been agreed. In summary, those of us from the Society reiterated our view that the public can only be properly protected if the name ‘psychologist’ is protected. The HPC pointed out that this would require a change of legislation and would only come about if there were more concrete evidence to support our belief. Obviously we will be monitoring the public protection issues that arise over the next few years and will accumulate any evidence that emerges. The HPC regretted any inconvenience caused by the recent renewals coinciding with a postal strike, and the plan is for future renewals to be made online. We discussed the methods used by the HPC to stop unregulated practitioners from repeatedly claiming the competencies of regulated practitioner psychologists. The HPC does not put titles on certificates, and its fees are scrutinised by Parliament and the Privy Council. These issues were raised by members both with the Society and with the HPC directly.
Finally, thank you for
your e-mails. Many of you are interested in consultations on public policy or on matters relating to Parliament. Do look in The Psychologist and on our website for updates. I hope you are on the list for the
e-newsletter. A demonstration
of the new website will be given to the next Representative Council so please ask your member network representative for news. I’m certain that this will be a welcome development.
The Special General Meeting held on 22 January 2010 announced
the result of the recent membership ballot for President 2011/12. Professor Noel Sheehy will take up office as President Elect in June 2010 and will become President at the 2011 AGM.
BPS/POST Postgraduate Award
Fiona Duff describes her experience at the Parliamentary Office of Science and Technology
With the generous funding of the BPS, I was fortunate enough to take a three-month secondment from my PhD in summer 2009 and relocate from the University of York to the Parliamentary Office of Science and Technology (POST). POST’s offices are situated at the heart of politics in Westminster, and as such this fellowship offered a unique experience, quite different from university life!
The ultimate aim of these fellowships is to write a short briefing note for parliamentarians, clearly summarising an area of psychological science, its current policy context and related issues that may need to be addressed in future policy. With a research interest in reading development, disorders and interventions, my task was to produce a briefing note on the area of teaching children to read.
This is a topic that receives a considerable degree of attention in policy, with obvious links to education, but also employment and economics. Recent policy documents include the Independent Review of the Teaching of Early Reading (DfES, 2006) and Identifying and Teaching Young Children with Dyslexia and Literacy Difficulties (DCSF, 2009). Indeed, around the time of the note’s publication, the Science and Technology Committee announced a series of checks on how research evidence is used to inform policy; the first topic was literacy interventions.
In order to write my note, I spent time familiarising myself with past and present policy documents in the area of literacy instruction, together with research evidence from the psychology literature. In addition, I had the privilege of consulting a variety of people regarding their opinions on the subject. This included expert psychologists from different institutions, directors of charities that lobby in this area, MPs, policy makers, and those charged with the task of converting policy into practice.
In all, the secondment was a fantastic and valuable experience. It helped me to challenge and broaden my understanding of my research area, and will undoubtedly help me as I continue in my career. Its greatest impact on me was in helping me to realise afresh how fundamentally important it is that children learn to become good readers; reading really is the gateway to learning.
I am grateful to the BPS for granting me
this opportunity, and to Dr Peter Border for
his supervision of my work. I would encourage supervisors to inform their PhD students of this scheme, and I wholeheartedly encourage students to apply. These secondments help to bring alive the research–practice circle, a feature that is central to so many aspects of our discipline.
• The published POSTnote can be found at www.parliament.uk/parliamentary_offices/post/pubs2009.cfm
Department for Children, Schools and Families (2009). Identifying and teaching young children with dyslexia and literacy difficulties. Nottingham: DCSF Publications.
Department for Education and Skills (2006). Independent review of the teaching of early reading: Final report. London: DfES.
Accreditation through partnership
Lucy Kerry, the Society’s Quality Assurance Manager, on members’ views and next steps
Accreditation through partnership is the British Psychological Society’s new model of engagement with universities and their students, applying to all accredited courses. The model is based on the principle of collaborative working with education providers in the spirit of supportive enquiry. It emphasises quality enhancement, ultimately aiming to make the experience of psychology students, trainees, staff and employers as positive as possible. Society members and education providers have been asked to look at our proposals to see if they achieve these aims, and are in agreement that the new model is a positive change. Consultation responses have identified a range of measures that will further improve accreditation through partnership, and we will work with both members and education providers to test these suggestions out during the coming months. A number of education providers have agreed to help us pilot the new model across a range of psychology courses. These visits will feature a more interactive, tailored approach to visit planning, and will be based upon collaboration and conversation, rather than on inspection. In practice, we expect that this will offer greater freedom to education providers to shape their discussions with the Society and focus upon the things that matter to them. We will be taking as our starting point the view that education providers are generally in the business of running good courses, and hope that this will encourage suitably open and constructive discussions.
Professor Don Mitchell at the University of Exeter told us: ‘Here in the School of Psychology, we are happy to participate in the pilot process because we welcome the promise of a collaborative process rather than a regime of inspection.’ Dr Liz Charman at London Metropolitan University shared this view: ‘We very much welcome the move towards a more explicit recognition of the partnership between the professional body and the university. The collaborative principles of accreditation through partnership reflect our own values and culture.’ One criticism that was levelled at the initial proposals for accreditation through partnership was that they did not spell out in sufficient detail the implications of a shift from quality assurance to quality enhancement. They also failed to clearly outline the differences between our existing model of accreditation and these new proposals.
We expect our pilot visits, and subsequent evaluation of these, to illuminate some of the implementation issues we have already identified, and they will allow us to test out the types of questions that visiting teams may wish to ask as a means of exploring our standards with a range of individuals and groups. In the case of doctoral courses, they will also offer an opportunity to explore ways of working alongside the Health Professions Council and their approvals process, as well as other quality-assurance processes, such as university (re)validation. We will address all of these points in the handbooks that are being produced to support the launch of the new model in time for the 2010/11 academic year.
Partnership not policing
Sue Gardner wrote in her ‘President’s column’ last month that the Society is developing an approach to education governance that is ‘about partnership rather than policing’. What does this mean for course accreditation? Overall, we are seeking to establish a tone of enquiry that strikes an appropriate balance between bigger-picture matters for discussion and points of finer detail. Increasing our emphasis upon quality enhancement must not come at the expense of solid, robust quality assurance; indeed, QE can only build upon good QA, and where we are not clear that standards are being met, we will need to probe this as appropriate. Similarly, although we are proposing that accreditation through partnership should operate on an open-ended basis (subject to periodic review), this should not suggest that we would not retain the option to remove accreditation were it appropriate to do so. However, we want to work with education providers to ensure that the accreditation process proceeds in a way that is proportionate
to risk, and without generating excessive workload or anxiety for those involved. We will also be looking to education providers to offer feedback on our standards and our processes to maintain the Society’s position as the authoritative voice on psychology in the UK.
‘It is important that the Society continues to have a central role at a time when the HPC has become the regulator, and we want to contribute to that process,’ said Dr John Franey, Director of the Doctorate in Educational Psychology at the University of Bristol. ‘We are particularly looking forward to more open-ended discussion on the challenges associated with running a programme, and greater honesty and openness on how things might be done differently. All of this depends on a more open process and conversations.’
We’re not there yet…
Managing the change from existing arrangements to accreditation through partnership is the responsibility of the Quality Assurance Review Group, which includes representatives from the Graduate Qualifications Accreditation Committee and each of the postgraduate Training Committees, and is jointly chaired by Dr Peter Banister (Chair of the Membership and Professional Training Board) and Dr Richard Latto (Chair of the Psychology Education Board). The group has identified some important revisions to the accreditation through partnership model based on consultation feedback. A set of agreed programme standards is now in place: these are the standards that all accredited courses will need to achieve. Alongside those are our domain-specific standards, which replace our existing accreditation criteria. These two sets of standards will be used during our pilot phase.
We have agreed extensive changes to the programme standards in light of respondents’ feedback. There are now nine standards in total (from an original 13), which are presented as a brief standards statement, together with a rationale that outlines the relevance of the standard in question to psychology and, particularly, to the student experience. Figure 1 shows how our standards have been reorganised.
Our new standards particularly emphasise the importance of working ethically, and attracting a diverse range of applicants to study psychology. They look specifically at how students are developed as the psychologists of the future, and offer a new focus on employability. The new standards also recognise the range of approaches taken to developing and resourcing psychology courses in a way that is consistent with delivering psychology as a science, whilst setting clear continued expectations in relation to resource levels.
The programme standards provide a framework for considering our minimum evidence requirements and the flexible ways in which these can be met. Professor Rudi Dallos, Director of the Doctorate in Clinical Psychology at the University of Plymouth, told us: ‘The greater emphasis on streamlining by using existing documentation that we provide for the University has been very welcome. This has been a great relief and the production of the paperwork has not been as laborious as we had feared.’
Accreditation through partnership also aims to make life easier for education providers wishing to develop new accredited courses by putting in place a process that allows the Society to work with them to reach a decision on accreditation before the course is up and running. We will outline the new processes more clearly in our handbooks, as different processes apply to different types of course (particularly for those that require approval by the Health Professions Council).
For all education providers, accreditation through partnership promotes flexible engagement with the Society. We currently tend to visit courses on an individual basis. We will continue to offer this to those education providers who feel it best meets their needs. However, we will also be writing to Heads of Department and Programme Directors to invite them to propose their preferred future visit configuration. For example, they may choose to group a number of programmes across their Department, or across their whole institution. Alternatively, they may wish to plan for a partnership visit to take place alongside another QA process, which in itself may require a change to the timing of their next visit. The range of pilot visits we are undertaking incorporates a variety of approaches along the lines suggested above, so it will be interesting to learn from those what works well (and what works less well!).
A further important feature of the new model of accreditation is that it aims to identify explicitly all those with an interest in psychology education and training. However, a number of key groups were felt to be missing or lacking in prominence in our initial proposals. We will endeavour to remedy this in our handbooks, and in so doing will highlight that the groups with an interest in specific types of course will vary depending on the type of course in question. Our handbooks will also clarify that the interactive approach to agenda planning that is proposed with accreditation through partnership offers education providers the opportunity to include any stakeholder groups that they feel are central to the process – which, again, may vary depending on the nature and scope of the visit that is being planned.
We are confident that this new approach to working with education providers, students and all of the other stakeholders with an interest in psychology courses represents a positive shift. Dr Peter Banister, Co-Chair of the Quality Assurance Review Group, agrees: ‘The move towards quality enhancement will assist in the improvement of psychology education at both undergraduate and postgraduate levels as the Society develops its post statutory regulation role.’ Responses to our consultation show that members share our confidence. We will be formally evaluating our pilot partnership visits, and will also be holding a shared learning event in the summer to give pilot participants the chance to reflect on their experiences with us.
Review Group Co-Chair Dr Richard Latto added: ‘It is worth stressing that by considering the coverage of the undergraduate curriculum against the relatively loose specification in the Benchmark Statement for Psychology, the Society will be recognising and accepting the fact that different institutions will achieve this coverage in different ways and that the Society will be encouraging and helping the development of new approaches to the learning and teaching of psychology.’
Accreditation through partnership is due to be formally launched in September 2010. Our website will be updated on an ongoing basis – see www.bps.org.uk/partnership – and we will continue to provide updates via our usual channels, as well as through The Psychologist and the Higher Education Academy. If you have any questions on how accreditation through partnership affects you, please contact the QA team at [email protected].
To have your CPD event approved by the Society and for a catalogue of forthcoming opportunities, see www.bps.org.uk/learningcentre or call 0116 252 9512.
To advertise your event in The Psychologist, e-mail [email protected] or call +44 116 252 9552.
A diary of non-approved events can be found at www.bps.org.uk/diary.
Delivering standards for tests and testing
Dave Bartram and Pat Lindley, Steering Committee on Test Standards
This year will see the launch of a major new approach to test user qualification by the Society. Here, we provide the background on why we are engaged in this venture, explain its implications for the Society’s role as a standard setter and discuss where it could lead in the future.
The Steering Committee on Test Standards (SCTS) was set up in 1987 as part of the process associated with the advent of chartered status. Before this, a Standing Committee on Tests and Testing had approved test user training courses on the basis of their curricula. When the new committee was established it was agreed that a different route should be followed: the focus should move to the competence of the test user rather than the content of the training courses.
In the late 1980s the SCTS set up working groups and developed a set of standards defining the competences required of test users. While these standards were intended to be usable in all areas of testing, it was decided to focus on occupational testing in the first instance, as this was the area where the greatest need was being expressed and where the Society was in a position to set standards for non-psychologists and psychologists alike. We defined qualifications that mirrored the then-current course accreditations: the occupational testing courses became Level A and the personality instrument-related training courses became Intermediate Level B. We also defined a ‘Full Level B’ qualification to represent an upper benchmark for the competence of a test user (see Bartram, 1995, for an account
of the project).
The shift from course approval to competence-based qualifications was made possible by the fact that chartered status provided a mechanism for quality-controlling the assessment of competence. Those who were going to assess people’s competence for the Society’s qualifications would have to be chartered and hence accountable to the Society. They would also have to undergo a process of verification, whereby their assessment tools and procedures would be checked to ensure they were adequately assessing competence against the Society’s standards. The verifiers were also chartered and were accountable to the SCTS.
The Occupational Level A qualification was introduced in 1991, with grandparenting arrangements for existing test users. It was the first qualification produced by the Society for non-members and non-psychologists. From then until the end of 2009, 26,798 Occupational Level A qualifications were issued, along with 9215 Intermediate Level B and 480 Full Level B qualifications from 1995, and 726 Occupational Test Administration qualifications from 2003. During this time a range of materials was produced by the SCTS to support verifiers, assessors and test users. We also produced a comprehensive set of open learning materials for Level A (Bartram & Lindley, 1994) and more recently did the same for the Test Administration qualification (Bartram & Lindley, 2006).
In 2004 the first educational test user qualification was launched (the Certificate of Competence in Educational Testing or Educational Level A), and there have now been 1564 of these issued. The SCTS also established a Register of Competence in Testing which qualification holders could choose to be on. There are currently 9150 people on that Register of Competence.
In parallel with all this work, the SCTS was also supporting the development of a test review process, based on the work done by Bartram et al. (1990) under a government contract. The review process was taken over by the Society, which published a series of volumes through BPS Books. However, we soon came under pressure to find a more rapid way of getting reviews carried out and published.
Towards the end of the last century it was becoming apparent that all the operational activities of the SCTS were becoming increasingly difficult for a Society committee to handle. The qualifications involved the employment of verifiers, assessor verification procedures, the management of applications for qualifications, and the management of the register database. Other activities, such as test reviewing and the development of guidance documents, needed office support. The SCTS remit to promote standards through its guidance documents was also hampered by the lack of a user-friendly website people could visit for such information. Finally, there was no single focus within the Society’s office to support and manage the administration of these diverse activities.
The solution to this was the establishment of the Psychological Testing Centre within the office as the centre for operational management and delivery of services and products. The main change this introduced was that there was to be a full-time office administrator and support staff with a dedicated budget. All the test reviews would go online, and a regular review schedule would be initiated. There would be investment in the development of a web ‘Testing Portal’ (www.psychtesting.org.uk) and this would include information and support services for test takers, test users, and test developers and researchers. There would be opportunities for service providers to advertise through and link into this site. Verifiers, web editors, test reviewers and consultant editors would all be employed through the PTC with an executive committee set up to manage the transmission of policy and accountability, and to provide the executive management for the Centre.
The main objectives set for the PTC were:
• to implement SCTS policy and disseminate Society standards on best practice in testing and test use;
• to establish and maintain a comprehensive psychological testing website and gateway;
• to manage all the Society’s test user certification procedures;
• to manage the Society’s verification of test user competence assessments; and
• to be self-financing.
Since 2001, the PTC has been very successful in meeting its objectives and has evolved into a positive revenue-generating function within the Society with a small but dedicated staff.
Since 2004 the SCTS has been pursuing a policy of harmonisation. This has three strands to it:
• operational harmonisation of the various differing procedures associated with verification of qualifications and other PTC office procedures, in order to manage costs and increase effectiveness;
• structural harmonisation of the format and content of test user standards, to provide greater clarity and facilitate modularisation of content within and between areas of practice (occupational, educational and health-related); and
• international harmonisation of our developments with those taking place in Europe under the aegis of EFPA.
Important progress has been made in all three areas over the past few years (a revision to the occupational test user standards was approved in 2005, EFPA-based standards contextualised for all areas of test use are now available for use in qualification design, harmonisation of verification procedures has been progressed, etc.). However, the most obvious outward sign of all this work is due to come to fruition this year, starting with the launch of the new qualification system.
While we will provide easy grandparenting routes to the new system for people who hold current Society qualifications, the form and structure we are moving to is radically different from what we have now. Rather than a discrete set of qualifications, the standards are now defined in terms of levels of complexity (related to the European Qualification Framework) and described in modular form. Modules are of different types (psychometric knowledge, psychological knowledge or practitioner skills) and qualifications are defined as collections of modules at one or more levels (for details of this, the interested reader is referred to articles from Assessment and Development Matters, which are freely downloadable from www.psychtesting.org.uk). A new database is being designed for what will in future be called the Register of the Society’s Qualifications in Test Use, and from July 2010, people who want to claim competence in test use on the basis of having obtained one of the Society’s test user qualifications will need to maintain a current entry on that register. Certificates will become Certificates of Registration and will no longer be Certificates of Competence.
At the same time as we are moving towards the launch of this new system, we are working with our European colleagues on establishing criteria and procedures for the European accreditation of national test user qualifications. We are hopeful that this will shortly lead to us being able to offer people the opportunity of obtaining a European test user qualification in the occupational area (corresponding to the combination of the current Level A and Intermediate Level B). We are also working in Europe on the specification of qualifications aimed at the level of competence expected of practising psychologists. We have already produced versions of the new standards contextualised for educational and health-related test use as well as for occupational, and with the new modular structure we expect it will be much simpler to produce and launch new qualifications as and when the market demand becomes known.
Our work with Europe has been broader than just qualifications in test use. When the PTC was established, the initial online test reviews were based on the British Psychological Society’s test review criteria. With the emergence of the EFPA test review criteria, which the SCTS was instrumental in shaping, all the reviews were updated to fit the EFPA model and we now use this as the basis for all our reviewing. This has had the advantage of providing a more generic set of review criteria that can cover tests in all areas of application. The Society’s reviews now cover an increasing number of educational tests as well as occupational ones. It is hoped that in future this will expand to include clinical and health-related measurement tools as well.
Furthermore, we introduced a test registration procedure in 2004 which allows publishers to