Tuesday 16 August 2011

Charles River: international HTA organisations

The CRA ratings: how did Scotland do?

The consultancy company Charles River Associates recently released a report, “A Comparative Analysis of the Role and Impact of Health Technology Assessment”.  Commissioned by the European pharmaceutical industry, it rated the HTA systems of 15 countries against 44 criteria using a traffic light system (green = good, red = bad)[1].

First, the good news …
Scotland was rated as ‘green’ (the best rating) on 26 of the criteria, amber on 7, and red on 10 (with 1 deemed not applicable).

Only 2 countries had more green ratings (28 for Australia, 27 for Sweden). 

The following table compares the countries on four key criteria, based on 2009 data and on the 12 medicines used as case studies in each system.



Country             Guidance issued    Typical speed (days)    Yes or no? (5=all yes, 1=all no)    Cost of agency
Scotland            82                 150                     3.2                                 €1m
England             17                 290                     3.7                                 €7m
Sweden              30                 190                     3.2                                 €11m
Canada              28                 190                     3                                   €4m
Netherlands         41                 150                     3.9                                 ?
Australia           73                 150                     4                                   ?
New Zealand         58                 320                     2.8                                 ?
Poland              66                 320                     2.8                                 €3m
France              657                150                     3.9                                 ?
Germany             6                  1,500                   ?                                   €23m
Italy (national)    ?                  320                     4.3                                 ?
Spain               6                  ?                       ?                                   €2m


Data sources for the table:
Guidance issued – taken from CRA pages 62-63, Table 18.
Speed – median days from approval by EMA to HTA guidance (approximations taken from bar chart), taken from CRA page 88, Figure 24.
Yes or no? – average score across the case-study decisions, based on Raftery’s classification as applied by CRA: accepted without restriction = 5, accepted with minor restriction = 4, accepted with major restriction = 3, accepted with further evidence required = 2, rejected = 1; taken from CRA page 93, Figure 26 (a short illustrative sketch of this scoring follows this list).
Cost – taken from CRA page 71, Table 25.
South Korea, Brazil and Turkey excluded due to lack of data.
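
To make the ‘Yes or no?’ column concrete, here is a minimal sketch of how such a score might be computed under the Raftery classification.  The scoring map follows the report as described above; the set of decisions in the example is hypothetical, purely for illustration.

```python
# Illustrative sketch of the 'Yes or no?' scoring: decisions on the case-study
# medicines are mapped to Raftery scores and averaged.  The example decisions
# below are hypothetical, not taken from the CRA report.
RAFTERY_SCORES = {
    "accepted without restriction": 5,
    "accepted with minor restriction": 4,
    "accepted with major restriction": 3,
    "accepted with further evidence required": 2,
    "rejected": 1,
}

def mean_decision_score(decisions):
    """Average Raftery score across a set of HTA decisions."""
    return sum(RAFTERY_SCORES[d] for d in decisions) / len(decisions)

# A made-up set of 12 decisions (one per case-study medicine)
example_decisions = (
    ["accepted without restriction"] * 2
    + ["accepted with minor restriction"] * 4
    + ["accepted with major restriction"] * 3
    + ["accepted with further evidence required"] * 2
    + ["rejected"] * 1
)
print(round(mean_decision_score(example_decisions), 1))  # 3.3 for this made-up set
```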

CRA commented that Scotland was the only country where there was a “relationship between the therapeutic value of the medicine … and the speed of the review, with higher value products progressing more quickly through the review.” (page 88)

Scotland was rated as green on the following criteria, which seem important to public confidence in the system:
  • The rationale for HTA decisions / recommendations is clearly stated
  • The process and rationale for selecting and prioritising topics is clearly defined and publicly available
  • The approach used in HTA is clearly stated and the methods are deemed appropriate by experts
  • A number of relevant stakeholders are invited to contribute to the HTA process and they are involved throughout the HTA process with opportunity for contribution to assessment methodology, submission of evidence, review of recommendations
  • Outcomes are available on a publicly available website and decisions are explained in several levels of clinical/technical detail so that all relevant audiences may understand the decision
  • Length of time taken for reviews

The cost per piece of guidance issued (the agency cost divided by the number of pieces of guidance in 2009) was €12,195 in Scotland.  The only other country anywhere near this figure was Poland at €45,455.  Other costs were as follows (a short worked sketch of the calculation follows this list):
€142,857 Canada
€333,333 Spain
€366,667 Sweden
€411,765 England
€3,833,333 Germany
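
These figures can be reproduced directly from the table above.  The sketch below is purely illustrative; the agency costs are round millions, so the results are approximate.

```python
# Cost per piece of guidance = annual agency cost / pieces of guidance issued (2009).
# Figures are taken from the table above; agency costs are round millions,
# so the results are approximate.
agencies = {
    "Scotland": (1_000_000, 82),
    "Poland": (3_000_000, 66),
    "Canada": (4_000_000, 28),
    "Spain": (2_000_000, 6),
    "Sweden": (11_000_000, 30),
    "England": (7_000_000, 17),
    "Germany": (23_000_000, 6),
}

for country, (cost_eur, guidance_count) in agencies.items():
    print(f"{country}: €{cost_eur / guidance_count:,.0f} per piece of guidance")
# Scotland: €12,195 ... Germany: €3,833,333
```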

So, on the positive side, there is much to commend in Scotland’s system of new medicines evaluation: transparency of process, involvement of stakeholders, speed, and the value for money of the system itself (guidance produced for the budget).


Yes but …
While it is important to disseminate the good news, continuous improvement is not, of course, built on self-congratulation.  Why didn’t Scotland do even better, and in particular why did it receive 10 red ratings?

Seeing red
The 10 criteria where CRA rated Scotland as red were as follows:

1. Scientific advice is available to manufacturers during the development stage to enable the availability of evidence required for HTA.  SMC was assessed as offering no advice – it is certainly true that SMC has no formal role in this area.  Offering informal advice would have earned an amber rating, but it was not clear how informal that advice could be.  For example, SMC posts its guidance to submitting companies on its website: that sets out the requirements, and it is then up to the company how to interpret them.  There may even be dangers in getting involved in decisions about clinical trials if this subsequently restricts the criticisms the HTA organisation can make of the submission.

2. and 3. HTA is conducted for pharmaceuticals, devices, procedures, diagnostics and treatment strategies (1 criterion) and proportion of guidance in 2009 that was for non-medicines technologies (1 criterion).  This is not a part of the SMC’s remit, so the low assessment is unavoidable.

4. and 5. HTA is conducted for old as well as new technologies (1 criterion) and proportion of HTAs conducted for old technologies (1 criterion).  Once again, assessing old medicines is not part of the SMC’s remit, so it gets the lowest assessment.  Taking account of 2. and 3. above, it seems slightly odd that CRA feel an HTA organisation that assesses only new medicines is inherently lower-rated than one that does everything.  The counter-argument would be that concentrating on one task brings the advantages of specialisation.

6. Proportion of the 12 case studies including information on societal benefits (in guidance).  SMC was rated as providing no evidence in its guidance that factors such as productivity and costs to carers had been considered.  Some evidence was required for an amber rating, and evidence in at least 7 of the 12 case studies for a green rating.  SMC has stated that its main analysis will focus on NHS costs and savings and on health benefits measured in QALYs, but its guidance states that other factors are not ruled out and may be considered.

7. Low number of appeals against HTA decisions.  The report was based on calendar year 2009 in which there were no appeals at SMC (called ‘independent reviews of process’).  This is given a red rating.  Amber would have been a few appeals and green some successful appeals.  NICE was rated as green because it has had 58 appeals in 10 years, 23 of which have been successful.  It seems strange that a system that generates no appeals is given a worse rating – would the world have been a better place if SMC had said no to all fringe cases and then reversed some of those decisions on appeal?  The CRA rating ignores the fact that companies can resubmit their case in the SMC system at a time of their choosing after meeting with the SMC to discuss ‘not recommended’ guidance.  Around 50% of resubmissions are accepted.

8. Guidance doesn’t identify gaps in evidence base requiring further research.  SMC do not see this as being part of their remit.  An amber rating involved providing some suggested additional evidence requirements without being specific.

9. Product (medicine) should be accessible / reimbursable before guidance is issued.  The report says that in theory medicines can be used before guidance is issued but that in practice local decision-makers will restrict use.  SMC aim to produce their guidance very rapidly after the licensing decision so there is no period of ‘planning blight’ with no guidance in place; indeed, where a company delays too long before submitting, interim ‘not recommended’ guidance is issued, and the company can still submit subsequently.  The speed of the SMC system is widely admired and it seems strange to have a red rating on this criterion.

10. Value of HTA to the health care system should be measured.  Scotland was assessed as having no evidence of measuring the value to the healthcare system.  An amber rating required some evidence and green rating required systematic measurement.  Sweden was awarded a green rating for making estimates of the savings to be realised from its evaluations of therapeutic classes.

Summary.  Of the 10 red ratings, 4 were outside the SMC’s remit.  Three others would have cost/time implications (providing advice before submissions, identifying research gaps, estimating the value to the healthcare system).  Two red ratings are highly debatable: it is not clear why having no appeals is taken as a mark of failure, and it is not clear why having medicines available before national guidance is issued is a good thing – both these criteria could be seen as coming from the pharmaceutical companies’ point of view.  A further criterion, relating to societal costs and benefits, is acknowledged by CRA to be ‘controversial’.


Amber gamblers?

In addition to the red ratings, the 7 criteria where SMC was rated as amber were as follows:

1. HTA is conducted independently of parties with a vested interest in the outcome – SMC was rated as “sometimes influenced by payers or other parties with a vested interest in the outcome”, whereas a green rating would have meant that HTA was conducted independently of payers and other parties with a vested interest in the outcome.  CRA seem to have defined a vested interest purely in terms of the involvement of payers (e.g. NHS boards in Scotland or PCTs in England).  This ignores other partners at the table, such as the ABPI, which has three seats on SMC, and it goes against the preferred model of partnership working.

2. HTA considers unpublished trial data – SMC was rated as considering unpublished data in limited circumstances, whereas a green rating would have been “routine consideration of unpublished data where appropriate”.  This seems to have been based on CRA’s interpretation of a comment in SMC’s guidance to submitting companies.  CRA does not quote the comment, but the gist seems to be that where a published paper and an unpublished paper report the same thing, the published version will be preferred.  It’s hard to see why SMC is wrong about this: surely a paper published in a peer-reviewed journal is preferable, other things being equal?  In any case, SMC has a long record of considering unpublished data, including some that could not be published in its final guidance, so the amber rating seems harsh.

3. HTA takes into account costs to the public purse, and non-healthcare and indirect costs and benefits to patients and society.  This rating is based on non-healthcare factors being allowed in guidance to companies, whereas the green rating was given when submissions were required to include this information.  As stated above, this is not currently part of SMC’s remit.

4. Proportion of assessments which are re-evaluations.  This relates to whether there would be a re-evaluation when there is new evidence (not a resubmission following an initial ‘not recommended’ decision).  An amber rating implies a few re-evaluations were carried out, whereas a green rating would require routine re-evaluation.  Pharmaceutical companies can resubmit their case whenever new evidence becomes available on a product, so it is not clear that Scottish patients are losing out.

5. Relationship between HTA and reimbursement restrictions.  Scotland was rated as having a system where this relationship was ‘somewhat’ defined; a green rating would have required a formal and clear definition.  CRA looked at decisions by three Scottish NHS boards for the 12 case study medicines and found that some had not made a decision about whether to accept a medicine locally several months after SMC guidance had been issued, and that in two cases local NHS boards had decided not to recommend a medicine accepted by SMC.

6. Impact on diffusion.  SMC was rated amber because there was said to be evidence of a negative impact of HTA on medicines diffusion in some cases.  A green rating would have required no evidence of a negative impact.  The basis seems to be the claim that Scotland has lower survival rates than other countries – while this was not spelled out, it seems to have been linked to other evidence that Scotland says no more often to medicines, or is more restrictive when it says yes.  At best, this point seems controversial.

7. Explicit treatment of innovation.  This criterion had three tests: early dialogue with companies, transparent assessment process, and explicit consideration of innovation.  SMC was assessed as achieving 2 of these.  A green rating required all three be achieved.

Summary.  Of the 7 amber ratings, SMC could take issue with 3 (the suggestion that it is sometimes influenced by vested interests, the view that it does not routinely consider unpublished evidence, and the rating on re-evaluations, given that companies can resubmit whenever new evidence becomes available).  The evidence base for a negative impact of SMC decisions on diffusion was not available and is questionable.  The link to reimbursement decisions is a matter for the Scottish Government Health Department.  SMC has decided not to hold early dialogue with companies about submissions.


Overview
There is much that is positive for SMC and Scotland in the CRA report.  The Scottish system reaches similar decisions to other agencies but does so at least as quickly (and far more quickly than some) and much more cheaply.  The CRA report highlights several aspects of good practice in the SMC’s way of working.
Where the ratings are not so good, Scotland needs to understand why.  There are some ratings that are highly debatable, some that apply at a Scotland level outside of SMC’s remit, and others that are outside of SMC’s agreed pattern of working.
The report is designed to be the basis for a repeat exercise in future years. CRA and its sponsors are to be congratulated on an excellent starting report, full of useful information and comparisons.  Some of the criteria (and the way they were applied) could be usefully debated before the second version is commissioned.
