Health Care Financ Rev. 1986 Winter; 8(2): 45–52.

Abstract

The Health Care Financing Administration is in the process of designing a competitive bidding model for the purchase of outpatient clinical laboratory services. One segment of this process involves the development of a relative value scale (RVS). The RVS could be used as part of the bidding process and as the basis of payment. The RVS could also be used as the basis of a national fee schedule, as stipulated in the Deficit Reduction Act of 1984. Potential problems with the development of an RVS from local (carrier) fee schedules for outpatient clinical laboratory services were investigated.

Introduction

Today, more than 4,000 independent clinical laboratories receive Medicare reimbursement for outpatient tests performed on Medicare beneficiaries (Health Care Financing Administration, 1986). Their size, measured by volume of services, ranges from fewer than 50,000 tests to more than one-half of a million tests performed annually. Some laboratories perform tests in only one or two specialties; others perform tests in all specialties certified by Medicare. Besides independent laboratories (laboratories that are independent of a physician's office or hospital), both hospital laboratories and physician's offices perform tests for ambulatory Medicare patients (Health Care Financing Administration, 1984).

In 1984, the Laboratory Task Force of the Health Care Financing Administration (HCFA) estimated that Part B Medicare reimbursement for noninpatient diagnostic laboratory services in fiscal year 1984 would total approximately $1.6 billion. About 50 percent of that amount was expected to be paid to hospitals for outpatient testing. The remaining 50 percent ($800 million) was expected to be paid to independent laboratories and physicians. The task force also estimated that Part B expenditures for laboratory services would increase by 15-20 percent annually over the next several years, mostly as a result of increases in the volume of testing rather than increased prices.

Currently, HCFA is in the process of developing a competitive bidding model to be tested as an alternative purchasing and reimbursement method for clinical laboratory services. HCFA's Office of Research and Demonstrations is planning to test the model in six demonstration sites beginning in January 1987. One segment of this effort calls for the development of a relative value scale (RVS).

An RVS is a weighting instrument that assigns a discrete value to each procedure in a set of related procedures. The values indicate the relative expense, complexity, and/or worth of performing a particular procedure as compared with a selected standard procedure. Most often, an RVS is used as a reimbursement tool for setting relative prices. For example, assume that, with regard to radiology procedures, a chest X-ray has a relative value of 1 and a CAT (computerized axial tomography) scan of the head has a value of 25. If a price factor of $50 is specified, then the payments for a chest X-ray and head scan would be $50 and $1,250, respectively.
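To make the payment arithmetic concrete, here is a minimal Python sketch of the radiology example above. The relative values (1 and 25) and the $50 price factor come from the text; the dictionary keys and variable names are illustrative only.

```python
# Convert relative values into payments: payment = relative value x price factor.
relative_values = {"chest_xray": 1, "head_cat_scan": 25}  # hypothetical RVS weights from the example
price_factor = 50.0  # dollars per unit of relative value

payments = {procedure: rv * price_factor for procedure, rv in relative_values.items()}
print(payments)  # {'chest_xray': 50.0, 'head_cat_scan': 1250.0}
```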

When developed under this demonstration, the RVS could be used as part of the bidding process on which laboratory payments would be based. In addition, the developed RVS may have use as the basis of a national fee schedule for payment of outpatient clinical laboratory services, as stipulated in the Deficit Reduction Act of 1984.

It is our purpose in this article to investigate whether or not the fees paid by Medicare for a specified set of clinical laboratory procedures exhibit an underlying value structure. This investigation begins with a review of Medicare's reimbursement history for outpatient clinical laboratory services. Next, the set of clinical laboratory procedures is identified. The set of procedures is then analyzed, using various statistical methods, to determine if there is any underlying relationship among the procedures.

Reimbursement history

Medicare's reimbursement of clinical laboratory services has received much criticism in the past and a fair share currently. During the 1970's, reports by the Office of the Inspector General, the Subcommittee on Oversight and Investigations of the House Energy and Commerce Committee, and the General Accounting Office all contained recommendations for reform of Medicare's reimbursement system for laboratory services.

Although these reports tended to highlight fraudulent or abusive practices, two distinct problems were identified. First, laboratories billed Medicare at a higher rate than they billed physicians for the same service. Second, physician markups of the prices of clinical laboratory services performed outside their offices were often well in excess of the amount that the physician was charged by the laboratory (Health Care Financing Administration, 1984).

In 1980, with the passage of the Omnibus Reconciliation Act, an attempt was made to rectify these problems. The Act was also intended to enable the Medicare program to benefit directly from reduced rates charged to physicians by the laboratories performing the services. This legislation contained specific provisions for physician billing that prevented reimbursement for markups physicians added when billing for services performed by independent laboratories, thus allowing proper determinations of third-party reimbursement.

As a result, payments made for laboratory services after April 1, 1981, are subject to the following conditions, stated in 42 Code of Federal Regulations (CFR) 440 and 447.

  • If laboratory tests are performed by a physician or by personnel under his or her supervision, payment is made on the basis of the physician's reasonable charge for the service.

  • If laboratory tests are performed by an independent laboratory but are billed by a physician who identifies the laboratory and the amount the laboratory charged the physician for the service, payment is the lesser of the physician's actual submitted charge for the laboratory's service or the amount the laboratory charged the physician.

  • If a physician does not identify the laboratory or the amount it charged him or her for the test, payment is based on the lowest amount at which the Medicare carrier estimates the test could have been obtained by the physician from a laboratory serving the physician's locality.

Prior to June 1984, the amount reimbursed by Medicare to independent laboratories and laboratories based in physician's offices was based on reasonable charge calculations (i.e., the lowest of customary, prevailing, or actual charges). In addition, the reasonable charges for certain commonly performed laboratory procedures specified by Medicare could not exceed the lowest charge levels (i.e., the 25th percentile) at which these tests were widely and consistently available in the locality (42 CFR 405.511). This method of determining the amount of reimbursement was inherently inflationary. Although increases in charges may not have been reimbursed immediately, they became the historical data used to determine customary and prevailing charges the following year.

In section 2303 of the Deficit Reduction Act of 1984 (DEFRA), Congress amended the Social Security Act regarding payments for clinical diagnostic laboratory tests. The amendments affect clinical laboratory testing performed in physician's office laboratories and independent laboratories and testing by hospitals for their outpatients. The main purpose of this legislation was to contain the growth of Medicare Part B payments for clinical laboratory tests by altering payment methods and mandating assignment for independent and hospital laboratories. In addition, a waiver of coinsurance and deductibles was intended to reduce the administrative burden on laboratories of fee collection and to act as an incentive for physicians to accept assignment. At the same time, such a waiver was a means to reduce the out-of-pocket payments of Medicare beneficiaries.

DEFRA legislation also stipulated that claims for tests may be submitted only by the physician, independent laboratory, or hospital laboratory performing the tests. There were, however, two exceptions to that requirement:

  • When a test, at the request of a laboratory, is performed by another laboratory, payment may be made to the referring laboratory.

  • When a physician performs or supervises the performance of a laboratory test, payment may be made to another physician with whom he or she shares a practice.

This direct-billing requirement may aid in reducing the occurrences of physician markups for laboratory services.

Payments for clinical laboratory tests under DEFRA are based on a fee schedule. Actual payments made to physicians, independent laboratories, or hospital laboratories are the lower of either submitted charges or the fee schedule rate. Payment is made at 100 percent for assigned claims (whether assignment is voluntary or mandatory) and at 80 percent for unassigned claims (from physicians only, because only physicians have the option to reject assignment). Independent and hospital laboratories are required to accept assignment and are consequently reimbursed at 100 percent of the fee schedule. This reimbursement methodology was effective July 1, 1984.

Fee schedules are established on a carrier-wide basis. Fees are set at 60 percent of the prevailing charge levels for tests performed by independent laboratories and by physicians in their offices and at 62 percent of prevailing charge levels for tests performed by a hospital laboratory for outpatients. However, payment is set at 60 percent of the prevailing charge levels for tests performed by a hospital laboratory for nonhospital outpatients (i.e., persons for whom the hospital is acting only in the capacity of an independent laboratory). The resulting fee schedule is adjusted annually for changes in the Consumer Price Index and may be adjusted for technological changes and the relative difference between regional or local area wage rates.
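The payment logic in the two preceding paragraphs can be summarized in a short sketch. This is a minimal illustration, assuming only the lower-of rule and the 60/62 percent and 100/80 percent factors stated above; the dollar amounts and function names are hypothetical.

```python
def fee_schedule_amount(prevailing_charge: float, hospital_outpatient: bool) -> float:
    """Carrier fee: 62 percent of prevailing charges for a hospital laboratory's
    own outpatients, 60 percent otherwise (independent laboratories, physician's
    offices, and hospitals acting as independent laboratories)."""
    rate = 0.62 if hospital_outpatient else 0.60
    return rate * prevailing_charge

def medicare_payment(submitted_charge: float, fee: float, assigned: bool) -> float:
    """Pay the lower of the submitted charge or the fee schedule amount, at
    100 percent for assigned claims and 80 percent for unassigned claims."""
    allowed = min(submitted_charge, fee)
    return allowed if assigned else 0.80 * allowed

fee = fee_schedule_amount(prevailing_charge=10.00, hospital_outpatient=False)   # 6.00
print(medicare_payment(submitted_charge=7.50, fee=fee, assigned=True))          # 6.00
print(medicare_payment(submitted_charge=7.50, fee=fee, assigned=False))         # 4.80
```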

This reimbursement methodology was to be in effect until June 30, 1987. For tests furnished by physicians in their offices or by independent laboratories beginning July 1, 1987, the fee schedule was to be established on a nationwide basis. However, with the passage of the Consolidated Omnibus Budget Reconciliation Act in 1986, implementation of the national fee schedule was delayed until January 1, 1988. Whether the national fee schedule will apply to laboratory tests performed by a hospital for its own outpatients depends on the results of a study to be conducted by the Department of Health and Human Services and on further legislative action by the Congress.

Data

A good discussion of the various methods used to construct RVS's is provided by Hadley et al. (1983). Five basic methods are evaluated: charge-based methods, statistical cost function approaches, time-based models, microcosting and time/motion study methods, and consensus development/social preference methods. The authors contend that a meaningful RVS should incorporate the concept of value or worth, which is more inclusive and more subjective than either costs or complexity alone. Consequently, the payment for a service, in this case a clinical laboratory test, should be influenced not only by its cost and complexity but also by its benefit to the patient, its diagnostic utility, its implications for spending on other laboratory tests, and how it meets societal objectives, among other factors (Hadley et al., 1985). Charge-based methods are the most likely to reflect these factors. With this in mind, we analyzed 1984 Medicare fee schedules, which are charge based, to determine whether or not an underlying relative value relationship existed among fee schedule payments for a group of selected procedures.

A data base for the analysis was assembled from HCFA's 1983 Part B Medicare annual data (BMAD) procedure file and from 1984 fee schedules. (1983 Medicare prevailing charges were used as the basis of the 1984 Medicare fee schedules.) The 1983 procedure file contains information on all outpatient clinical laboratory procedures processed by Medicare that were performed in hospital, independent, or physician's office laboratories. The data elements for each procedure contained in the file include frequency, total submitted charges, total amount of Medicare allowed charges, and total amount of Medicare reimbursement.

Slight problems exist with this data base. Validation efforts by HCFA staff do not preclude the presence of errors in the carriers' counting of individual procedures. In addition, claims may have been pending and not included in the carriers' submitted frequency counts. Nonetheless, these limitations would not likely affect identification of the most frequently performed procedures.

The BMAD file data on hospital laboratories cover only procedures performed on people who were not hospital outpatients. (These are the instances in which the hospital laboratory acts as an independent laboratory.) The BMAD file may not accurately reflect the actual number of procedures performed by hospitals acting as independent laboratories.

The BMAD data file utilized for this preliminary study does not include any data on the type and frequency of outpatient clinical laboratory procedures performed in hospital laboratories for a hospital's own outpatients. It is unlikely that the addition of hospital procedure data would change the most frequently performed procedures as identified in this study. However, the relative rankings of these procedures based on performance frequency could change. The inclusion of such data would have little or no impact on this analysis because the methodology for development of carrier fee schedules does not include hospital charge data.

Finally, the 1983 BMAD file does not allow definitive determination of whether the provider or supplier who billed for service actually performed it. (As mentioned previously, that shortcoming was corrected with passage of the 1984 DEFRA legislation.)

The 1984 Medicare fee schedules analyzed are from 11 specific Medicare carriers across the country (Alabama, Arkansas, Florida, Maryland, Minnesota, Montana, Greater New York, North Dakota, South Carolina, Washington, and Wisconsin). Data and fee schedules from these carriers were used because they were all using the HCFA common procedure coding system (HCPCS). Thus, there was assurance that the coding of procedures was uniform across the carriers.

An initial review of the data revealed that more than 1,000 unique outpatient clinical laboratory procedures are designated as reimbursable under Medicare's fee schedule methodology. The 60 procedures selected for this study represent the procedures most frequently performed for Medicare beneficiaries. (See Figure 1 for a listing of the 60 procedures.) Those procedures account for more than 80 percent of the total volume of Medicare outpatient clinical laboratory services performed in hospital, independent, or physician's office laboratories. In addition, they represented approximately 80 percent of Medicare's total allowed charges for outpatient clinical laboratory services in the carrier areas analyzed for this study. (They actually accounted for 78.09 percent of the total allowed charges for services reimbursed on the fee schedule basis.) Each of the remaining fee schedule procedures accounted for less than 0.254 percent of the total volume and was therefore not included in this study.

Figure 1. Outpatient clinical laboratory procedures analyzed in this study and Health Care Financing Administration common procedure coding system (HCPCS) designation

Methods and results

In general, it was found that the Medicare fee schedule amounts for all of the 60 selected procedures ranged from a low of $1.80 to a high of $65.70 across the 11 carriers. The average fees for each of the individual procedures ranged from $3.54 to $30.86. These results confirm that wide variation existed in the fee schedule amounts. The coefficient of variation (CV), which is a measure of variability around the mean, was lowest for HCPCS code 85014 (blood count: hematocrit) at 8.5 percent and was highest for HCPCS code 80006 (automated multichannel test: 6 clinical chemistry tests) at 73.1 percent (Table 1).
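As an illustration of the summary statistics in Table 1, the following sketch computes the mean, minimum, maximum, and coefficient of variation (standard deviation divided by the mean) of one procedure's fee across carriers. The fee values shown are hypothetical, not the actual 1984 schedule amounts.

```python
import numpy as np

# One row of hypothetical fee schedule amounts for a single HCPCS code
# across the 11 carriers (not the actual 1984 data).
fees = np.array([3.00, 3.60, 3.50, 4.10, 3.60, 3.75, 3.60, 3.50, 3.80, 3.60, 3.40])

cv = fees.std(ddof=0) / fees.mean()  # coefficient of variation, as a proportion
print(f"mean={fees.mean():.2f}  min={fees.min():.2f}  max={fees.max():.2f}  CV={cv:.4f}")
```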

Table 1

Distributions of fee schedule amounts for the 60 highest volume Health Care Financing Administration common procedure coding system (HCPCS) clinical laboratory procedures: 11 Medicare carriers, 1984

HCPCS code    Mean    Minimum    Maximum    Coefficient of variation
80003 $10.85 $5.60 $23.40 .4803
80004 11.38 6.00 28.80 .5600
80005 13.14 6.00 28.80 .5426
80006 15.66 6.00 43.80 .7311
80007 14.05 7.20 33.40 .5011
80012 14.06 7.20 20.70 .2908
80016 15.85 9.60 24.30 .2870
80018 17.78 10.80 27.90 .3209
80019 17.99 12.60 26.60 .2317
81000 4.82 3.60 6.00 .1386
81002 3.54 2.98 5.10 .1913
82150 9.16 6.90 12.60 .1795
82270 3.80 3.00 5.40 .1984
82435 6.35 3.60 8.40 .2394
82465 6.56 4.80 7.80 .1544
82550 9.43 6.00 12.00 .1926
82552 15.86 4.20 23.40 .3757
82565 7.02 4.80 8.40 .1551
82640 19.87 17.40 24.00 .1089
82643 19.00 15.30 21.60 .1039
82756 15.28 11.50 18.20 .1708
82803 30.86 12.60 65.70 .4275
82947 6.13 4.20 7.80 .1848
82948 4.82 3.00 6.20 .1921
83615 8.94 6.00 12.60 .2403
83625 16.01 6.00 23.40 .3144
84132 6.95 4.80 9.00 .1828
84295 6.57 4.80 8.00 .1361
84420 18.26 14.40 22.20 .1101
84435 10.22 7.20 13.40 .1938
84443 22.68 17.30 27.00 .1370
84450 7.99 5.40 11.00 .2543
84478 8.48 4.80 13.90 .2813
84520 6.55 4.80 8.40 .1599
84550 6.63 4.35 8.00 .1527
85007 5.39 3.60 7.20 .1912
85014 3.64 3.00 4.10 .0846
85018 3.67 3.00 4.20 .0944
85021 8.34 6.00 12.00 .2581
85022 9.35 6.00 13.50 .2657
85028 10.40 6.00 19.95 .3922
85031 9.85 6.60 14.60 .2663
85044 6.24 4.50 7.20 .1224
85048 3.85 3.00 4.20 .0994
85580 6.21 4.50 8.70 .1642
85595 6.41 4.80 9.00 .1613
85610 6.11 4.20 7.20 .1323
85650 5.15 3.60 6.00 .1290
85651 5.36 3.60 6.50 .1546
85730 8.33 4.98 11.70 .2199
86151 24.20 21.00 27.30 .0933
86592 5.56 3.84 6.60 .1364
87040 13.47 9.00 19.62 .2530
87070 12.19 9.00 17.60 .1859
87086 11.45 8.40 15.00 .1815
87101 10.46 4.80 18.00 .3760
87184 9.53 6.00 16.40 .3114
87205 5.58 3.00 6.60 .2081
88150 7.12 6.00 8.40 .1322
89205 4.05 1.80 6.00 .3195

The first step of the investigation was to determine if any type of relationship existed among the procedures across the carriers. An identified relationship would indicate uniformity, which is essential if an RVS is to have universal applicability. One approach used to make such a determination is to calculate Pearson correlation coefficients. This is accomplished by making simple pairwise (carrier versus carrier) comparisons of the actual data values, which in this case are the Medicare fee schedule amounts.

The correlation coefficient (r) is a summary measure of the similarity of the prices across the carriers for each procedure. The degree of similarity is represented by the value of the coefficient, which has a range of -1 to 1. Because a direct relationship across the carriers is expected, the r values should be positive (i.e., range from 0 to 1). The closer the values are to 1, the greater the similarity, and vice versa. A Pearson correlation coefficient equal to 1 indicates that the data sets being compared are perfectly correlated. In this instance, it would imply that either the fee schedules of the two carrier areas being compared are exactly the same or that the fee for each procedure in one carrier area differs from the fee in the other carrier area by a constant factor or amount. For example, one carrier area's fee for each procedure could be either 10 percent or 2 dollars above or below the corresponding fee of the comparison carrier area, and the Pearson correlation would be 1.
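A minimal sketch of this pairwise comparison, assuming the fee schedules are held in a pandas DataFrame with one row per procedure and one column per carrier (the column names and values below are hypothetical):

```python
from itertools import combinations

import pandas as pd
from scipy.stats import pearsonr

def pairwise_pearson(fees: pd.DataFrame) -> pd.Series:
    """Pearson's r for every pair of carrier columns (carrier versus carrier)."""
    results = {}
    for a, b in combinations(fees.columns, 2):
        r, _ = pearsonr(fees[a], fees[b])
        results[(a, b)] = r
    return pd.Series(results)

# Hypothetical fees for three carriers and four procedures:
fees = pd.DataFrame({
    "Alabama": [4.80, 6.10, 9.20, 15.70],
    "Florida": [5.00, 6.00, 9.00, 16.00],
    "Montana": [3.60, 7.20, 6.90, 10.80],
})
print(pairwise_pearson(fees))
```

The same set of coefficients could also be obtained with fees.corr(), which computes Pearson correlations for all column pairs at once.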

The calculated Pearson correlations for the fee schedules range from 0.51 (Greater New York versus North Dakota) to 0.95 (Alabama versus Florida). Overall, 69 percent of the pairwise correlations have values of 0.75 or greater (Table 2).

Table 2

Correlations (Pearson's r) of fee schedule amounts for the 60 highest volume Health Care Financing Administration common procedure coding system (HCPCS) clinical laboratory procedures: 11 Medicare carriers, 1984

Carrier (State)    Alabama    Arkansas    Florida    Maryland    Minnesota    Montana    New York    North Dakota    South Carolina    Washington
Arkansas .9450
Florida .9492 .8816
Maryland .9062 .8870 .9170
Minnesota .8026 .6652 .8467 .7218
Montana .7887 .7895 .8160 .8029 .6120
New York .8429 .7773 .8772 .8403 .6877 .6048
North Dakota .7075 .7071 .7452 .7259 .6065 .8550 .5129
South Carolina .8576 .7926 .9136 .9098 .7833 .7614 .8476 .6737
Washington .8378 .8867 .7568 .8071 .5140 .5908 .7677 .5446 .6772
Wisconsin .8830 .8754 .8934 .8803 .6817 .8241 .7549 .7615 .8132 .8021

Another method used to test for the presence of an underlying relationship is the Spearman rank-order correlation. This method requires the use of rankings rather than the absolute values of the variables. Thus, each procedure's fee schedule amount for each carrier is ranked from lowest to highest. Those rankings are then compared on a pairwise basis across carriers. The computed correlation coefficient (R) again is a summary measure, in this case, a measure of the rankings of the procedures across carriers. The coefficient can vary from -1 to 1, but because a direct relationship is expected, the values should be positive. As with the Pearson correlation, the closer the R values are to 1, the greater the similarity of the rankings. A Spearman correlation coefficient that is equal to 1 implies that the rank orderings of the two data sets being compared are identical. In this particular case, a value of 1 would signify that when the procedures of two carrier areas are arranged by price, the order of the procedures is the same for each area.
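The rank-based variant differs from the previous sketch only in the correlation call; scipy's spearmanr ranks each carrier's fees internally before correlating. The data layout (rows = procedures, columns = carriers) remains an assumption carried over from the earlier sketch.

```python
from itertools import combinations

import pandas as pd
from scipy.stats import spearmanr

def pairwise_spearman(fees: pd.DataFrame) -> pd.Series:
    """Spearman's R for every pair of carrier columns, based on rank orderings."""
    results = {}
    for a, b in combinations(fees.columns, 2):
        rho, _ = spearmanr(fees[a], fees[b])
        results[(a, b)] = rho
    return pd.Series(results)

# Equivalently: fees.corr(method="spearman")
```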

The Spearman correlations that were calculated indicate a relationship among the 60 procedures that is even stronger than that indicated by the Pearson correlations. Using this method, the correlations range from 0.69 (Greater New York versus North Dakota) to 0.94 (Florida versus Wisconsin), with 91 percent of the correlations at the 0.75 level or greater (Table 3).

Table 3

Correlations (Spearman's R) of fee schedule amounts for the 60 highest volume Health Care Financing Administration common procedure coding system (HCPCS) clinical laboratory procedures: 11 Medicare carriers, 1984

Carrier (State)    Alabama    Arkansas    Florida    Maryland    Minnesota    Montana    New York    North Dakota    South Carolina    Washington
Arkansas .9099
Florida .9102 .9191
Maryland .8742 .9016 .8970
Minnesota .8883 .8772 .8901 .8738
Montana .8794 .9279 .9152 .9105 .8750
New York .8156 .8044 .8285 .7771 .7272 .7428
North Dakota .8504 .8840 .8845 .8734 .9132 .9292 .6869
South Carolina .7973 .8511 .8538 .8225 .7660 .8061 .7086 .8241
Washington .8490 .8957 .8765 .8880 .8420 .8244 .8386 .8059 .7314
Wisconsin .9096 .9388 .9420 .9101 .8953 .9212 .7893 .8932 .8438 .8958

Next, each fee schedule amount was weighted by the volume of procedures performed in each carrier area. By weighting the fee schedule amounts, the distribution of procedures was taken into account, thereby placing greater emphasis on the high-volume procedures. As a result, the Pearson correlation coefficients increased. The correlations computed in this instance ranged from 0.60 (Greater New York versus North Dakota) to 0.96 (Alabama versus Arkansas). Again, the majority (82 percent) of the correlations were at the 0.75 level or greater (Table 4). These results were not dramatically different from those found for the unweighted fee schedule amounts.
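One way to carry out the volume weighting is a weighted Pearson correlation, in which each procedure's contribution is proportional to its claim volume. The study does not spell out the exact weighting formula, so the sketch below is an assumption: weights are normalized procedure volumes, and weighted means, variances, and covariance replace their unweighted counterparts.

```python
import numpy as np

def weighted_pearson(x: np.ndarray, y: np.ndarray, w: np.ndarray) -> float:
    """Pearson's r with per-procedure weights w (e.g., claim volumes)."""
    w = w / w.sum()                      # normalize the weights
    mx, my = np.sum(w * x), np.sum(w * y)
    cov = np.sum(w * (x - mx) * (y - my))
    sx = np.sqrt(np.sum(w * (x - mx) ** 2))
    sy = np.sqrt(np.sum(w * (y - my) ** 2))
    return cov / (sx * sy)
```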

Table 4

Correlations (Pearson's r) of fee schedule amounts weighted by volume of claims per carrier for the 60 highest volume Health Care Financing Administration common procedure coding system (HCPCS) clinical laboratory procedures: 11 Medicare carriers, 1984

Carrier (State)    Alabama    Arkansas    Florida    Maryland    Minnesota    Montana    New York    North Dakota    South Carolina    Washington
Arkansas .9609
Florida .9537 .9321
Maryland .9407 .9272 .9383
Minnesota .8231 .7482 .8245 .7342
Montana .8976 .9046 .8904 .8919 .7321
New York .8789 .8509 .9035 .8646 .7075 .7320
North Dakota .7789 .7838 .7925 .7571 .7105 .8977 .6001
South Carolina .9452 .9246 .9377 .9314 .8464 .8851 .8695 .7889
Washington .8811 .8864 .8401 .8477 .6845 .7630 .8684 .6770 .8689
Wisconsin .9066 .8729 .8756 .9068 .7474 .8745 .7718 .7962 .9064 .8286

These results indicate that a strong relationship exists by procedure among the fee schedule amounts in the 11 carriers that were analyzed; that is, certain procedures are consistently more expensive, and others are consistently less expensive. Being able to establish that strong payment relationships exist by procedure across carriers is the first step necessary for the development of an RVS that could be used as the basis of a national fee schedule or as part of the competitive bidding demonstration.

Because these high correlations were found, a method was tested for constructing an RVS-type standardized fee schedule for the 60 procedures. The technique used in this analysis was to divide the individual fee schedule amounts for each procedure for each carrier by the local fee schedule amount for a specific high-volume procedure. In doing so, an assumption was made that the fee associated with that high-volume procedure closely approximated the procedure's true economic cost (i.e., the cost of production plus a normal profit margin). Consequently, the procedure might be used as the standard of comparison among the 60 procedures. The two highest volume procedures selected as numeraires for this analysis were HCPCS 81000 (urinalysis: routine, with microscopy) and HCPCS 82947 (glucose, except urine). These procedures accounted for more than 13 percent and almost 8 percent, respectively, of the total volume of Medicare laboratory procedures performed on an outpatient basis in independent, hospital, or physician's office laboratories during 1983.
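A minimal sketch of this numeraire adjustment, assuming the same table layout as in the earlier sketches (rows = HCPCS codes, columns = carriers): each carrier's fees are divided by that carrier's fee for the chosen numeraire procedure.

```python
import pandas as pd

def relative_weights(fees: pd.DataFrame, numeraire: str) -> pd.DataFrame:
    """Divide each carrier's fee schedule by its fee for the numeraire code,
    yielding carrier-specific relative value weights."""
    return fees.div(fees.loc[numeraire], axis="columns")

# Example usage (with a fee table loaded elsewhere):
# rel = relative_weights(fees, numeraire="81000")
# rel.mean(axis="columns")                                     # average relative weight per procedure
# rel.std(axis="columns", ddof=0) / rel.mean(axis="columns")   # CV of the adjusted weights
```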

Using the HCPCS 81000 code as the numeraire for these adjustments, the carrier-specific relative weights for each of the 60 procedures ranged from a low of 0.33 to a high of 11.90; the average relative value weights across carriers ranged from a low of 0.74 to a high of 6.40. When using HCPCS 82947, the carrier-specific relative value weights ranged from 0.26 to 9.52; the average relative value weights across the carriers had values ranging from 0.59 to 5.14.

Unfortunately, as a consequence of these adjustments, variability increased across the 11 carriers. Specifically, the minimum CV of the adjusted figures rose from 8.5 percent to 10.6 percent when using HCPCS 81000 as the numeraire and to 15.4 percent for HCPCS 82947. More importantly, the CV's increased 80 percent of the time as a result of these adjustments. For both numeraires tested for developing an RVS, it was determined that such adjustments increased the differences among the 11 carriers' fee schedule amounts rather than decreasing them.

Further analysis of the data continued with intercarrier comparisons. Such comparisons can establish whether certain carriers had uniformly high or uniformly low fee schedule payment amounts. This was expected to occur. However, it was also anticipated that by making adjustments through the use of numeraires the fee schedule payment amounts would become more comparable among the 11 carriers. To test that hypothesis, the Friedman test, which is a two-way analysis of variance by ranks, was used. Using this particular statistical testing technique, we ranked the carriers within each procedure from lowest to highest. Next, we established an aggregate ranking for each carrier based on its ranking within each individual procedure.
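A minimal sketch of this step, again assuming the fee table layout used above. scipy.stats.friedmanchisquare treats each carrier column as a related sample and the procedures as the repeated blocks, which matches the ranking-within-procedure description; the average rank per carrier can be computed directly with pandas.

```python
import pandas as pd
from scipy.stats import friedmanchisquare

def friedman_across_carriers(fees: pd.DataFrame):
    """Friedman chi-square statistic and p value, with carriers as the
    groups being compared and procedures as the blocks."""
    return friedmanchisquare(*(fees[carrier] for carrier in fees.columns))

def average_ranks(fees: pd.DataFrame) -> pd.Series:
    """Rank the carriers within each procedure (lowest fee = rank 1),
    then average each carrier's ranks across all procedures."""
    return fees.rank(axis="columns").mean()
```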

The findings indicate a highly significant difference in the relative positions of the carriers included in this study (chi-square = 155). Use of the total data set without the RVS adjustments just described resulted in a range of average ranks from 3.09 (South Carolina) to 8.37 (North Dakota), with an overall mean rank of 6.00. The RVS-adjusted figures emphasized the differences among the rankings to an even greater degree. The HCPCS 81000 adjustment resulted in a chi-square score of 221, and the HCPCS 82947 adjustment gave a chi-square score of 208. In essence, this means that the adjustments described could not account for the differences in the fee schedules.

To test the sensitivity of these results, we attempted to identify a subset of the 60 procedures that might provide a more consistent picture across the carriers with regard to the relative pricing of the procedures. Again, a two-way comparison (Friedman test) was performed, but this time on successively smaller subsets of procedures. The subsets were selected by retaining only procedures with an unadjusted CV of: (a) 30 percent or less (44 procedures); (b) 25 percent or less (36 procedures); and (c) 20 percent or less (31 procedures).
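A brief sketch of this sensitivity analysis, reusing the friedman_across_carriers helper from the previous sketch; the CV thresholds follow the text, while the function name and data layout remain assumptions.

```python
import pandas as pd

def cv_subset(fees: pd.DataFrame, threshold: float) -> pd.DataFrame:
    """Keep only procedures whose unadjusted coefficient of variation across
    carriers is at or below the threshold (expressed as a proportion)."""
    cv = fees.std(axis="columns", ddof=0) / fees.mean(axis="columns")
    return fees[cv <= threshold]

# for t in (0.30, 0.25, 0.20):
#     print(t, friedman_across_carriers(cv_subset(fees, t)))
```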

In all of the two-way comparisons performed on the subsets, significant differences were found among the 11 carriers analyzed. The chi-squares for the unadjusted figures were 122 for (a), 98 for (b), and 83 for (c). These differences persisted and intensified when the RVS-adjusted values were substituted. For HCPCS 81000, the chi-squares were 194 for (a), 156 for (b), and 137 for (c); for HCPCS 82947, the chi-squares were 186, 170, and 162, respectively. Consequently, even with subsets of procedures that were selected because of their low CV's, the disparity of the fee schedule amounts among the carriers could not be explained. This sensitivity analysis indicated substantial differences among the payment amounts in the fee schedules of the carriers studied, even when only relatively price-homogeneous procedures were used.

Discussion

In the preceding analysis, we have presented strong evidence that there is a high correlation across procedures in the fee schedules of the 11 carriers studied. This means that the high-cost procedures of one carrier are also likely to be high-cost procedures in other carrier service areas; the same holds true for low-cost procedures. At the same time, however, there is substantial diversity across those same carriers in the payment amounts for each procedure. Attempting to adjust the local fee schedule amounts by the technique described increases the differences rather than decreasing them. When procedures that are more homogeneous across carriers in terms of their payment amounts (i.e., with low CV's) are used, improvements are seen. However, there is not enough improvement to make the differences statistically insignificant.

Some of the observed variance may be attributable to Medicare's current payment methodology. Although adjusted periodically for inflation, the methodology does not account for technological advances associated with the performance of tests. As a result, current payments for laboratory services are likely to reflect the costs and complexities of dated technologies. Thus, fee schedule payment differences across carriers tend to increase disproportionately to costs over time.

Additionally, unique carrier market area characteristics may account, in large measure, for the variance observed. For instance, some carrier areas utilized for this study contain several large metropolitan areas, whereas other carrier areas could be considered predominantly rural. It could be hypothesized that transportation costs are higher for laboratories located in rural carrier areas. Consequently, rural area fee schedules might reflect those higher costs. Further, the type of laboratory in which a procedure is performed (e.g., an independent laboratory or a physician's office laboratory) could also account for pricing differences. As a result, fee schedule amounts may be different in carrier service areas dominated by independent laboratories than in areas where physician's office laboratories are predominant. Unfortunately, the data necessary to test these hypotheses are not yet available. Further, the data base for such a test would necessarily have to be larger than the one used in this study.

Two major implications with regard to the development of a national fee schedule result from this study. First, establishment of a national fee schedule based on an inconsistent RVS could dramatically affect, either positively or negatively, the financial viability of providers or suppliers of clinical laboratory services. The net effect on an individual provider or supplier would depend on the relative value relationships among the procedures.

Second, the techniques for adjusting for carrier payment differences that were tested in this study are inadequate for the identification of an RVS to be used in the competitive bidding process or as the basis of a national fee schedule. Other techniques, such as using the average price of the 10 most frequently performed procedures or of the lowest priced procedures as a numeraire, may prove to be more appropriate. Another technique would be to group the procedures by specialty (i.e., hematology, chemistry, microbiology, etc.) and to perform the analysis separately for each group. However, before any other technique is tested, the hypothesis regarding local market features, mentioned earlier, should be addressed.

In conclusion, the findings indicate that a substantial difference exists among the relative carrier fee schedule amounts for the 60 procedures studied. Consequently, an underlying value structure could not be identified utilizing the techniques described. Nonetheless, an RVS to be used as the basis of a national fee schedule could be developed for the 60 procedures by taking the average or median of the fee schedule amounts. However, it is unlikely that such an RVS would accurately reflect the relative expense, complexity, and/or worth of each particular procedure.

With that in mind, a properly designed competitive bidding model could aid in establishing prices that are relatively proportional to costs as long as no artificial bid submissions (such as could result from collusion among bidder laboratories) are made. The established prices could then be analyzed by the statistical methods described in order to evaluate whether an underlying value structure exists that could be used as the basis of a national fee schedule.

Acknowledgments

We wish to express our gratitude to Earl Swartz, Mike Herman, Edye Fisher, Sarah Sullivan, and Pat Ramsey of HCFA's Bureau of Data Management and Strategy, without whose help this article would not have been possible. Special thanks go to Arlene Bradt and Beverly Schrader for typing the manuscript.

Footnotes

Reprint requests: Paul A. Gurny, Health Care Financing Administration, 2302 Oak Meadows Building, 6325 Security Boulevard, Baltimore, Maryland 21207.

NOTE: All of the statistically significant findings referred to in this study had p values of 0.001 or less.

References

  • Health Care Financing Administration. Report of Laboratory Task Force. Feb. 1984.
  • Health Care Financing Administration. Medicare, Medicaid Automated Certification System. Mar. 1986.
  • Hadley J, Juba D, Berenson R, et al. Alternative Methods of Developing a Relative Value Scale of Physicians' Services: Year 1 Report. Washington, D.C.: Urban Institute; Feb. 1983. Contract No. HCFA-500-81-0053. Prepared for the Department of Health and Human Services.
  • Hadley J, Juba D, Berenson R, et al. Final Report on Alternative Methods of Developing a Relative Value Scale of Physicians' Services. Washington, D.C.: Urban Institute; Apr. 1985. Contract No. HCFA-500-81-0053. Prepared for the Department of Health and Human Services.
