The Changing Tides of Immunosuppression

Post by:
Aimen Liaqat, MD 
Nephrology Fellow, Cleveland Clinic

We have come a long way since the first successful kidney transplant in 1954, when Ronald Herrick donated a kidney to his identical twin brother Richard at the Peter Bent Brigham Hospital (now Brigham and Women’s Hospital) in Boston. This historic breakthrough permanently shaped the future of organ transplantation as we know it today. The progress since then is best appreciated as an evolution over four distinct periods.

Success in the early period, from 1954 to 1962, relied on a precise immunological match in the absence of effective immunosuppression. Most transplants during this time occurred between HLA-identical siblings. The limited treatment arsenal included steroids, 6-mercaptopurine, and total body irradiation. During the first attempts at kidney transplantation between immunologically dissimilar individuals, 10 of the 11 patients died of severe infections following lymphoid irradiation. The lone survivor was treated with steroids, achieved successful graft function, and survived for 20 years. This was the first time the genetic barrier was breached, and it paved the road toward long-term immunosuppression.

The second period, from 1962 to the 1980s, was defined by the use of azathioprine. Calne and Zukoski were the first to use azathioprine, demonstrating improved canine kidney transplant survival. Human studies during this period showed one-year acute rejection and kidney graft survival rates of around 50%. In 1963, Murray et al reported prolonged survival in 13 patients treated with drugs alone (azathioprine and steroids) to suppress the immune system, without the use of any irradiation or bone marrow suppression. Meanwhile, a biological test to aid donor selection was beginning to evolve. In 1969, a study by Ramon Patel and Paul I. Terasaki showed that the presence of preformed cytotoxic antibodies against the donor (a positive crossmatch) appeared to be a strong contraindication to transplantation. Efforts during this period also focused on overcoming the logistic barriers of obtaining and preserving deceased donor organs, expanding the approach to transplantation beyond living donation.

The third period started in 1983 with the introduction of the calcineurin inhibitor cyclosporine. A large Canadian randomized controlled trial compared cyclosporine and prednisone with standard therapy of azathioprine and prednisolone in deceased donor kidney transplant recipients. It concluded that patients treated with cyclosporine had better graft and patient survival at three years, despite reduced but stable kidney function. Implementation of this discovery led to rejection rates falling below 50% and graft survival exceeding 85% during this decade. Disappointingly, no major immunosuppression advances made it into standard clinical care over the following decade.

The final period began in the 1990s and continues into the current era, encompassing the discoveries that eventually culminated in modern-day induction regimens and standard triple-therapy maintenance immunosuppression.

In 1995, a European randomized controlled trial (the European Mycophenolate Mofetil Cooperative Study Group) compared mycophenolate mofetil (MMF) with placebo in patients already on cyclosporine and corticosteroids. It demonstrated that MMF reduced the rate of biopsy-proven rejection and other forms of treatment failure during the crucial first 6 months post-transplantation. The Tricontinental Mycophenolate Mofetil Renal Transplantation Study Group further investigated the efficacy and safety of MMF (at both 3 g and 2 g doses) versus azathioprine, alongside equivalent doses of cyclosporine and steroids. Biopsy-proven rejection occurred in 15.9% of patients in the MMF 3 g group, 9.7% in the MMF 2 g group, and 35.5% in the azathioprine group. High-grade rejection (Grade II and higher) and treatment failure episodes were more frequent in the azathioprine group. This pivotal publication led to the widespread use of MMF in the standard immunosuppressive regimen we use today.

In 1997, induction therapy with basiliximab was introduced. At the time, available immunosuppression regimens were still far from ideal, and acute rejection episodes were reported in 30-50% of patients. The pivotal trial showed that prophylaxis with 20 mg of basiliximab on day 0 and day 4 significantly reduced the incidence of acute rejection episodes (29.8% in the basiliximab group versus 44% with placebo).

The ELITE SYMPHONY trial in 2007 evaluated the efficacy of four immunosuppressive regimens: standard-dose cyclosporine, low-dose cyclosporine (trough 50-100 ng/mL), low-dose tacrolimus (0.1 mg/kg/day), or low-dose sirolimus (3 mg daily), in combination with daclizumab induction, mycophenolate mofetil, and corticosteroids. This was an attempt to reduce the nephrotoxic effects of calcineurin inhibitors such as cyclosporine and tacrolimus while maintaining efficacy in terms of acute rejection, patient survival, and allograft survival. At 6 and 12 months, the incidence of biopsy-proven acute rejection (excluding borderline cases) in the low-dose tacrolimus group was half that of the standard-dose and low-dose cyclosporine groups, and approximately one third that of the low-dose sirolimus group. Allograft survival in the low-dose tacrolimus group was also significantly higher than in the standard-dose cyclosporine and low-dose sirolimus groups. Since its publication, this trial has guided our current standards for routine maintenance immunosuppression.

The FREEDOM study group compared steroid-free, steroid-withdrawal (at day 7), and standard steroid-based regimens, all with cyclosporine, mycophenolate sodium, and basiliximab. The results showed that the median 12-month GFR was not significantly different in the steroid-free or steroid-withdrawal groups. However, both of these groups had a significantly higher risk of early biopsy-proven acute rejection. Of note, the steroid-free and steroid-withdrawal regimens were associated with reduced de novo use of antidiabetic and lipid-lowering drugs, lower triglyceride levels, and less weight gain, suggesting that for standard-risk transplants steroid avoidance may be desirable if it can be coupled with an equivalent acute rejection risk.

The CONVERT study in 2009 evaluated conversion from calcineurin inhibitors (CNIs) to sirolimus to further minimize CNI nephrotoxicity. Use of sirolimus in patients with a baseline GFR greater than 40 mL/min was associated with excellent patient and graft survival, no difference in biopsy-confirmed acute rejection, increased urinary protein excretion, and a lower incidence of malignancy compared with CNI continuation. On-therapy analysis showed superior renal function among patients who remained on sirolimus through 12 to 24 months, particularly in the subgroup with a baseline GFR greater than 40 mL/min and a urine protein-to-creatinine ratio (UPCR) of 0.11 g/g or less.

The 3C study was a pragmatic randomized controlled trial that compared alemtuzumab (a potent lymphocyte-depleting antibody) as induction treatment versus standard basiliximab-based treatment. All participants received tacrolimus and mycophenolate; steroids were given only to those assigned to basiliximab. Those with a functioning transplant between 5 and 7 months post-transplantation were eligible for a second randomization to tacrolimus- versus sirolimus-based maintenance therapy. The results showed that alemtuzumab induction reduced the risk of biopsy-proven acute rejection (7% in the alemtuzumab group versus 16% in the basiliximab group during the first 6 months). The sirolimus conversion group had no difference in eGFR, but a higher proportion of biopsy-proven acute rejection (14.7% vs 3%) and serious infections (48.2% vs 35.5%) at 18 months compared with the tacrolimus group.

The Harmony trial in 2017 compared the efficacy of two induction agents (rabbit antithymocyte globulin versus basiliximab) at permitting rapid steroid withdrawal in the first year post-transplant within tacrolimus- and MMF-based regimens. Results showed that rabbit antithymocyte globulin (rATG) was not superior to basiliximab for allograft and patient survival after rapid steroid withdrawal. However, patients with a low immunological risk profile may benefit from rapid steroid withdrawal through a lower incidence of post-transplantation diabetes.

The BENEFIT and BENEFIT-EXT trials included standard criteria and extended criteria organs, respectively. These trials explored the use of belatacept along with MMF and steroids as maintenance therapy. Both the more intensive and less intensive belatacept regimens were non-inferior to cyclosporine for graft survival at 1 year; however, the mean measured and calculated GFR were approximately 13-15 mL/min higher in the belatacept groups than in the cyclosporine group, and there was less biopsy-proven chronic allograft nephropathy at 12 months. Acute rejection at 12 months was higher in the belatacept groups, with histologically more severe rejections. Donor-specific antibodies (DSA) were significantly less frequent with both belatacept regimens than with cyclosporine, and those that developed were class I DSA, which are associated with better outcomes than class II.

Belatacept-based immunosuppression regimens have been used in clinical practice since July 2011, but one limitation of these trials was that they did not compare belatacept with the standard of care, tacrolimus. A more detailed overview of the key belatacept trials is available in our previous timeline – here. A retrospective analysis from the Emory group compared patients treated with belatacept with a historical cohort receiving a tacrolimus-based protocol. They compared the historical tacrolimus cohort with a CNI-free belatacept group and combined belatacept/tacrolimus groups (a short 5-month overlap and a long 11-month overlap). Rejection rates with the combined belatacept/tacrolimus regimens were significantly lower than with CNI-free, belatacept-based immunosuppression, and those who received the long 11-month overlap had rejection rates comparable to standard tacrolimus triple therapy. This suggests that a combined tacrolimus/belatacept bridge may mitigate the higher acute rejection risk seen with CNI-free belatacept; however, this has not been shown in randomized clinical trials.

Stegall et al studied kidney allograft histology at 10 years post-transplantation in the era of tacrolimus use. Almost all kidney allografts examined had evidence of major histologic injury, the most frequent pathologies being arteriolar hyalinosis and glomerulosclerosis, both of which are associated with chronic CNI exposure as well as antibody-mediated rejection. Surprisingly, much of the damage appeared to be non-immunologic, suggesting that new approaches are needed because the graft remains vulnerable to chronic late insults that lead to deteriorating allograft function.

The TRANSFORM trial was designed to compare the safety and efficacy of an everolimus-based regimen that permits reduced CNI exposure. It showed that everolimus was non-inferior to the current standard of care in preventing acute rejection and preserving graft function in patients at mild-to-moderate immunologic risk. In addition, rates of CMV and BK virus infection were lower in the first year.

Although the 21st century has witnessed the introduction of several novel immunosuppressive drugs beyond CNIs, the use of these newer therapies remains limited and fraught with issues. We are still on the lookout for a breakthrough that will challenge the “current standard of care”. Recently, enthusiasm has grown for generating antigen-specific regulatory T cells (Tregs) by genetic engineering with chimeric antigen receptors (CARs), with the promise of “liberation from immunosuppression”. Over the next few years, we hope to see these new therapies tested in high-quality clinical trials and welcome discoveries that continue to make transplantation safer and better.

Reviewed by: Martha Pavlakis, MD, Larissa Krüger, MD, Dearbhla Kelly, MD, Matthew A. Sparks, MD.
