Living donor liver transplantation: Open and laparoscopic

Published on 10/04/2015 by admin

Filed under Surgery

Last modified 10/04/2015


Chapter 98B Living donor liver transplantation

Open and laparoscopic

Overview

Organ availability continues to be a major issue in contemporary transplantation. Living-donor renal transplantation is accepted globally as an important alternative to cadaveric renal transplantation for patients with end-stage renal failure. In 2008, living donors were the source of 36% of the total kidney transplants in the United States (United Network for Organ Sharing [UNOS], 2008). This is largely because the kidneys are paired organs, laparoscopic donor nephrectomy is widespread, and donor morbidity and mortality are low (Kocak et al, 2006).

Despite advances in hepatobiliary surgery, hepatic resection remains a technically demanding surgical endeavor, with higher complication rates than kidney transplantation (Miller et al, 2004a; Pomfret, 2003). Liver transplantation, now standard treatment for many hepatic diseases (see Chapter 97A), has gone through the same evolution as renal transplantation with respect to live donation. Transplantologists embarked on living-donor liver transplantation (LDLT) in the late 1980s, principally as a means to combat pediatric waiting-list mortality (Broelsch et al, 1991; Raia et al, 1989). The inception and refinement of LDLT and split-liver transplantation (see Chapter 98C) have significantly reduced pediatric waiting-list mortality (Testa et al, 2001), thus paving the way for the development of LDLT in adults.

As the indications for liver transplantation have broadened, and cadaveric organ supply has remained largely static, the waiting-list mortality for adults has increased. Thus, in the mid-1990s, as in pediatric transplantation, adult LDLT was initiated in response. In Japan, where death defined by neurologic criteria is not accepted, LDLT began with left-lobe adult LDLT (Hashikura et al, 1994), leading to right-lobe LDLT (Yamaoka et al, 1994) with substantial acceptance and growth of this practice in Asia (Chen et al, 2003), followed by initiation and refinement in the United States in the late 1990s (Boillot et al, 1999; Marcos et al, 1999; Miller et al, 2001; Wachs et al, 1998). Despite the inherent donor risks—and rare, but high-impact, donor deaths (Miller et al, 2004a; Fan et al, 2003)—right-lobe LDLT has remained a strong option in the treatment of liver disease. To that end, as of 2008, LDLT has made up 4% to 9% of all adult liver transplantations performed in the United States (UNOS, 2008). Currently, donor safety remains the primary focus of LDLT, superseding the omnipresent clinical incentive: the static cadaveric donor pool and transplant waiting-list mortality (Salame et al, 2002; Surman, 2002).

In 2002, the National Institute of Diabetes and Digestive and Kidney Diseases (NIDDK) established a multicenter clinical study, the Adult-to-Adult Living Donor Liver Transplantation (A2ALL) cohort study, which consists of nine transplant centers experienced in performing LDLT and a data coordinating center responsible for directing and maintaining a clinical database for the retrospective (1998 through 2003) and prospective (2004 through 2010) arms of the study. This study, in collaboration with the American Society of Transplant Surgeons (ASTS) and the Health Resources and Services Administration (HRSA), has two principal objectives: 1) to determine whether it is more beneficial for a liver transplant recipient candidate to pursue LDLT or wait for a deceased-donor liver transplant (DDLT), and 2) to study the impact of liver donation on the donor’s health and quality of life. Secondary objectives address selected biologic and clinical issues in transplantation structured around the comparison between DDLT and LDLT, including immunosuppression, malignancy, and hepatitis. Although the prospective arm of this valuable study is ongoing at the time of this publication, its contributions thus far have given considerable insight into LDLT, and these insights are referenced in the following sections of this chapter.

Congruent with the tenet of donor safety, reducing donor morbidity is of equal importance. The great advances that have been made in minimally invasive liver resection have now been applied to liver donation in hopes of reducing surgical morbidity and enhancing donor recovery. Similar to the evolution of recipient procedures, minimally invasive liver donation began in the pediatric setting, with laparoscopic harvest of the left lateral section (Cherqui et al, 2002). As centers gained experience in minimally invasive hemihepatectomy (Koffron et al, 2007a; O’Rourke & Fielding, 2004), these methods were carefully applied to right-lobe donation (Koffron et al, 2006) in the hope of reducing donor morbidity, enhancing recovery, and secondarily increasing the willingness to consider donation (Baker et al, 2009). These efforts show promise in their initial results, but greater experience and analysis of clinical outcomes is necessary to define the role of laparoscopy in liver donation and transplantation as a whole.

Living Donor Liver Transplantation: Indications and Results

Many of the indications for LDLT are the same as those for liver transplantation in general (see Chapters 97A, 97C, 97D, and 97E). Given this, the reasons to consider LDLT as opposed to DDLT center largely on chronology with regard to hepatitis C virus (HCV), hepatocellular carcinoma (HCC), and the likelihood that a cadaveric donor organ would become available in a timely fashion. These reasons are always weighed against the tenet that it is unjustified to subject even a volunteer to the risks inherent in donor hepatectomy if the recipient has a reasonable chance of receiving a cadaveric liver graft prior to medical decompensation or progression.

Pediatric Living Donor Liver Transplantation

In pediatric LDLT (see Chapter 98C), although results across all indications are equivalent to those with whole-liver grafts, cadaveric graft allocation remains the major concern in the setting of acute hepatic failure. The combination of outcome and availability has made acute liver failure (ALF) a well-accepted indication for pediatric LDLT. For other indications for liver transplantation, split-liver transplantation—in which a cadaveric donor organ is divided to transplant two recipients—is an attractive and increasingly popular alternative to LDLT, and it avoids placing a healthy donor in harm’s way.

Adult Living Donor Liver Transplantation

Experience has shown that although retransplantation rates are higher after LDLT, patient survival is similar to that after DDLT (Abt et al, 2004; Freise et al, 2008). There are many reasons for this phenomenon, many of which relate to implantation of a smaller allograft. It seems intuitive that the more advanced the liver disease (hepatic decompensation, severe portal hypertension), the more ill the patient, and consequently the greater the need for hepatocellular mass. In this light, many centers would not perform LDLT on patients with Child-Turcotte-Pugh (CTP) class C cirrhosis unless the donor can provide a graft of substantial size. Fortunately, in most circumstances, the current U.S. organ allocation system can accommodate such gravely ill patients with DDLT.

Adult LDLT in the treatment of fulminant hepatic failure (FHF) has a unique geographic paradigm, largely a result of differences in organ donation. In Western countries, LDLT for acute hepatic failure has acceptable results (Sugawara et al, 2002a, 2003), but the organ allocation scheme in the United States places patients with ALF at the highest allocation priority, so a timely DDLT is likely to be available. Therefore, in the United States the concern is raised that LDLT for ALF may place live donors at unnecessary risk in a clinical situation in which a potentially clinically superior DDLT may be possible. The A2ALL retrospective study reached a similar conclusion (Campsen et al, 2008). In the nine U.S. transplant centers, LDLT was rarely performed for ALF but was associated with acceptable recipient mortality and donor morbidity; however, the authors raised the concern that a partial graft might result in reduced survival of critically ill recipients and that the rapid course of ALF could lead to selection of inappropriate donors.

This contrasts sharply with the role of LDLT for ALF in Asia, where cadaveric donation is scarce and LDLT accounts for more than 90% of liver transplantations (de Villa & Lo, 2007; Kobayashi et al, 2003; Lubezky et al, 2004). FHF was the indication for transplantation in 5.7% of the adult LDLT series reported by the Asian group (Lee et al, 2007), in 12% of the series reported by the Hong Kong group (Lo et al, 2004), and in 14.6% of the series reported by the Kyoto group (Morioka et al, 2007). The consensus in the East is reflected in the findings of a large study on this subject (Ikegami et al, 2008), which reviewed 42 LDLTs performed for ALF over 10 years and concluded that outcomes were acceptable even in severely ill recipients and that LDLT is an accepted treatment of choice for ALF.

Investigation of LDLT has provided clarity in other aspects of this form of liver therapy. Interestingly, results of the A2ALL retrospective cohort study did not show an immunologic advantage for LDLT versus DDLT (Shaked et al, 2009). Longer cold ischemia time was associated with a higher rate of acute cellular rejection in both groups despite much shorter median cold ischemia time in LDLT.

In terms of general recipient complications, the A2ALL study has provided additional data regarding LDLT. Complication rates were higher after LDLT (median of 3 complications per recipient) than after DDLT (median of 2) and included biliary leak (32% vs. 10%), unplanned reexploration (26% vs. 17%), hepatic artery thrombosis (6.5% vs. 2.3%), portal vein thrombosis (2.9% vs. 0.0%), and complications leading to retransplantation or death (15.9% vs. 9.3%; P < .05) (Freise et al, 2008). Most notably, this analysis demonstrated that although complication rates were initially higher with LDLT, with increasing center experience they declined to levels comparable to DDLT.

Hepatitis C Virus

Considerable reservations surround LDLT for patients with HCV (see Chapters 64 and 97A). If a patient has detectable HCV RNA before transplantation, the concern is that recurrence after LDLT may rapidly progress to cirrhosis and graft loss. In the first half of the decade, many centers reported that the course of recurrent HCV after LDLT was worse than in patients who received DDLT (Bozorgzadeh et al, 2004; Gaglio et al, 2003; Garcia-Retortillo et al, 2004; Shiffman et al, 2004; Zimmerman & Trotter, 2003), causing some centers to avoid performing LDLT for this indication (Garcia-Retortillo et al, 2004).

In addition, the degree of transplant center experience may influence the outcome after LDLT for HCV. The A2ALL retrospective study found that graft and patient survival were significantly lower for LDLT performed at centers with limited experience (<20 cases), whereas at more experienced centers, 3-year graft and patient survival after LDLT and DDLT were not significantly different (Terrault et al, 2007). This may be a significant factor, as more recent reports are encouraging for LDLT in the treatment of HCV.

Although the hepatocellular mechanisms remain unknown, Schmeding and colleagues (2007) studied 289 patients and found that the intensity of HCV recurrence was not increased in living-donor graft recipients compared with recipients of full-size grafts. These findings have been reported by other groups (Takada et al, 2006). Furthermore, there is encouraging evidence that when HCV recurrence occurs, combination therapy with ribavirin and interferon appears to improve the outcome of recurrent HCV-infected patients after LDLT (Park et al, 2007).

One specific advantage of LDLT in the treatment of HCV is the ability to optimize transplantation timing. It has been reported that patients who clear HCV RNA with interferon/ribavirin have a high likelihood of remaining HCV RNA-negative after transplantation (Berenguer & Wright, 2003). The availability of a living donor allows for treatment of HCV to proceed to transplantation shortly after HCV RNA clearance has been achieved. Unfortunately, pretransplantation treatment used to prevent progression of disease or to minimize recurrence posttransplantation has highly variable effectiveness, tolerability, and outcome.

In attempts to allow greater patient tolerance and thereby achieve HCV RNA clearance prior to transplantation, the so-called low accelerating dose regimen (LADR) has been studied. Using LADR, Everson and colleagues (2005) studied 124 patients; although 63% experienced treatment-related complications, 46% of patients were HCV RNA negative at the end of treatment, and 24% remained negative in follow-up. More significantly, 12 (80%) of 15 patients who were RNA negative before transplantation remained negative for 6 months after transplantation. One of the secondary aims of the A2ALL cohort prospective study is to evaluate LADR in combination with LDLT for reducing posttransplantation HCV recurrence.

Primary Hepatic Malignancy

Liver transplantation is now considered a treatment option for unresectable primary hepatic malignancy. The guidelines for LDLT in the treatment of malignancy largely parallel DDLT for both HCC and cholangiocarcinoma as discussed in Chapters 97D and 97E, respectively. But because of constraints in organ supply and allocation, as well as in the timing of transplantation, LDLT offers several strategic options not available in DDLT.

The growth of LDLT has been fostered in part by the inherent ability to treat hepatic malignancy with the chronologic advantage that scheduled LDLT provides. This is particularly evident in the treatment of HCC. In cirrhotic patients with early stage, unresectable HCC, transplantation is the favored treatment. Under the constraints of the Milan criteria (see Chapter 97D), posttransplantation survival rates equal those in patients without HCC. Patients with HCC within the Milan criteria are given added priority in the organ-allocation system. This systematic design, combined with therapies to delay tumor progression, such as ablation and arterial-based therapy, results in generally accepted outcomes. In spite of this validated process, some patients may incur prolonged waiting times, which may allow for tumor progression (Yao et al, 2003), excluding them as transplant candidates under the established guidelines.

In this circumstance, LDLT allows patients with HCC to proceed more rapidly to transplantation, potentially reducing the chance of tumor progression and/or posttransplantation recurrence. This has logically led many centers to offer LDLT for HCC; and similar to DDLT, the efficiency and efficacy of LDLT for HCC is widely reported (Gondolesi et al, 2004c; Kaihara et al, 2003; Todo & Furukawa, 2004). However, as LDLT is increasingly utilized to treat patients with locally advanced HCC not prioritized under the organ-allocation system, patients with large tumors logically experience a higher rate of recurrence (Axelrod et al, 2005).

In the era of limited organ availability, it is necessary to impose guidelines such as the Milan criteria to reduce posttransplantation cancer recurrence and to use resources wisely. This is the setting in which LDLT can provide a therapeutic alternative. In carefully selected patients, those who have been studied to determine that estimated risk of HCC recurrence is low enough to justify placing a healthy donor at risk, LDLT may provide benefit without taxing the organ pool. The acceptable success rates and controversial expanded HCC criteria designed to select such patients have been proposed by numerous centers (Bruix & Llovet, 2002; Gondolesi et al, 2004c; Lang et al, 2002).

Despite the sincere desire, and obvious ability, to treat patients with HCC, caution and practicality should supersede. LDLT may limit the waiting time and, as a result, may decrease the progression of disease so that the recurrence rate, intuitively, should be lower than for recipients awaiting DDLT; however, the reverse may also occur: rapid transplantation may preclude waiting-list dropout, leading to higher recurrence posttransplantation. One example of this phenomenon was reported by the Northwestern group (Kulik & Abecassis, 2004), who reviewed the institutional experience in LDLT for HCC and found a higher recurrence rate, stage for stage, in recipients whose transplantations were accelerated (“fast tracked”) by performing LDLT, especially in the prior era, in which patients with HCC were disadvantaged by the allocation algorithm. Clearly, the role of LDLT in management of patients with HCC requires prospective direct analysis of both recurrence and dropout rates in comparable patient cohorts with HCC undergoing either deceased- or living-donor liver transplantation.

Donor Evaluation

The appropriate liver donor for LDLT is a legal adult who is healthy, without liver disease, not significantly overweight, has a blood type compatible with the recipient, and is able to provide a graft of adequate size. Donor evaluation is carried out by a physician who serves as the donor’s advocate physician (New York State Committee on Quality Improvement in Living Liver Donation, 2002). The comprehensive medical evaluation includes 1) a complete history and physical examination; 2) blood analysis to exclude viral and autoimmune liver disease, diabetes, hyperlipidemia, and hypercoagulable states; 3) cardiology clearance; and 4) extensive psychosocial evaluation, with psychiatry consultations in cases of even minor concern.

As in resection for other indications, patient (potential donor) obesity increases perioperative risk and the degree of hepatic steatosis. Magnetic resonance imaging (MRI) can predict the presence of significant steatosis (>15%) and simplify donor evaluation in overweight candidates (Rinella et al, 2003). Routine preoperative biopsy is not widely practiced (Ryan et al, 2002), but in many centers, potential donors who are significantly overweight (Schiano et al, 2001) or who have MRI findings suggestive of steatosis undergo liver biopsy to fully assess both donor and recipient short- and long-term risk (Hwang et al, 2004).

Our understanding of steatosis and its effect in liver transplantation has improved (Soejima et al, 2003b), and guidelines for DDLT have been established (Fishbein et al, 1997; Uchino et al, 2004). In LDLT, where liver mass is limited, hepatic steatosis must be considered in the calculation of true hepatocellular mass to ensure adequate engraftment (Limanond et al, 2004).
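To make the idea of discounting steatotic parenchyma concrete, the sketch below treats fat-laden tissue as noncontributing mass. This linear discount is an illustrative assumption only, not a validated clinical formula; in practice, centers rely on biopsy grading and center-specific cutoffs.

```python
def effective_graft_mass(graft_mass_g: float, steatosis_fraction: float) -> float:
    """Estimate functional hepatocellular mass of a steatotic graft.

    Simplified linear model (illustrative assumption): steatotic
    parenchyma is treated as contributing no functional mass.
    """
    if not 0.0 <= steatosis_fraction < 1.0:
        raise ValueError("steatosis_fraction must be in [0, 1)")
    return graft_mass_g * (1.0 - steatosis_fraction)

# An 800-g graft with 15% steatosis contributes roughly 680 g of
# functional hepatocellular mass under this model.
print(effective_graft_mass(800, 0.15))
```

Under this model, a graft that appears adequate by raw volume may still fall short in functional mass, which is why steatosis enters the engraftment calculation.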

Estimation of graft volume (GV) is critical in LDLT to optimize donor safety and recipient outcome. Liver volume can be estimated by calculation or imaging, but no consensus exists on the most clinically effective method. To provide clarity, one study examined the accuracy of formula-derived GV estimates and compared them with both radiologically derived estimates and actual measurements (Salvalaggio et al, 2005). This study found a marginal concordance between the formula-derived calculation and GV for right-lobe donors, but the error ratio was lower than for radiologic estimates; in contrast, MRI measurements for left lateral section grafts demonstrated a lower error ratio than formula-derived estimation. The authors therefore concluded that formula-derived estimates of GV should be routinely used in the initial screening of potential living donors.
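As an illustration of what a formula-derived estimate looks like, the sketch below uses the published Urata formula (total liver volume in mL derived from DuBois body surface area). This is one example among several published formulas, offered for illustration rather than as the method any particular center should use.

```python
def body_surface_area(height_cm: float, weight_kg: float) -> float:
    """DuBois and DuBois body surface area in square meters."""
    return 0.007184 * (height_cm ** 0.725) * (weight_kg ** 0.425)

def estimated_liver_volume_ml(height_cm: float, weight_kg: float) -> float:
    """Urata formula: total liver volume (mL) = 706.2 x BSA + 2.4."""
    return 706.2 * body_surface_area(height_cm, weight_kg) + 2.4

# A 170-cm, 70-kg donor: BSA is about 1.81 m^2, giving an estimated
# total liver volume of roughly 1280 mL.
print(round(estimated_liver_volume_ml(170, 70)))
```

Such formulas are used for initial screening; imaging then confirms the volume of the proposed graft and remnant in candidates who proceed.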

Following successful medical clearance, liver vasculobiliary anatomy and graft-remnant liver volumes are assessed radiologically. MRI that includes arteriography (MRA), venography (MRV), and cholangiography (MRC) is frequently chosen, as it simultaneously defines graft-remnant liver volume, vascular and biliary anatomy, and overall hepatic integrity. MRC with mangafodipir trisodium may be used to enhance visualization of small-order biliary structures (Fig. 98B.1; Cheng et al, 2001; Goldman et al, 2003; Yeh et al, 2004).

Advances in both imaging hardware and software allow greater resolution and computer-assisted surgical simulation, aiming to improve accuracy and both donor and recipient outcome. It is logical that accurate assessment of liver graft anatomy is also pivotal in the recipient procedure, allowing management of the multitude of possible anatomic variations.

Anatomic Variations (See Chapter 1B)

The general segmental anatomy of the liver provides reliable external landmarks for living donation. Using the main hepatic scissura, anatomic resection of the left lateral section and hemilivers is commonplace for a wide variety of indications (see Chapters 90A and 92); however, LDLT relies not only on successful resection but also on the need to maintain intrahepatic and extrahepatic vasculobiliary structures, first, for donor remnant liver viability, and second, to facilitate reconstruction and implantation in the recipient. Therefore, to be successful, identification and management of the hepatic anatomy is crucial throughout the donor and recipient procedures. The following sections focus on the anatomic variations influential in LDLT.

Hepatic Artery

Liver transplantation has produced a great understanding of, and comfort in managing, arterial variations. It is imperative to delineate the arterial anatomy precisely, because “normal” anatomy is present in just over half of the population (Gruttadauria et al, 2001; Hardy & Jones, 1994; Hiatt et al, 1994; Kawarada et al, 2000). The described arterial anatomic variations and their frequency are depicted in Figure 98B.2 (Varotti et al, 2004).

It is intuitive that arterial anatomy would have considerable impact on both donor candidacy and recipient surgical management. Experience has shown that a totally replaced right or left hepatic artery simplifies, rather than complicates, donor hepatectomy; in such situations, greater vessel length may be obtained. Rarely, imaging identifies two arteries supplying the graft. Although back-table reconstruction is possible (Marcos et al, 2001a, 2003), it may be prudent to evaluate other potential donors, if the surgical team has concerns about the recipient procedure.

Portal Vein

Clinically significant portal variations are less common than arterial variations (Nakamura et al, 2002) but have equal technical implications in both the donor and recipient procedures. These variations involve the configuration of the right portal vein bifurcation into sectorial branches. Instead of a portal bifurcation and a left and right common portal trunk, the right anterior and posterior sectorial branches may arise separately, effectively forming a portal trifurcation (Fig. 98B.3). When these sectorial branches arise immediately adjacent to each other, it is sometimes possible to leave a common wall between them to allow for a single anastomosis in the recipient; however, sectorial branches that are significantly separated must be taken individually to avoid altering the portal flow to the remnant left lobe. Logically, to use such a lobe for LDLT, an extensive reconstruction using a bifurcated interposition graft, such as a recipient portal vein bifurcation, is required (see Fig. 98B.3; Varotti et al, 2004). For this reason, many centers may choose to find a donor with less complex anatomy.

Biliary Anatomy

Biliary anatomic variations (Fig. 98B.4) tend to mimic those of the associated vascular structures, especially the portal vein (Lee et al, 2004), but they may have greater surgical impact, particularly in right-lobe LDLT.

The standard anatomy consists of a confluence of left and right hepatic ducts to form the common hepatic duct. Usually, the segment II and III ducts join in the umbilical fissure, and dividing the hilar plate in this area provides a single duct for anastomosis (Renz et al, 2000). During left-lobe donation, care must be taken to avoid injury to a right posterior sectorial duct that is found to cross the Cantlie line, joining the left hepatic duct at the base of segment IV (see Fig. 98B.4).

Right-lobe biliary anatomy may be quite complex. Only half of donors have a single right hepatic duct (Kawarada et al, 2000), and even a single duct may be short, requiring proximal division in the interest of donor safety. Logically, the incidence of biliary complications increases with more complex anatomy and anastomoses (Gondolesi et al, 2004a). The most difficult variation is perhaps a right posterior sectorial duct draining into the left hepatic duct, which may be inadvertently transected, as it lies cephalad and posterior to the right portal vein. Because of its location, this bile duct branch is particularly difficult to reconstruct once the portal vein is anastomosed in the recipient.

Hepatic Veins

The variations in hepatic venous anatomy typically do not pose significant difficulty for left lobe or left lateral section donation. A common venous trunk of the middle and left hepatic veins is the usual configuration, facilitating a single venous outflow anastomosis in a left-lobe graft. Left lateral section grafts may be more complex. The confluence of the segment II and III veins may be at or near the junction with the middle hepatic vein. Safe donation may render two veins with a common septum, or it may necessitate performance of a side-to-side venoplasty to simplify implantation.

The venous drainage of the right lobe is much more complex. Typically, the right hepatic vein represents the main venous outflow of the right hemiliver, although variations occur that may drain individual liver segments (Fig. 98B.5). For this reason, in many centers, surgical decision making has evolved from considering the right liver as a whole to considering segmental venous drainage.

Although still somewhat controversial, it is generally agreed that venous structures 5 to 10 mm in diameter constitute significant drainage of the graft; therefore identifying and characterizing these structures has significance in LDLT. As illustrated in Figure 98B.5, imaging frequently identifies significant veins draining the anterior and posterior sectors. Those veins draining the posterior sector directly into the cava (e.g., segment VI inferior vein) may be preserved and reimplanted (Hwang et al, 2004). Anterior sector (segment V and VIII) veins drain into the middle hepatic vein. These may require preservation and later reconstruction in cases of marginal graft-recipient weight ratio (GRWR) and/or significant recipient portal hypertension (Sugawara & Makuuchi, 2001).

Graft Size and Small-for-Size Syndrome

We are only beginning to understand the unique ability of the liver to regenerate after injury or resection (see Chapter 5). Since the genesis of LDLT, however, one persistent problem has been posttransplantation hepatic insufficiency as a result of small graft mass (Lo et al, 1999). This underscores the importance of accurate estimation in the donor, especially as there is significant variation in total liver volume relative to body size and in the configuration of the liver (Gondolesi et al, 2004b).

Predonation imaging can accurately estimate the total donor liver volume and the volume of the proposed graft. Formulas to estimate expected liver volume have also been developed (Yoshizumi et al, 2003) and are slightly superior to imaging in donor screening (Salvalaggio et al, 2005). Many centers use a combination of techniques to screen and then confirm the graft weight, which is then compared with the weight of the potential recipient. The ratio of the estimated weight of the donor liver graft to the weight of the recipient, expressed as a percentage, is the GRWR.

It is well established that when the GRWR is less than 0.8%, there is significant risk of the recipient developing small-for-size syndrome (SFSS) (Sugawara et al, 2001). SFSS is posttransplantation hepatic insufficiency presenting as prolonged cholestasis, coagulopathy, and ascites formation in the absence of hepatic vascular insufficiency (see Chapter 100). Because SFSS results in death or the need for retransplantation in half of affected patients, the right lobe has over time become the preferred graft in adult LDLT. The exception is when a very large person donates to a smaller recipient, in which case a left-lobe graft may be of adequate mass.
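The GRWR arithmetic and the 0.8% threshold can be sketched as follows. This is a minimal illustration of the definition given above; the fixed cutoff is a simplification, since in practice the acceptable GRWR also depends on recipient condition.

```python
def grwr(graft_weight_g: float, recipient_weight_kg: float) -> float:
    """Graft-recipient weight ratio: graft weight over recipient body
    weight, expressed as a percentage."""
    return graft_weight_g / (recipient_weight_kg * 1000.0) * 100.0

def small_for_size_risk(graft_weight_g: float, recipient_weight_kg: float) -> bool:
    """Flag grafts below the widely cited 0.8% GRWR threshold
    (a simplified screening check, not a clinical decision rule)."""
    return grwr(graft_weight_g, recipient_weight_kg) < 0.8

# An 800-g right-lobe graft into a 70-kg recipient: GRWR is about 1.14%,
# comfortably above the threshold.
print(round(grwr(800, 70), 2))       # 1.14
print(small_for_size_risk(500, 75))  # True (GRWR is about 0.67%)
```

For example, a 500-g left-lobe graft in a 75-kg recipient falls below 0.8% and would prompt consideration of a larger graft.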

Additional factors contribute to recipient SFSS, including advanced cirrhosis, portal hypertension, and associated hyperdynamic splanchnic circulation. It is known that recipients of this type require a GRWR greater than 0.8% (Ben-Haim et al, 2001). The splanchnic hemodynamic mechanisms responsible for SFSS are poorly understood (Asakura et al, 1998; Gondolesi et al, 2002b; Huang et al, 2000; Niemann et al, 2002; Piscaglia et al, 1999). In attempts to reduce the hyperdynamic portal circulation and abrogate graft injury, some centers have studied adjunctive splenectomy, portosystemic shunt, or octreotide infusion with mixed results (Masetti et al, 2004; Troisi & de Hemptinne, 2003).

Although certainly an important factor for liver graft function in general, optimizing liver graft outflow is a possible way to reduce hyperdynamic liver injury. In right-lobe grafts, the outflow of the anterior sector to the middle hepatic vein, via segment V and VIII veins, is variable. In patients in whom the GRWR is low, surgical reconstruction of these middle hepatic vein tributaries may reduce hepatic congestion and potentially prevent the development of SFSS (Sugawara et al, 2004). Venous reconstruction may take the form of including the main trunk of the middle hepatic vein with the graft, or interposition grafting can be used to recreate the intrahepatic middle hepatic vein.

Another proposed method of avoiding SFSS is increasing liver mass by using dual liver grafts. One center reported its preliminary experience using dual grafts from one right lobe without the middle hepatic vein and one left lateral segment in adult-to-adult LDLT (Chen et al, 2009).
