
The current and future use of imaging in urological robotic surgery: A survey of the European Association of Robotic Urological Surgeons

Archie Hughes-Hallett, MRCS1, Erik K Mayer, PhD1, Philip Pratt, PhD2, Alex Mottrie, PhD3,4, Ara Darzi, FRS1,2, Justin Vale, MS1

1. Department of Surgery and Cancer, Imperial College London
2. The Hamlyn Centre for Robotic Surgery, Imperial College London
3. Department of Urology, OLV Clinic, Aalst, Belgium
4. O.L.V. Vattikuti Robotic Surgery Institute, Aalst, Belgium

Corresponding Author

Erik Mayer,

Department of Surgery and Cancer, Imperial College London, St Mary's Hospital Campus, London, W2 1NY

07984195642

[email protected]

No reprints will be available from the authors

No financial support was received

Article Category: Original Article

Word count abstract: 244

Word count manuscript text: 2,142

5 figures and 2 tables


Introduction

Since Röntgen first utilised x-rays to image the carpal bones of the human hand in 1895, medical imaging has evolved and is now able to provide a detailed representation of a patient's intracorporeal anatomy, with recent advances allowing for 3-dimensional (3D) reconstructions. The visualisation of anatomy in 3D has been shown to improve the ability to localise structures when compared to 2D, with no change in the amount of cognitive loading [1]. This has allowed imaging to move from a largely diagnostic tool to one that can be used for both diagnosis and operative planning.

One potential interface for displaying 3D images, maximising their potential as a tool for surgical guidance, is to overlay them onto the endoscopic operative scene (augmented reality). This addresses, in part, a criticism often levelled at robotic surgery: the loss of haptic feedback. Augmented reality has the potential to mitigate this sensory loss by enhancing the surgeon's visual cues with information regarding subsurface anatomical relationships [2].

Augmented reality surgery is in its infancy for intra-abdominal procedures, due in large part to the difficulties of applying static preoperative imaging to a constantly deforming intraoperative scene [3]. There are case reports and ex-vivo studies in the literature examining the technology in minimal access prostatectomy [3–6] and partial nephrectomy [7–10], but there remains a lack of evidence determining whether surgeons feel there is a role for the technology and, if so, for which procedures they feel it would be efficacious.


This questionnaire-based study was designed to assess: firstly, the pre- and intraoperative imaging modalities utilised by robotic urologists; secondly, the current use of imaging intraoperatively for surgical planning; and finally, whether there is a desire for augmented reality amongst the robotic urological community.

Methods

Recruitment

A web-based survey instrument was designed and sent out, as part of a larger survey, to members of the EAU Robotic Urology Section (ERUS). Only independently practising robotic surgeons performing RALP, RAPN and/or robotic cystectomy were included in the analysis; those surgeons exclusively performing other procedures were excluded. Respondents were offered no incentives to reply. All data collected were anonymous.

Survey design and administration

The questionnaire was created using the LimeSurvey platform (www.limesurvey.com) and hosted on their website. All responses (both complete and incomplete) were included in the analysis. The questionnaire was dynamic, with the questions displayed tailored to the respondents' previous answers.

When computing fractions or percentages, the denominator was the number of respondents who answered that question; this number varies due to the dynamic nature of the questionnaire.

Survey Content


Demographics

All respondents to the survey were asked in which country they practised and which robotic urological procedures they performed; in addition, surgeons were asked to specify the number of cases they had undertaken for each procedure.

Current Imaging Practice

Procedure-specific questions in this group were displayed according to the operations the respondent performed. A summary of the questions can be seen in appendix 1. Procedure non-specific questions were also asked: participants were asked whether they routinely used the TilePro™ function of the da Vinci console (Intuitive Surgical, Sunnyvale, USA) and whether they routinely viewed imaging intraoperatively.

Augmented Reality

Prior to answering questions in this section, participants were invited to watch a video demonstrating an augmented reality platform during robot-assisted partial nephrectomy (RAPN), performed by our group at Imperial College London. A still from this video can be seen in figure 1. Participants were then asked whether they felt augmented reality would be of use as a navigation or training tool in robotic surgery. Once again, in this section, procedure-specific questions were displayed according to the operations the respondent performed. Only those respondents who felt augmented reality would be of use as a navigation tool were asked procedure-specific questions.


Questions were asked to establish where in these procedures they felt an augmented reality environment would be of use.

Results

Demographics

Of the 239 respondents completing the survey, 117 were independently practising robotic surgeons and were therefore eligible for analysis. The majority of the surgeons had both trained (210/239, 87.9%) and worked (215/239, 90.0%) in Europe. The median (range) number of cases undertaken by those surgeons reporting their case volume was 120 (6–2000) for RALP, 9 (1–120) for robot-assisted cystectomy and 30 (1–270) for RAPN.

Contemporary use of imaging in robotic surgery

When enquiring about the use of imaging for surgical planning, the majority of surgeons (57%, 65/115) routinely viewed preoperative imaging intraoperatively, but only 9% (13/137) routinely capitalised on the TilePro™ function in the console to display these images. Amongst surgeons who performed RAPN, 13.8% (9/65) reported using TilePro™ routinely.

When assessing the imaging modalities available to a surgeon in theatre, the majority of surgeons performing RALP (74%, 78/106) reported using MRI, with an additional 37% (39/106) reporting the use of CT for preoperative staging and/or planning. For surgeons performing RAPN and robot-assisted cystectomy there was more of a consensus, with 97% (68/70) and 95% (54/57) of surgeons, respectively, using CT for routine preoperative imaging (table 1).


Those surgeons performing RAPN were found to have the most diversity in the way they viewed preoperative images in theatre, routinely viewing images in sagittal, coronal and axial slices (table 2). The majority of these surgeons (54%, 38/70) also viewed the images as 3D reconstructions. The majority of surgeons performing RAPN (51%, 35/69) used ultrasound intraoperatively, with a further 25% (17/69) reporting they would use it if they had access to a 'drop-in' ultrasound probe (figure 3).

Desire for augmented reality

In all, 87% of respondents envisaged a role for augmented reality as a navigation tool in robotic surgery, and 82% (88/107) felt that there was an additional role for the technology as a training tool.

The greatest desire for augmented reality was amongst those surgeons performing RAPN, with 86% (54/63) feeling the technology would be of use. The largest group of surgeons felt it would be useful in identifying tumour location, with significant numbers also feeling it would be efficacious in tumour resection (figure 4).

When enquiring about the potential for augmented reality in Robot-Assisted Laparoscopic Prostatectomy (RALP), 79% (20/96) of respondents felt it would be of use during the procedure, with the largest group (65%, 62/96) feeling it would be helpful for nerve sparing (figure 2). The picture in cystectomy was similar, with 74% (37/50) of surgeons believing augmented reality would be of use, with both nerve


sparing and apical dissection highlighted as specific examples (40%, 20/50) (figure 5). The majority also felt that it would be useful for lymph node dissection in both RALP and robot-assisted cystectomy (55% (52/95) and 64% (32/50), respectively).

Discussion

The results from this study suggest that the contemporary robotic surgeon views imaging as an important adjunct to operative practice. The way these images are being viewed is changing; although the majority of surgeons continue to view images as two-dimensional (2D) slices, a significant minority have started to capitalise on 3D reconstructions to give them an improved appreciation of the patient's anatomy.

This study has highlighted surgeons' willingness to take the next step in the utilisation of imaging in operative planning, augmented reality, with 87% feeling it has a role to play in robotic surgery. Although there appears to be considerable desire for augmented reality, the technology itself is still in its infancy, with the limited evidence demonstrating clinical application reporting only qualitative results [3,11–13].

There are a number of significant issues that need to be overcome before augmented reality can be adopted in routine clinical practice. The first of these is registration (the process by which two images are positioned in the same coordinate system such that the locations of corresponding points align [14]). This process has been performed both manually and using automated algorithms, with varying degrees of accuracy [2,15]. The second issue pertains to the use of static preoperative imaging in a


dynamic operative environment; in order for the preoperative imaging to be accurately registered it must be deformable. This problem remains unresolved.

Live intraoperative imaging circumvents the problems of tissue deformation, and in RAPN 51% of surgeons reported already using intraoperative ultrasound to aid in tumour resection. Cheung and colleagues [9] have published an ex-vivo study highlighting the potential for intraoperative ultrasound in augmented reality partial nephrectomy. They report that overlaying ultrasound onto the operative scene improved the surgeon's appreciation of the subsurface tumour anatomy; this improvement in anatomical appreciation resulted in improved resection quality over conventional ultrasound-guided resection [9]. Building on this work, the first in-vivo use of overlaid ultrasound in RAPN has recently been reported [10]. Although good subjective feedback was received from the operating surgeon, the study was limited to a single case demonstrating feasibility and as such was not able to show an outcome benefit of the technology [10].

RAPN also appears to be the area in which augmented reality would be most readily adopted, with 86% of surgeons claiming they see a use for the technology during the procedure. Within this operation there are two obvious steps for augmentation: anatomical identification (in particular vessel identification, to facilitate both routine 'full clamping' and the identification of secondary and tertiary vessels for 'selective clamping' [16]) and tumour resection. These two phases have different requirements of an augmented reality platform; the first phase, identification, requires a gross overview of the anatomy without the need for high levels of registration accuracy. Tumour resection, however, necessitates almost sub-millimetre


accuracy in registration and needs the system to account for the dynamic intraoperative environment. The step of anatomical identification is amenable to the use of non-deformable 3D reconstructions of preoperative imaging, while that of image-guided tumour resection is perhaps better suited to augmentation with live imaging such as ultrasound [2,9,17].

For RALP and robot-assisted cystectomy, the steps in which surgeons felt augmented reality would be of assistance were neurovascular bundle preservation and apical dissection. The relative perceived efficacy of augmented reality in these steps correlates with previous examinations of augmented reality in RALP [18,19]. Although surgeon preference for utilising augmented reality while undertaking robotic prostatectomy has been demonstrated, Thompson et al. failed to demonstrate an improvement in oncological outcomes in those patients undergoing augmented reality RALP [19]. Both nerve sparing and apical dissection require a high level of registration accuracy, and either live imaging or the deformation of preoperative imaging to match the operative scene; achieving this level of registration accuracy is made more difficult by the mobilisation of the prostate gland during the operation [18]. These problems are equally applicable to robot-assisted cystectomy. Although guidance systems have been proposed in the literature for RALP [3,4,13,18,20], none have achieved the level of accuracy required to provide assistance during nerve sparing.

Additionally, there are still imaging challenges that need to be overcome. Although multiparametric MRI has been shown to improve decision making when opting for a nerve-sparing approach to RALP [21], the imaging is not yet able to reliably discern the exact location of the neurovascular bundle. This said, significant advances are


being made, with novel imaging modalities on the horizon that may allow for imaging of the neurovascular bundle in the near future [22].

Limitations

The number of operations included represents a significant limitation of the study; had different index procedures been chosen, different results may have been seen. This being said, the index procedures selected were chosen as they represent the vast majority of uro-oncological robotic surgical practice, largely mitigating this shortfall.

Although the available ex-vivo evidence suggests that introducing augmented reality operating environments into surgical practice would help to improve outcomes [9,23], the in-vivo experience to date is limited to small-volume case series reporting feasibility [2,3,15]. To date no study has demonstrated an in-vivo outcome advantage of augmented reality guidance. In addition to this limitation, augmented reality has been demonstrated to increase rates of inattentional blindness amongst surgeons, suggesting there is a trade-off between increasing visual information and the surgeon's ability to appreciate unexpected operative events [23].

Conclusions

This survey depicts the contemporary robotic surgeon as comfortable with the use of imaging to aid in intraoperative planning; furthermore, it highlights a significant interest amongst the urological community in augmented reality operating platforms.


Short- to medium-term development of augmented reality systems in robotic urological surgery would be best performed using RAPN as the index procedure. Not only was this the operation where surgeons saw the greatest potential benefits, but it may also be the operation where augmentation is most easily achievable, by capitalising on the respective benefits of technologies surgeons are already using: preoperative CT for anatomical identification and intraoperative ultrasound for tumour resection.

Conflicts of Interest

None of the authors have any conflicts of interest to declare.

References

1. Foo J-L, Martinez-Escobar M, Juhnke B, Cassidy K, Hisley K, Lobe T, Winer E. Evaluating mental workload of two-dimensional and three-dimensional visualization for anatomical structure localization. J Laparoendosc Adv Surg Tech A. 2013;23(1):65–70.

2. Hughes-Hallett A, Mayer EK, Marcus HJ, Cundy TP, Pratt PJ, Darzi AW, Vale J. Augmented Reality Partial Nephrectomy: Examining the Current Status and Future Perspectives. Urology. 2014;83(2):266–73.

3. Sridhar AN, Hughes-Hallett A, Mayer EK, Pratt PJ, Edwards PJ, Yang G-Z, Darzi AW, Vale J. Image-guided robotic interventions for prostate cancer. Nat Rev Urol. 2013;10(8):452–62.


4. Cohen D, Mayer E, Chen D, Anstee A, Vale J, Yang G-Z, Darzi A, Edwards P "Eddie". Augmented reality image guidance in minimally invasive prostatectomy. Lect Notes Comput Sci. 2010;6367:101–10.

5. Simpfendorfer T, Baumhauer M, Muller M, Gutt CN, Meinzer HP, Rassweiler JJ, Guven S, Teber D. Augmented reality visualization during laparoscopic radical prostatectomy. J Endourol. 2011;25(12):1841–5.

6. Teber D, Simpfendorfer T, Guven S, Baumhauer M, Gozen AS, Rassweiler J. In-vitro evaluation of a soft-tissue navigation system for laparoscopic prostatectomy. J Endourol. 2010;24(9):1487–91.

7. Teber D, Guven S, Simpfendörfer T, Baumhauer M, Güven EO, Yencilek F, Gözen AS, Rassweiler JJ. Augmented reality: a new tool to improve surgical accuracy during laparoscopic partial nephrectomy? Preliminary in vitro and in vivo results. Eur Urol. 2009;56(2):332–8.

8. Pratt P, Mayer E, Vale J, Cohen D, Edwards E, Darzi A, Yang G-Z. An effective visualisation and registration system for image-guided robotic partial nephrectomy. J Robot Surg. 2012;6(1):23–31.

9. Cheung CL, Wedlake C, Moore J, Pautler SE, Peters TM. Fused video and ultrasound images for minimally invasive partial nephrectomy: A phantom study. Med Image Comput Comput Assist Interv. 2010;13(Pt 3):408–15.


10. Hughes-Hallett A, Pratt P, Mayer E, Di Marco A, Yang G-Z, Vale J, Darzi A. Intraoperative Ultrasound Overlay in Robot-assisted Partial Nephrectomy: First Clinical Experience. Eur Urol. 2013;

11. Nakamura K, Naya Y, Zenbutsu S, Araki K, Cho S, Ohta S, Nihei N, Suzuki H, Ichikawa T, Igarashi T. Surgical navigation using three-dimensional computed tomography images fused intraoperatively with live video. J Endourol. 2010;24(4):521–4.

12. Teber D, Guven S, Simpfendorfer T, Baumhauer M, Guven EO, Yencilek F, Gozen AS, Rassweiler J. Augmented Reality: A New Tool To Improve Surgical Accuracy during Laparoscopic Partial Nephrectomy? Preliminary In Vitro and In Vivo Results. Eur Urol. 2009;56(2):332–8.

13. Ukimura O, Gill IS. Imaging-assisted endoscopic surgery: Cleveland Clinic experience. J Endourol. 2008;22(4):803–9.

14. Altamar HO, Ong RE, Glisson CL, Viprakasit DP, Miga MI, Herrell SD, Galloway RL. Kidney deformation and intraprocedural registration: A study of elements of image-guided kidney surgery. J Endourol. 2011;25(3):511–7.

15. Nicolau S, Soler L, Mutter D, Marescaux J. Augmented reality in laparoscopic surgical oncology. Surg Oncol. 2011;20(3):189–201.

16. Ukimura O, Nakamoto M, Gill IS. Three-dimensional reconstruction of renovascular-tumor anatomy to facilitate zero-ischemia partial nephrectomy. Eur Urol. 2012;61(1):211–7.


17. Pratt P, Hughes-Hallett A, Di Marco A, Cundy T, Mayer E, Vale J, Darzi A, Yang G-Z. Multimodal Reconstruction for Image-Guided Interventions. Hamlyn Symposium. 2013.

18. Mayer EK, Cohen D, Chen D, Anstee A, Vale JA, Yang GZ, Darzi AW, Edwards E. Augmented Reality Image Guidance in Minimally Invasive Prostatectomy. Eur Urol Suppl. 2011;10(2):300.

19. Thompson S, Penney G, Billia M, Challacombe B, Hawkes D, Dasgupta P. Design and evaluation of an image-guidance system for robot-assisted radical prostatectomy. BJU Int. 2013;111(7):1081–90.

20. Simpfendörfer T, Baumhauer M, Müller M, Gutt CN, Meinzer H-P, Rassweiler JJ, Guven S, Teber D. Augmented reality visualization during laparoscopic radical prostatectomy. J Endourol. 2011;25(12):1841–5.

21. Panebianco V, Salciccia S, Cattarino S, Minisola F, Gentilucci A, Alfarone A, Ricciuti GP, Marcantonio A, Lisi D, Gentile V, Passariello R, Sciarra A. Use of Multiparametric MR with Neurovascular Bundle Evaluation to Optimize the Oncological and Functional Management of Patients Considered for Nerve-Sparing Radical Prostatectomy. J Sex Med. 2012;9(8):2157–66.

22. Rai S, Srivastava A, Sooriakumaran P, Tewari A. Advances in imaging the neurovascular bundle. Curr Opin Urol. 2012;22(2):88–96.


23. Dixon BJ, Daly MJ, Chan H, Vescan AD, Witterick IJ, Irish JC. Surgeons blinded by enhanced navigation: the effect of augmented reality on attention. Surg Endosc. 2013;27(2):454–61.


Tables

                    CT          MRI         USS         None        Other
RALP (n=106)        39.8% (39)  73.5% (78)  2% (3)      15.1% (16)  8.4% (9)
RAPN (n=70)         97.1% (68)  42.9% (30)  17.1% (12)  0% (0)      2.9% (2)
Cystectomy (n=57)   94.7% (54)  26.3% (15)  1.8% (1)    1.8% (1)    5.3% (3)

Table 1 - Which preoperative imaging modalities do you use for diagnosis and surgical planning?


                    Axial       Coronal     Sagittal    3D recons.  Do not
                    slices (n)  slices (n)  slices (n)  (n)         view (n)
RALP (n=106)        49.1% (52)  44.3% (47)  31.1% (33)  9.4% (10)   31.1% (33)
RAPN (n=70)         68.6% (48)  74.3% (52)  60% (42)    54.3% (38)  0% (0)
Cystectomy (n=57)   70.2% (40)  52.6% (30)  50.9% (29)  21.1% (12)  8.8% (5)

Table 2 - How do you typically view preoperative imaging in the OR? 3D recons = three-dimensional reconstructions


Figure Legends

Figure 1 – A still taken from a video of an augmented reality robot-assisted partial nephrectomy. Here the tumour has been painted into the operative view, allowing the surgeon to appreciate the relationship of the tumour to the surface of the kidney.

Figure 2 – Chart demonstrating responses to the question: In robotic prostatectomy, which parts of the operation do you feel augmented reality image overlay would be of assistance?

Figure 3 – Chart demonstrating responses to the question: Do you use intraoperative ultrasound for robotic partial nephrectomy?

Figure 4 – Chart demonstrating responses to the question: In robotic partial nephrectomy, which parts of the operation do you feel augmented reality image overlay would be of assistance?

Figure 5 – Chart demonstrating responses to the question: In robotic cystectomy, which parts of the operation do you feel augmented reality overlay technology would be of assistance?
