
IEEE JOURNAL OF BIOMEDICAL AND HEALTH INFORMATICS 1

Abstract

Objective: To provide a proof-of-concept tool for segmenting chronic wounds and transmitting the results as instructions and coordinates to a bioprinter robot, thus facilitating the treatment of chronic wounds.

Methods: Several segmentation methods used for measuring wound geometry, including edge detection and morphological operations, region growing, Livewire, active contours, and texture segmentation, were compared on 26 images from 15 subjects. Ground-truth wound delineations were generated by a dermatologist. The wound coordinates were converted into G-code understandable by the bioprinting robot. Owing to its desirable properties, alginate hydrogel, synthesized by dissolving 16% (w/v) sodium alginate and 4% (w/v) gelatin in deionized water, was used for cell encapsulation.

Results: Livewire achieved the best performance with minimal user interaction: mean values of 97.08%, 99.68%, 96.67%, 96.22, 98.15, and 32.26 for accuracy, sensitivity, specificity, Jaccard index, Dice similarity coefficient, and Hausdorff distance, respectively. The bioprinter robot was able to print skin cells on the surface of the skin with 95.56% similarity between the bioprinted patch's dimensions and the desired wound geometry.

Conclusion: We have designed a novel approach for the healing of chronic wounds, based on semi-automatic segmentation of wound images, improving clinicians' control of the bioprinting process through more accurate coordinates.

Significance: This study is the first to perform wound bioprinting based on image segmentation. It also compares several segmentation methods used for this purpose to determine the best one.

Index Terms—bioprinting, chronic wound, image segmentation, alginate gel, bio-ink.

I. INTRODUCTION

Chronic wounds, with an approximate annual care cost of $50 billion and a 2% prevalence in the general population of the United States, are a major challenge for today's skin pathology and dermatological science [1]. Many treatment approaches have been implemented, among which traditional tissue-engineering-based treatments are

P. Gholami is with the School of Optometry and Vision Science and the Department of Systems Design Engineering, University of Waterloo, ON, Canada (e-mail: [email protected]).

M. A. Ahmadi-Pajouh, N. Abolfathi, and M. Kayvanrad are with the Biomedical Engineering Department, Amirkabir University of Technology, Tehran, Iran (e-mail: [email protected]; [email protected]; [email protected]).

G. Hamarneh is with the School of Computing Science, Simon Fraser University, Burnaby, Canada (e-mail: [email protected]).

increasingly common [2]. Such treatments usually involve the fabrication of 3D interconnected-pore networks, called scaffolds, which maintain the shape and mechanical properties of the desired structures for cell attachment and supply the cell-proliferation substrates needed for the regeneration of 3D tissues. However, scaffold-based tissue engineering, regardless of its benefits and various applications, can be plagued by limitations, e.g., a lack of precision in cell placement, limited cell density, the need for organic solvents, insufficient interconnectivity, and the inability to control pore distribution and dimensions [2]. These limitations have motivated the development of novel methods to shorten wound healing time. One recent approach treats chronic wounds using bioprinting based on desktop printers, which precisely place manufactured skin structures, consisting of different kinds of cells or biomaterials, onto wounds in a controlled fashion.

To implement wound bioprinting, an accurate measurement of wound surface specifications is required. While contact measurement methods, e.g., ruler, wound tracing, and planimetry, are still widely used [3], they are generally slow, inaccurate, and often problematic [4]. These limitations, along with the need for high-speed methods of measuring soft materials such as human body tissues, including the skin, have led to the emergence of fast non-contact measurement methods.

Different non-contact wound and burn measurement methods, e.g., ultrasonic, optical, and image-processing-based, are under development [5]. However, most of those currently in use require complicated hardware, high processing capabilities, and complex arrangements, leading to high costs and low adjustability. Hence, image processing methods, which require only a simple digital camera, are the most suitable and inexpensive option for many non-contact measurements. Fig. 1 shows typical contact and non-contact measurement methods. Our goal is to develop a low-cost, adjustable, and easy-to-use system for non-contact measurement of wound specifications on the skin surface and for printing the obtained geometries in an efficient way.

Peyman Gholami, Mohammad Ali Ahmadi-Pajouh, Nabiollah Abolfathi, Ghassan Hamarneh, Senior Member, IEEE, and Mohammad Kayvanrad

Segmentation and Measurement of Chronic Wounds for Bioprinting



Fig. 1. Measuring wounds using (a) a contact and (b) a non-contact method.


II. RELATED WORK

Several studies have addressed wound segmentation (WS) and wound measurement (WM) [6-21] using different image processing methods. Table I summarizes the most significant works in WS [6]. There are three main types of wound tissue: granulation, slough, and necrosis, which are characterized in some papers [6,14,19,20]. Other works, on skin melanoma segmentation and cancer detection, are not included in this table. Most studies were performed under controlled photography conditions, such as special backgrounds, or focused only on specific wound regions or types, e.g., leg ulcers. None of them addresses wound geometry, which directly affects the bioprinting process, because more complex wounds with detailed edges require more accurate segmentation methods. Moreover, the applications of most existing works are confined to a few limited cases, such as tracking the healing process, and are insufficiently accurate and not applicable to bioprinting.

For moving and dispensing skin cells to their correct position, the bioprinter robot needs a series of commands that must be obtained from the wound geometry; i.e., firstly, the wound’s surface information (x and y) has to be extracted by segmentation; then, a 3D geometry is constructed using the depth information. Afterwards, the robot moves and dispenses cells on the 3D geometry guided by a series of commands.

To make printed patches biocompatible and usable for the human body, we need to carry skin cells inside a biocompatible platform and place them on the surface of the wound. Thus, selecting and applying an appropriate biocompatible material for carrying and encapsulating cells is indispensable. Various methods exist, such as including cells in the bioprinter's gel-like ink to form a bio-ink. Hydrogels, which are applicable in bioprinting, can be classified into several groups according to their crosslinking methods. The process of crosslinking may be initiated by external triggers such as light [22], temperature [23], changes in environmental pH [24], the presence of enzymes [25,26], or ionic cross-linkers [27]. Several review papers focusing on hydrogels in bioprinting provide readers with a rich source of further information in this regard [28-30].

Alginate is a polysaccharide derived from brown seaweeds. Alginate hydrogel has many attractive properties, such as biocompatibility, low cost, and an easy gelation process, making it a commonly used biopolymer for encapsulating cells. Due to its relatively high stiffness, any 3D scaffold fabricated with it has good handling ability [31]. Despite the advantages of alginate hydrogel, its lack of cell binding sites causes weak interaction between the cells and the matrix, which eventually leads to limited cell proliferation [32].

The research questions for this study were: Is it possible to obtain the 3D geometry needed for 3D bioprinting of wounds using current image segmentation methods, and if so, which one is the most robust and accurate? The contribution of this work is to use the calculated geometry for printing skin cells in a biocompatible platform that aids wound healing.

III. MATERIALS AND METHODS

The methodology is divided into two parts: image segmentation and bioprinting. For the former, the user captures images of a wound and uploads them into the developed application. Using the provided graphical user interface, users can choose the most appropriate

TABLE I
A REVIEW OF CURRENT STUDIES ON WOUND SEGMENTATION

Papers | Application | Segmentation Methodology | No. of Images | Imaging Conditions | Segmentation Results
Fauzi et al., 2015 [6] | WS, WM, and characterization | RYKW map, region growing, and optimal thresholding | 80 | With background | 75.1% accuracy
Wang et al., 2015 [7] | WS and healing status | Mean shift (boundaries), K-means (color segmentation) | 30 moulage, 34 real patients | With the assistance of an image capture box | Not available
Loizou et al., 2013 [8,9] | Wound healing | Snake (segmentation), texture features (healing) | 77 images, 11 subjects | Wound region only, polarized filter, dark room | Not available
Wannous et al., 2007-2011 [10-12] | WS | Mean shift, JSEG | 25 | With background, controlled conditions | 73.3-80.2% (granulation tissue), 56.4-69.8% (slough tissue), 64.9-70.7% (necrosis tissue)
Hettiarachchi et al., 2013 [13] | WS | AC | 20 | Wound region only, controlled conditions | 90.0% accuracy
Veredas et al., 2010 [14] | WS and tissue characterization | Mean shift and region growing | 113 | Wound region only, white background | 78.7% sensitivity, 94.7% specificity, 91.5% accuracy
Hani et al., 2012 [15] | WS | ICA and K-means | 30 | Wound region only | 88.2% sensitivity, 98.8% specificity
Perez et al., 2001 [16] | WS and tissue analysis | Region growing and RGBSI analysis | Not available | Leg ulcers only | Not available
Wantanajittikul et al., 2011 [17] | Burn segmentation, degree identification | FCM and morphology, texture analysis | 5 | Burn cases only | 92% PPV, 84% sensitivity
Song and Sacan, 2012 [18] | WS, identification | Edge detection, K-means clustering, thresholding, region growing | 92 images, 14 patients | Wound region manually traced | Not available
Kolesnik and Fexa, 2004-2006 [19,20] | WS, classification | SVM, texture analysis, deformable Snake | 50 | Camera surrounded by a ring of LED lights | Error rate of 6.56% (color), 22.16% (texture), 5.8% (color & texture)
Silveira et al., 2009 [21] | Skin melanoma segmentation | AT, GVF, AS, C-LS, EM-LS, FBSM | 100 | Corner and hair removal | 12.63 Hammoude distance, 95.47% true detection rate, 36.90 Hausdorff distance


segmentation method for each specific kind of wound. The wound geometry is then accurately calculated and, via calibration against real-world values, the program provides the actual area and boundaries of the wound. In the second step, the bioprinter robot, using appropriate materials, prints cells at the desired location. Fig. 2 presents a workflow diagram illustrating the overall pipeline of the proposed method.

Obtaining and preparing wound images

Acquiring wound images: While many image databases are publicly available, e.g., the Dermquest Atlas [33] with more than 22,000 images, or the Brazilian Dermatology Atlas [34], the existing databases are not suitable for bioprinting for one or more of the following reasons: low resolution, lighting artefacts, and the absence of appropriate indicators of wound dimensions. Therefore, in cooperation with the Wound and Ostomy Clinic of Erfan Hospital of Tehran, we collected a data set of chronic wounds containing 26 photos of diabetic foot wounds, burns, and eschar. All photos were taken with a Canon 6D digital camera at a resolution of 5472 x 3648 (20 megapixels), saved as RAW files, and then down-sampled to 1368 x 912. While taking the photos, a ruler was placed beside and parallel to the wound surface in order to obtain its precise scale and dimensions. Fig. 3 presents sample images along with wound types and patient details. All results are reported on the images of the Erfan data set.

Pre-processing of images

Image intensity normalization: For various reasons, e.g., glare and shadows, the range of pixel intensity values may vary across images. A linear normalization procedure eliminates these effects by applying the equation

N(i, j) = (I(i, j) − Gmin) / (Gmax − Gmin)

to each color channel separately, where N is the normalized image, I is the intensity prior to normalization, and Gmax and Gmin are the maximum and minimum intensity levels in the channel, respectively.
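This per-channel min-max normalization can be sketched in a few lines (a pure-Python illustration rescaling to [0, 1]; the paper's own code is not published, so the function and variable names here are our own):

```python
def normalize_channel(channel):
    """Linearly rescale one color channel to the [0, 1] range.

    channel: 2D list of intensity values (one color channel).
    Implements N = (I - Gmin) / (Gmax - Gmin) per pixel.
    """
    flat = [v for row in channel for v in row]
    g_min, g_max = min(flat), max(flat)
    span = g_max - g_min
    if span == 0:  # constant channel: nothing to stretch
        return [[0.0 for _ in row] for row in channel]
    return [[(v - g_min) / span for v in row] for row in channel]
```

Applied to each of the R, G, and B channels independently, this maps the darkest pixel to 0 and the brightest to 1, removing global brightness offsets between images.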

Denoising: The main filter used for image denoising is a Gaussian blurring filter with a standard deviation of σ = 0.5, shown to be very effective in removing camera noise for wound analysis [7]. However, if salt-and-pepper noise is present (due to image capture), the operator can apply a median filter instead.
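As an illustration of the denoising step, a separable Gaussian blur with σ = 0.5 can be written in pure Python (a didactic stand-in for the library filters one would normally use; the kernel radius and edge clamping are our own choices):

```python
import math

def gaussian_kernel(sigma, radius=None):
    """Sampled, normalized 1D Gaussian kernel."""
    if radius is None:
        radius = max(1, int(3 * sigma))  # cover ~3 standard deviations
    weights = [math.exp(-(i * i) / (2 * sigma * sigma))
               for i in range(-radius, radius + 1)]
    total = sum(weights)
    return [w / total for w in weights]

def gaussian_blur(image, sigma=0.5):
    """Separable Gaussian blur of a 2D list, clamping at the borders."""
    kernel = gaussian_kernel(sigma)
    r = len(kernel) // 2

    def convolve_rows(img):
        h, w = len(img), len(img[0])
        out = []
        for y in range(h):
            row = []
            for x in range(w):
                acc = 0.0
                for k, wgt in enumerate(kernel):
                    xx = min(max(x + k - r, 0), w - 1)  # clamp to edge
                    acc += wgt * img[y][xx]
                row.append(acc)
            out.append(row)
        return out

    blurred = convolve_rows(image)                      # blur along x
    transposed = [list(col) for col in zip(*blurred)]
    blurred = convolve_rows(transposed)                 # blur along y
    return [list(col) for col in zip(*blurred)]
```

Because the Gaussian is separable, two 1D passes reproduce the 2D filter at much lower cost; with σ = 0.5 the kernel is only three taps wide, so fine wound-edge detail is largely preserved.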

Segmentation of wounds in the 2D plane

Seven popular algorithms were implemented and applied to all images. Ground-truth delineations were produced by a wound clinician, who used an optical pen to draw the desired wound borders in the designed GUI.

Region-based methods

Region growing [35]: Wound segmentation can be obtained based on the intensity similarity of wound pixels. First, a set of seed points P within the desired wound is chosen by the clinician. Then, for each pixel in P, each of its k neighbors q (k = 4 or 8) is examined: if the intensity Iq lies within M ± 3σ, where M and σ are the mean and standard deviation of the intensities of the pixels already selected, then q is added to P. This step is repeated for all pixels in P until no new point is added and all pixels of the desired wound region have been collected.
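The region-growing loop described above can be sketched as a breadth-first traversal (an illustrative pure-Python version; the connectivity handling and per-step statistics update are our own choices, not the paper's implementation):

```python
from collections import deque
from statistics import mean, pstdev

def region_grow(image, seeds, connectivity=4):
    """Grow a region from seed pixels by intensity similarity.

    A neighbor q joins the region when its intensity lies within
    mean +/- 3*std of the intensities already in the region, as
    described in the text. seeds is a list of (y, x) tuples.
    """
    h, w = len(image), len(image[0])
    offsets = [(-1, 0), (1, 0), (0, -1), (0, 1)]
    if connectivity == 8:
        offsets += [(-1, -1), (-1, 1), (1, -1), (1, 1)]

    region = set(seeds)
    queue = deque(seeds)
    while queue:
        y, x = queue.popleft()
        # statistics of the pixels selected so far (recomputed for clarity)
        vals = [image[py][px] for py, px in region]
        m, s = mean(vals), pstdev(vals)
        for dy, dx in offsets:
            ny, nx = y + dy, x + dx
            if 0 <= ny < h and 0 <= nx < w and (ny, nx) not in region:
                if abs(image[ny][nx] - m) <= 3 * s:
                    region.add((ny, nx))
                    queue.append((ny, nx))
    return region
```

Recomputing the mean and standard deviation on every step is quadratic and would be replaced by running sums in practice; the sketch favors readability over speed.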

Active contours without edges (Chan-Vese (CV) model) [36]: This algorithm, unlike the classical Snake model, detects objects based on region similarities rather than gradients. Assuming an evolving curve represented by ϕ as an initial level set function, and c1 and c2 as the constant average intensities inside and outside of ϕ in the image I, an energy function is defined as:

E(c1, c2, ϕ) = µ ∫ |∇H(ϕ)| dx + λ1 ∫ |I − c1|² H(ϕ) dx + λ2 ∫ |I − c2|² (1 − H(ϕ)) dx   (1)

in which µ = 47 and λ1 = λ2 = 0.43 are positive weighting parameters, chosen empirically, and H(ϕ) is a regularized Heaviside function. The first term ensures smooth boundaries, while the last two terms fit the level set to the image pixel data, averaged over all RGB channels. The energy function is minimized using an iterative two-step algorithm in which c1 and c2 are first computed while ϕ is held fixed, and then ϕ is updated [36].
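The alternating two-step minimization can be illustrated in its simplest form by dropping the curvature term (the µ = 0 special case) and working with a binary partition instead of a level set: each iteration updates the region means c1 and c2, then reassigns every pixel to the nearer mean. This is a hypothetical sketch of the scheme, not the authors' implementation:

```python
def chan_vese_two_step(image, mask, iterations=20):
    """Simplified two-phase Chan-Vese iteration on a 2D list.

    Alternates the two steps from the text: (1) update the means
    c1/c2 with the partition fixed; (2) reassign each pixel to the
    phase whose mean it is closer to. The boundary-smoothness term
    and the level-set update are omitted for clarity.
    """
    h, w = len(image), len(image[0])
    for _ in range(iterations):
        inside = [image[y][x] for y in range(h) for x in range(w) if mask[y][x]]
        outside = [image[y][x] for y in range(h) for x in range(w) if not mask[y][x]]
        if not inside or not outside:
            break
        c1 = sum(inside) / len(inside)    # mean inside the contour
        c2 = sum(outside) / len(outside)  # mean outside the contour
        new_mask = [[(image[y][x] - c1) ** 2 <= (image[y][x] - c2) ** 2
                     for x in range(w)] for y in range(h)]
        if new_mask == mask:  # converged
            break
        mask = new_mask
    return mask
```

Even from a poor initialization covering a single wound pixel, the alternation pulls the partition toward the two dominant intensity populations; the full model additionally penalizes boundary length so the contour stays smooth.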

Edge-based methods

Most chronic wounds have clear, distinguishable boundaries, which makes it possible to perform the segmentation based on the edges in the images. We compared three edge-based algorithms:

Edge and morphological operations: This method is based on the direct use of image gradients. First, image gradients are obtained using a Sobel edge detector. Then, using morphological operations and structuring elements, e.g., diamond-shaped and linear, the obtained edges are dilated, unwanted margins are removed via thresholding, and the final image is eroded and smoothed [35].

Level set method preserving the distance function [37]: This algorithm is a level set method, which preserves the signed

Fig. 2. Workflow diagram of the proposed study.

Diabetic, Granulation & Slough, F/63; Diabetic, Granulation, M/45; Metabolic, Necrosis, F/55; Burn, Granulation, M/17

Fig. 3. Samples of the taken photos along with details of wound types and tissues, and the patients' gender/age.


distance function (SDF), which is necessary to accurately estimate geometric features and curvatures of complex objects like wounds. The augmented Lagrangian energy function is defined as:

E(u, q, p, φ, Λ) = µ ∫ ω_b |q| dx + λ1 (u − H(φ)) + (r1/2) (u − H(φ))² + λ2 · (q − ∇u) + (r2/2) |q − ∇u|² + λ3 · (p − ∇φ) + (r3/2) |p − ∇φ|² + λ4 (|p| − 1) + (r4/2) (|p| − 1)²   (2)

in which ϕ is the SDF, φ is the non-linear term, u = H(φ), p = ∇φ, q = ∇u, ω_b is a boundary detection function, and Λ = (λ1, λ2, λ3, λ4) are the Lagrangian multipliers. µ, the weight of the edge term relative to the region term, was set to 0.3 empirically. An approximate solution is computed by iteratively alternating the minimization of the energy function with respect to each variable while keeping the others fixed [37]. This method is sensitive to the size of the initial window, so the images first need to be cropped.

Livewire: Some chronic wounds have outlier edges, e.g., detached skin, which do not delimit the actual wound area and therefore require supervision for a precise segmentation. The Livewire algorithm, also known as Intelligent Scissors [38], enables users to discard such outlier edges. First, a seed point is chosen on the wound border. The objective is to determine a path from the seed to the mouse-controlled pointer that follows the object boundaries. As the user moves the pointer, the wound boundaries are extracted. The main local cost function is defined as [38]:

l(p, q) = ωZ fZ(q) + ωD fD(p, q) + ωG fG(q)   (3)

where fZ, fD, and fG are the Laplacian zero-crossing, gradient direction, and gradient magnitude cost terms, respectively, representing different aspects of edge features. The weights of the corresponding features were chosen empirically as ωZ = 0.2, ωD = 0.3, and ωG = 0.8. The image edges were obtained using a Canny edge detector. The minimum-cost path was found with the Dijkstra algorithm [38].
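The shortest-path search underlying Livewire can be sketched with Dijkstra's algorithm on the pixel grid; here a precomputed per-pixel cost map stands in for the full local cost l(p, q):

```python
import heapq

def livewire_path(cost, seed, target):
    """Dijkstra shortest path between two pixels on a 2D cost map.

    cost[y][x] is the price of stepping onto pixel (y, x); low costs
    should lie along image edges so the cheapest path hugs the wound
    boundary. Returns the list of (y, x) pixels from seed to target.
    """
    h, w = len(cost), len(cost[0])
    dist = {seed: 0.0}
    prev = {}
    heap = [(0.0, seed)]
    while heap:
        d, (y, x) = heapq.heappop(heap)
        if (y, x) == target:
            break
        if d > dist[(y, x)]:
            continue  # stale heap entry
        for dy, dx in ((-1, 0), (1, 0), (0, -1), (0, 1)):
            ny, nx = y + dy, x + dx
            if 0 <= ny < h and 0 <= nx < w:
                nd = d + cost[ny][nx]
                if nd < dist.get((ny, nx), float("inf")):
                    dist[(ny, nx)] = nd
                    prev[(ny, nx)] = (y, x)
                    heapq.heappush(heap, (nd, (ny, nx)))
    # walk back from target to seed
    path, node = [target], target
    while node != seed:
        node = prev[node]
        path.append(node)
    return path[::-1]
```

In the interactive tool the target is the current mouse position, so the path is recomputed (or read from the cached distance map) as the pointer moves, and the user commits segments by dropping new seeds.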

Parametric active contour models (Snakes): A Snake is a deformable energy-minimizing curve with the following energy functional [39]:

E_snake = ∫₀¹ [E_int(v(s)) + E_image(v(s)) + E_con(v(s))] ds   (4)

E_ext(x, y) = −|∇(G_σ(x, y) ∗ I(x, y))|²   (5)

E_int = ½ [α(s) |v_s(s)|² + β(s) |v_ss(s)|²]   (6)

where α and β were set to 0.4 and 0.2 empirically. Edges were obtained on the grayscale intensity image. The clinician initializes the Snake contour, and, using (4), the contour evolves to match image features. The Viterbi algorithm was used for optimization [40].

Texture-based methods

Houhou-Thiran-Bresson model (HTB): Wound tissue images usually have specific textures, which is a prominent feature for distinguishing the wound region from the other parts of the image and extracting the wound geometry. An intrinsic texture descriptor, which describes the geometry of textures using semi-local image information and tools from differential geometry [41], is defined as:

F = exp(−det(g_xy) / σ²)

in which g_xy is the corresponding metric tensor and σ > 0 is a scaling parameter. Using the Kullback-Leibler (KL) distance between probability density functions (pdfs), an active contour model that distinguishes the background from the desired textures can be designed by minimizing the following energy function:

E = ∫ [ q_in(F, Ω) log( q_in(F, Ω) / q_out(F, Ω) ) + q_out(F, Ω) log( q_out(F, Ω) / q_in(F, Ω) ) ] dF + λ ∮_∂Ω ds   (7)

where q_in and q_out are the pdfs of the descriptor inside and outside the evolving region Ω, respectively, and λ = 1.75 is the empirically chosen coefficient regularizing the evolving curve. The first term is the KL distance and the second is a curvature regularization term. Finally, the Split-Bregman method [42] was used for optimization.

Segmentation evaluation

Statistical significance: Student's t-test was performed on each pair of segmentation algorithms under the null hypothesis that the accuracies of the two methods are the same. The tests were carried out at a significance level of 0.05.

Performance and correctness measures of segmentation: By intersecting a segmented image with the corresponding ground truth, the four relative outcome states, i.e., true positive, false positive, true negative, and false negative, were obtained, from which the four validation indicators of precision, sensitivity, specificity, and accuracy were defined. In addition, the Jaccard index (JI) [43], Dice similarity coefficient (DS) [44], and Hausdorff distance (HD) [45] were used to evaluate the performance of the segmentation.
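The overlap-based indicators reduce to simple counts over the two binary masks; a minimal sketch follows (Hausdorff distance, which compares boundary point sets rather than overlap counts, is omitted):

```python
def segmentation_metrics(pred, truth):
    """Overlap metrics between two binary masks (lists of 0/1 rows).

    Returns accuracy, sensitivity, specificity, Jaccard index, and
    Dice coefficient as fractions in [0, 1].
    """
    tp = fp = tn = fn = 0
    for prow, trow in zip(pred, truth):
        for p, t in zip(prow, trow):
            if p and t:
                tp += 1          # wound pixel correctly found
            elif p and not t:
                fp += 1          # background labeled as wound
            elif not p and t:
                fn += 1          # wound pixel missed
            else:
                tn += 1          # background correctly rejected
    return {
        "accuracy": (tp + tn) / (tp + tn + fp + fn),
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "jaccard": tp / (tp + fp + fn),
        "dice": 2 * tp / (2 * tp + fp + fn),
    }
```

Note that Jaccard and Dice ignore the true negatives, which is why they penalize over- and under-segmentation more sharply than accuracy on images dominated by background.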

Calibrating the segmented image with a real object

In order to scale a segmented image to the real object size, a ruler sticker [8] is placed on the imaging plane, parallel to the surface of the wound, as the size reference, with the camera held perpendicular to the wound surface. To calibrate the physical dimensions of the pixels, an automated procedure is used: First, unnecessary image details and the wound are removed via thresholding, keeping the ruler lines. Then the remaining blobs in the image are labeled, sorted, and categorized. Finally, by finding the distance between the largest blobs in the image, the number of pixels per centimeter is obtained, from which the obtained coordinates and area are calibrated. Fig. 4 presents the calibration procedure for a sample wound image.
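Once the pixels-per-centimeter ratio is recovered from the ruler, the conversion to physical units is simple arithmetic; the following hypothetical helper reproduces the example values given in Fig. 4:

```python
def calibrate(px_per_cm, length_px, width_px, area_px):
    """Convert pixel measurements to physical units.

    px_per_cm: number of pixels spanning one centimeter on the ruler.
    Lengths scale by px_per_cm; areas scale by its square.
    Returns (length_cm, width_cm, area_cm2).
    """
    length_cm = length_px / px_per_cm
    width_cm = width_px / px_per_cm
    area_cm2 = area_px / (px_per_cm ** 2)
    return length_cm, width_cm, area_cm2
```

With the Fig. 4 values (492.2 px/cm; 1598.07 px length, 891.09 px width, 1,280,919 px area) this yields approximately 3.25 cm, 1.81 cm, and 5.29 cm², matching the figure.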

Setting the depth of the 3D coordinates for 3D printing

The segmented regions provide wound coordinates on a 2D plane. However, 3D bioprinting requires 3D wound coordinates (length, width, and depth). For the current proof-of-concept stage of this study, i.e., for evaluating different bioprinting parts, such as the bio-ink features, e.g., elasticity,

Fig. 4. Calibrating the segmented image with a real object. In this example, one centimeter corresponds to 492.2 pixels in the image; thus, each pixel covers 412.78 µm². The length of the wound is 3.246 cm (1598.07 pixels), its width is 1.811 cm (891.09 pixels), and its area is 1,280,919 pixels, which equals 5.2874 cm². (a) Binarized image in the red channel; (b) primary image with the detected ruler and obtained scale.


accuracy, and the performance of the bioprinter and the efficacy of the segmentation algorithms, the wound depth was considered uniform. The uniform values were set based on an approximation performed by a clinician using the most common clinical wound-depth measuring method [46]: a cotton-tipped applicator is placed into the deepest area of the wound, and a mark is placed on the applicator at the skin level. The applicator is then held against a metric ruler to measure the wound depth [46,47]. If the depth varies, measurements are taken in different areas, and the recorded depth is that of the deepest measured area of the wound [47].

Converting the obtained coordinates to the numerical control programming language

The obtained coordinates must be converted into a numerical control programming language to let the bioprinter move along the specified paths and print the desired pattern. Currently, the bioprinter robot in the Biomedical Engineering Department of Amirkabir University (Fig. 5) runs on the Mach3 [48] control software and requires G-code commands to move and dispense cells. G-code is the general name for numerical control programming; it is widely used in 3D printing and controls how instruments, accessories, and machine equipment move.

Most G-code generator programs require the STL (STereoLithography) file format. To save the obtained geometries into STL files, the open-source MATLAB code in [49] was used without modification. Fig. 7(c) presents the 3D form of a simulated STL file. G-code was generated from the simulated STL files using the MankatiUM 6.5.3 software [50].
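For intuition, the final conversion step can be illustrated by emitting G-code for a 2D contour directly (a minimal hypothetical generator; the actual pipeline goes through STL files and the MankatiUM slicer as described above):

```python
def contour_to_gcode(points, feed_rate=300):
    """Emit G-code tracing a closed polygon of (x, y) points in mm.

    G0 rapid-moves to the start point, then G1 linear moves trace
    the boundary at the given feed rate and close the loop.
    """
    lines = ["G21 ; units: millimeters", "G90 ; absolute positioning"]
    x0, y0 = points[0]
    lines.append(f"G0 X{x0:.3f} Y{y0:.3f}")
    for x, y in points[1:]:
        lines.append(f"G1 X{x:.3f} Y{y:.3f} F{feed_rate}")
    lines.append(f"G1 X{x0:.3f} Y{y0:.3f} F{feed_rate} ; close the loop")
    return "\n".join(lines)
```

A real bioprinting program would add Z moves per layer and extrusion (dispensing) commands synchronized with the XY motion, but the boundary trace above is the core of converting a segmented wound outline into machine motion.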

Bio-printing materials and synthesis

Sodium alginate (alginic acid sodium salt from brown algae), calcium chloride dihydrate, and gelatin from bovine skin were obtained from Sigma-Aldrich. To address the weak interaction between the cells and the matrix and the limited cell proliferation, we added gelatin to the composition of our bio-ink. Gelatin is obtained by partial hydrolysis of the triple-helix structure of collagen into single-strand molecules [51]. Gelatin is therefore the biopolymer most similar to collagen, and it can mimic the extracellular matrix and enhance the cell adhesion properties of the structure at a low cost.

Alginate-gelatin hydrogel was synthesized by dissolving 16% (w/v) sodium alginate and 4% (w/v) gelatin in deionized water. A high hydrogel viscosity was chosen for better resolution and optimized cell viability.

We printed the described geometries in a dish using the bio-ink hydrogel, which consisted of alginate and gelatin. For better adhesion of the first layer to the dish, the dish was washed with deionized water and dried with pressurized air. The sodium alginate-gelatin blend was printed using a pneumatic dispensing nozzle. Ionic gelation took place by soaking the bioprinted gel in a 4% (w/v) calcium chloride solution for 5 min after the process. The crosslinking density was adjusted to preserve the printed gel geometry within the shortest possible time. The dish was then filled with deionized water to keep the sample from dehydrating.

IV. RESULTS

Experimental setup

The bioprinting experimental setup consisted of a custom-made bioprinter, a hydrogel dispensing module, and PC control software. The bioprinter machine itself consisted of a 3-axis motion control stage with a hydrogel injection module mounted on its head. Using the PC control software, the tool path was translated into XYZ coordinates, guiding the motor drivers to move along the three axes. The dispensing module was composed of a filled syringe and a pump connected through a sealed tube. The deposition rate of the viscous alginate-gelatin solution was controlled by maintaining a steady air pressure behind the hydrogel. Calcium chloride for ionic gelation was added manually using a 5 ml syringe.

Segmentation evaluation

Performance and correctness measures of segmentation: Fig. 6 shows the performance of the different image segmentation methods on two sample images, and the evaluation results are summarized in Table II. Livewire achieved the highest accuracy. All of the methods were sufficiently sensitive and therefore had a low type II error; the results also show that Livewire, with a specificity of 96.67%, had the lowest type I error rate. Furthermore, the Jaccard Index and Dice Similarity Coefficient were significantly higher, and the Hausdorff Distance significantly lower, for Livewire, indicating the best overall performance.

Fig. 5. The bioprinter robot in the Biomedical Engineering Department of Amirkabir University.

TABLE II. RESULTS OF THE SEGMENTATION EVALUATION INDICATOR TESTS. THE VALUES ARE THE AVERAGES OF EACH MEASURE ACROSS THE DATA SET.

Algorithm      | Precision | Accuracy | Sensitivity | Specificity | Jaccard Index | Dice Similarity | Hausdorff Distance | Variance
Edge detection | 84.53     | 88.14    | 99.96       | 89.75       | 89.14         | 94.26           | 46.63              | 32.12
Region growing | 83.49     | 89.22    | 100         | 87.67       | 87.63         | 93.38           | 52.04              | 28.65
Livewire       | 97.44     | 97.08    | 99.83       | 96.67       | 96.22         | 98.15           | 32.26              | 8.65
Snakes         | 92.12     | 93.74    | 99.68       | 92.47       | 88.21         | 93.74           | 56.65              | 6.24
LS             | 98.77     | 94.54    | 99.71       | 92.89       | 88.93         | 94.14           | 54.66              | 10.37
HTB model      | 97.42     | 87.50    | 98.37       | 86.93       | 71.84         | 83.61           | 132.23             | 28.11
CV model       | 98.44     | 95.26    | 98.95       | 87.41       | 70.72         | 82.85           | 127.78             | 34.19

Fig. 6. Segmentation results of the different methods and the ground truth. Blue, green, red, yellow, and black correspond to the region growing, edge detection, Snakes, Livewire, and ground truth results, respectively.
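A minimal sketch of how the overlap-based indicators reported here can be computed from binary masks (pure Python for clarity; a practical evaluation would operate on image arrays):

```python
def confusion_counts(pred, truth):
    """Pixel-wise TP/FP/TN/FN for binary masks given as flat 0/1 lists."""
    tp = sum(1 for p, t in zip(pred, truth) if p and t)
    fp = sum(1 for p, t in zip(pred, truth) if p and not t)
    fn = sum(1 for p, t in zip(pred, truth) if not p and t)
    tn = sum(1 for p, t in zip(pred, truth) if not p and not t)
    return tp, fp, tn, fn

def segmentation_metrics(pred, truth):
    tp, fp, tn, fn = confusion_counts(pred, truth)
    return {
        "accuracy":    (tp + tn) / (tp + fp + tn + fn),
        "sensitivity": tp / (tp + fn),      # high sensitivity = low type II error
        "specificity": tn / (tn + fp),      # high specificity = low type I error
        "jaccard":     tp / (tp + fp + fn),
        "dice":        2 * tp / (2 * tp + fp + fn),
    }
```

The Hausdorff Distance is boundary-based rather than overlap-based, so it is computed from the contour point sets instead of the confusion counts.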

Computation cost and user interaction: Table III presents the average elapsed time and user interaction. These measures are not straightforward to evaluate, since different implementations of the same method may reduce user interaction (e.g., through auto-initialization) at the expense of accuracy; what we report is our experience with the implementations described above. Snakes is the most time-consuming method, as it must be carefully initialized to avoid getting stuck in erroneous local minima. The interactive Livewire, by contrast, demands little computation (thanks to its efficient dynamic-programming optimization) but needs more user interaction than the other methods to ensure a proper delineation. For bioprinting, accuracy matters more than computational cost, so Livewire's higher accuracy outweighs the extra user interaction it requires.

Statistical significance: Paired t-tests between all pairs of implemented algorithms returned p < 0.05 and h = 1, where h is the hypothesis-test result returned as a logical value. This rejects, at the 0.05 significance level, the null hypothesis that the evaluation indicators of the compared segmentation methods are the same.
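The paired test can be sketched as follows. The per-image scores are hypothetical, and the critical value is taken from a standard t table; in practice MATLAB's ttest or scipy.stats.ttest_rel would report the exact p-value:

```python
import math
from statistics import mean, stdev

def paired_t(a, b):
    """Paired t statistic for two matched samples, e.g., per-image
    accuracies of two segmentation methods on the same images."""
    d = [x - y for x, y in zip(a, b)]
    return mean(d) / (stdev(d) / math.sqrt(len(d)))

# Two-tailed critical value at alpha = 0.05 for df = 4 (from a t table)
T_CRIT_DF4 = 2.776

livewire = [97, 96, 98, 99, 97]   # hypothetical per-image accuracies
snakes   = [96, 94, 95, 95, 92]
t = paired_t(livewire, snakes)
reject_null = abs(t) > T_CRIT_DF4  # h = 1 in the paper's notation
```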

Bioprinting results

The bioprinter robot was able to print skin cells on the skin surface from the validated G-code. In this study, the average pixel size after calibration was about 20 μm. Fig. 7(e) and 7(f) illustrate sample results. To evaluate the similarity between the bioprinted patches and the corresponding wounds, a clinician measured the actual wound dimensions, i.e., width and height, with a scaled ruler. Comparing the width and height of two bioprinted patch instances with the corresponding wound dimensions demonstrated an average similarity of 95.56% between the two.
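The text does not state the formula behind the 95.56% figure; one plausible definition, offered here purely as an assumption, is a relative-error similarity per dimension, averaged over the measured dimensions:

```python
def dimension_similarity(printed_mm, actual_mm):
    """Percent similarity between a printed dimension and the measured
    wound dimension. ASSUMED definition (relative error subtracted from
    100%); the exact formula used in the study is not stated."""
    return 100.0 * (1.0 - abs(printed_mm - actual_mm) / actual_mm)

# e.g., a 24.0 mm printed width against a 25.0 mm measured wound width
sim = dimension_similarity(24.0, 25.0)
```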

V. DISCUSSION

The objective of this study was to provide a proof of concept for a segmentation and bioprinting tool that facilitates chronic wound treatment. The semi-automatic segmentation enables users to extract wound boundaries from images using prior knowledge about wounds and bioprinting. The average performance values for Livewire are higher than those of most of the previous works in Table I, indicating that, despite the current limitations, the proposed method is an appropriate choice for this application.

Human cells typically measure 20-30 μm [52]. Since the bioprinter robot does not use a single-cell delivery technique, it moves and dispenses bio-ink with a precision of 100 μm; segmentation results must therefore be at least that precise. Coarser segmentation and depth resolution would yield coarser coordinates for the STL model and 3D geometry, and the bioprinter would consequently fail to move over and dispense on the entire wound area, especially for wounds with complex, detailed edges. In this study, the average pixel size after calibration was about 20 μm, meaning the proposed method can resolve 20 μm intervals on the wound surface. This comfortably satisfies the bioprinter's requirements and confirms the proposed method's capability to print accurate patches.
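The resolution argument can be made concrete: segmentation coordinates at roughly 20 μm/pixel are quantized to the robot's 100 μm motion grid, so the pixel size must not exceed the stage step. The grid-snapping strategy below is our illustration, not the robot's documented behavior:

```python
PIXEL_SIZE_UM = 20    # calibrated pixel size reported in the text
STAGE_STEP_UM = 100   # bioprinter motion precision reported in the text

def snap_to_stage_grid(points_um, stage_step_um=STAGE_STEP_UM):
    """Quantize segmentation coordinates (micrometres) to the nearest
    reachable position on the bioprinter's motion grid."""
    return [(round(x / stage_step_um) * stage_step_um,
             round(y / stage_step_um) * stage_step_um) for x, y in points_um]

# Segmentation must resolve at least the stage step for no detail to be lost.
assert PIXEL_SIZE_UM <= STAGE_STEP_UM
```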

Based on Table III, both Livewire and Snakes need more interaction than the other methods. With user-guided external forces, Snakes can reach a higher accuracy at the expense of still more user interaction, which we found too cumbersome and time-consuming for users; Livewire reaches the desired level of accuracy with less. Future work could involve automatic segmentation methods that are insensitive to initialization and whose energy cost terms are derived from large-scale training of machine learning algorithms.

This study did not include color calibration, as the proposed segmentation methods do not use color information to determine wound size and geometry. In future studies, tissue classification could identify the different tissue types and thus which cells to prepare for bioprinting the various layers of skin: by calibrating colors and using color and texture descriptors [10-12] or a Red-Yellow-Black-White (RYKW) map [6], wound tissues can be classified.
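A deliberately simplified illustration of the RYKW idea is nearest-reference-color assignment per pixel. The RGB anchors below are invented for illustration; the actual RYKW map of [6] is more sophisticated and relies on calibrated colors:

```python
# Illustrative reference colors for a Red-Yellow-Black-White tissue map.
# These RGB anchors are assumptions, not the calibrated values of [6].
RYKW = {
    "red":    (200, 30, 30),    # granulation tissue
    "yellow": (210, 190, 60),   # slough
    "black":  (30, 20, 20),     # necrotic tissue
    "white":  (230, 230, 225),  # macerated/epithelial tissue
}

def classify_pixel(rgb):
    """Assign a pixel to the nearest RYKW reference color
    (squared Euclidean distance in RGB space)."""
    def d2(c):
        return sum((a - b) ** 2 for a, b in zip(rgb, c))
    return min(RYKW, key=lambda name: d2(RYKW[name]))
```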

All the segmentation methods were applied by the same person and compared to a single ground truth. Recruiting more than one user to perform the segmentation, and more than one expert clinician to determine the ground truth, is a possibility for future work and would allow us to assess inter-user variability and agreement across clinicians [6].

TABLE III. COMPUTATION COST AND USER INTERACTION OF THE IMPLEMENTED SEGMENTATION METHODS. THE REPORTED NUMBERS m AND σ ARE THE AVERAGES OF THE MEAN AND VARIANCE, RESPECTIVELY, OVER ALL IMAGES.

Algorithm           | Total time (s) m / σ | Computation time (s) m / σ | User interaction (mouse clicks) m / σ
Edge detection      | 5.02 / 0.85          | 1.03 / 0.001               | 1 / 0.17
Region growing      | 11.34 / 0.48         | 3.47 / 0.01                | 2 / 0.60
Livewire            | 17.80 / 5.91         | 2.19 / 0.007               | 4 / 1.62
Snakes              | 28.55 / 11.38        | 10.99 / 0.01               | 10 / 5.32
LS (170 iterations) | 23.64 / 0.51         | 22.49 / 0.003              | 0 / 0
CV model            | 2.12 / 0.26          | 0.59 / 0.001               | 1 / 0.14
HTB model           | 3.01 / 0.33          | 1.18 / 0.009               | 1 / 0.19

Fig. 7. Procedure for bioprinting a 3D model of a chronic wound with Alg-Gel: (a) primary picture of the wound; (b) wound photograph with a ruler indicator, followed by image segmentation; (c) simulated 3D model; (d) STL file loaded into the bioprinter robot; (e) patch during printing (the transparent liquid surrounding the gel is calcium chloride that has not yet reacted with the alginate-gelatin); (f) reacted and hard-set Alg-Gel after injecting calcium chloride.

Although the data set used in this study demonstrates the performance of the proposed bioprinting pipeline, a thorough understanding of the proposed method's behavior on wounds of different shapes and types, and of the effect of wound geometry on the bioprinting results, calls for additional studies on larger data sets.

The ruler method used for calibration provides the physical dimensions of pixels on flat surfaces but cannot account for the curvature of an imaged wound, so the actual dimensions of wounds on convex surfaces would differ slightly. Instead of a ruler and controlled imaging conditions, a color checkerboard and a calibration algorithm, e.g., the Direct Linear Transformation [53], could be used to obtain a camera matrix and to study the effects of intrinsic and extrinsic camera parameters, as well as wound curvature, on bioprinting results.
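The core of the DLT can be sketched in a few lines: stack two linear equations per 3D-to-2D correspondence and take the null vector of the system via SVD. This is a sketch without the coordinate normalization and distortion handling a production calibration would add:

```python
import numpy as np

def dlt_camera_matrix(world_pts, image_pts):
    """Estimate the 3x4 camera projection matrix P from >= 6 world-to-
    image correspondences with the Direct Linear Transformation [53]."""
    A = []
    for (X, Y, Z), (u, v) in zip(world_pts, image_pts):
        A.append([X, Y, Z, 1, 0, 0, 0, 0, -u*X, -u*Y, -u*Z, -u])
        A.append([0, 0, 0, 0, X, Y, Z, 1, -v*X, -v*Y, -v*Z, -v])
    # Null vector = right singular vector of the smallest singular value.
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    return Vt[-1].reshape(3, 4)

def project(P, X):
    """Project a 3D point through P and dehomogenize."""
    x = P @ np.append(X, 1.0)
    return x[:2] / x[2]

# Synthetic sanity check: recover a known projection matrix exactly.
P_true = np.array([[800.0, 0, 320, 10], [0, 800, 240, 20], [0, 0, 1, 5]])
world = [(0.0, 0, 1), (1, 0, 2), (0, 1, 3), (1, 1, 1), (2, 1, 2), (1, 2, 3)]
image = [project(P_true, np.array(w)) for w in world]
P_est = dlt_camera_matrix(world, image)
max_err = max(np.abs(project(P_est, np.array(w)) - i).max()
              for w, i in zip(world, image))
```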

Although the preliminary measurements of the overlap between the bioprinted patches and the corresponding wound dimensions indicate a good match, a moulage phantom could be used to further assess the overlap between the bioprinting result and a wound imposed on the phantom. The comparison could also be performed pixel by pixel, by capturing an image of the bioprinted patch and matching each pixel to the corresponding pixel in the wound image.

Before placing a bioprinted patch on skin, it is important to fully prepare the wound bed through a procedure called debridement: any necrotic tissue or foreign material must be removed from the areas around the wound to increase the chances of healing. Debridement is equally important before image segmentation, because the open wound bed cannot be observed and assessed effectively in the presence of necrotic tissue [54].

The current, preliminary stage of this study was retrospective, and only mono (not stereo) image sets were available. Wound depth was therefore unavailable and was assumed uniform, which is a limitation of the proposed method, although the bio-ink, owing to its deformability, can completely fill the wound volume. We expect future research to deploy a strategy for precisely obtaining the third dimension of wounds. The most common method uses a description of the tissues involved rather than an actual measurement [55]. Another method is 3D reconstruction from stereo images: structure-from-motion (SfM) algorithms can obtain a dense estimate of the surface geometry from two widely separated views. However, several wound image characteristics, e.g., uniform texture and small scale and baseline, make it challenging for SfM to find correspondences and ensure proper image matching. Although these challenges can be partially addressed by incorporating wound-specific shape priors extracted from pre-operative images [56], this approach is expensive, needs special equipment (synchronization of two cameras, keeping the patient still, etc.), and requires a robust image processing chain [57]. A further alternative is to replace the depth values with those obtained from a laser depth-detecting sensor [52], which is more accurate and computationally cheaper, but more expensive in hardware.
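The stereo alternative rests on the classic pinhole relation Z = fB/d; the toy helper below makes the baseline sensitivity explicit (depth is inversely proportional to disparity, so the small baselines typical of handheld wound photos yield noisy depth). The parameter values in the example are invented:

```python
def depth_from_disparity(disparity_px, focal_px, baseline_mm):
    """Pinhole stereo depth: Z = f * B / d, with focal length f in
    pixels, baseline B in mm, and disparity d in pixels."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_mm / disparity_px

# Hypothetical rig: 800 px focal length, 50 mm baseline, 40 px disparity
z_mm = depth_from_disparity(40, 800, 50)
```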

VI. CONCLUSION

We presented a novel approach for healing chronic wounds. The semi-automatic image segmentation method gives clinicians much greater control by providing more accurate coordinates for bioprinting. Based on the results in Table II, Livewire is the most effective of the tested algorithms, with 97.08% accuracy, 99.83% sensitivity, 96.67% specificity, a Jaccard Index of 96.22, a Dice Similarity Coefficient of 98.15, and a Hausdorff Distance of 32.26.

We used fully biocompatible materials for the cell-printing platform. The alginate-gelatin hydrogel, synthesized by dissolving 16% (w/v) sodium-alginate and 4% (w/v) gelatin in deionized water, was used as the biopolymer for cell encapsulation owing to its desirable properties, such as biocompatibility, low cost, and an easy gelation process.

The bioprinting results demonstrate a 95.56% similarity between the bioprinted patch dimensions and the desired wound geometries, representing a good match and overlap.

In future work, we plan to classify different wound tissues, use a more comprehensive data set, and improve the validation and calibration techniques to obtain more precise and reliable coordinates for bioprinting. We also plan to design a precise method for estimating wound depth.

ACKNOWLEDGMENT

The authors would like to thank the Erfan Hospital Wound and Ostomy clinic for cooperation in collecting the clinical data, as well as Mr. Rabbani and Mr. Babaei for providing bioprinting materials.

REFERENCES

[1] K. Jung, S. Covington, C. Sen, M. Januszyk, R. Kirsner, G. Gurtner and N. Shah, "Rapid identification of slow healing wounds," Wound Repair and Regeneration, vol. 24, no. 1, pp. 181-188, 2016.
[2] A. Dababneh and I. Ozbolat, "Bioprinting Technology: A Current State-of-the-Art Review," J. of Manufacturing Science and Eng., vol. 136, no. 6, p. 061016, 2014.
[3] M. Bilgin and Ü. Güneş, "A Comparison of 3 Wound Measurement Techniques," J. of Wound, Ostomy and Continence Nursing, vol. 40, no. 6, pp. 590-593, 2013.
[4] A. Shah, C. Wollak and J. Shah, "Wound Measurement Techniques: Comparing the Use of Ruler Method, 2D Imaging and 3D Scanner," J. of the American College of Clinical Wound Specialists, vol. 5, no. 3, pp. 52-57, 2013.
[5] J. Thatcher, J. Squiers, S. Kanick, D. King, Y. Lu, Y. Wang, R. Mohan, E. Sellke and J. DiMaio, "Imaging Techniques for Clinical Burn Assessment with a Focus on Multispectral Imaging," Advances in Wound Care, 2016.
[6] M. Fauzi, I. Khansa, K. Catignani, G. Gordillo, C. Sen and M. Gurcan, "Computerized segmentation and measurement of chronic wound images," Comput. in Biology and Medicine, vol. 60, pp. 74-85, 2015.
[7] L. Wang, P. Pedersen, D. Strong, B. Tulu, E. Agu and R. Ignotz, "Smartphone-Based Wound Assessment System for Patients With Diabetes," IEEE Trans. Biomed. Eng., vol. 62, no. 2, pp. 477-488, 2015.
[8] C. Loizou, T. Kasparis and M. Polyviou, "Evaluation of wound healing process based on texture image analysis," J. of Biomed. Graph. and Computing, vol. 3, no. 3, 2013.
[9] C. P. Loizou, T. Kasparis, O. Mitsi and M. Polyviou, "Evaluation of wound healing process based on texture analysis," in Proc. IEEE Int. Conf. Bioinf. Bioeng., pp. 709-714, 2012.
[10] H. Wannous, S. Treuillet and Y. Lucas, "Supervised tissue classification from color images for a complete wound assessment tool," in Proc. IEEE EMBS Int. Conf., pp. 6031-6034, 2007.
[11] H. Wannous, Y. Lucas, S. Treuillet and B. Albouy, "A complete 3D wound assessment tool for accurate tissue classification and measurement," in Proc. IEEE 15th Conf. Image Process., pp. 2928-2931, Oct. 2008.
[12] H. Wannous, Y. Lucas and S. Treuillet, "Enhanced Assessment of the Wound-Healing Process by Accurate Multiview Tissue Classification," IEEE Trans. Med. Imag., vol. 30, no. 2, pp. 315-326, 2011.
[13] N. D. J. Hettiarachchi, R. B. H. Mahindaratne, G. D. C. Mendis, H. T. Nanayakkara and N. D. Nanayakkara, "Mobile-based wound measurement," in Proc. IEEE Point-of-Care Healthcare Technol., pp. 298-301, 2013.
[14] F. Veredas, H. Mesa and L. Morente, "Binary tissue classification on wound images with neural networks and bayesian classifiers," IEEE Trans. Med. Imag., vol. 29, no. 2, pp. 410-427, 2010.
[15] A. F. M. Hani, L. Arshad, A. S. Malik, A. Jamil and F. Yap, "Haemoglobin distribution in ulcers for healing assessment," in Proc. Int. Conf. on Intelligent and Advanced Syst., pp. 362-367, 2012.
[16] A. A. Perez, A. Gonzaga and J. M. Alves, "Segmentation and analysis of leg ulcers color images," in Proc. Int. Workshop Med. Imaging and Augmented Reality, pp. 262-266, 2001.
[17] K. Wantanajittikul, N. Theera-Umpon, S. Auephanwiriyakul and T. Koanantakool, "Automatic segmentation and degree identification in burn color images," in Proc. Int. Conf. Biomed. Eng., pp. 169-173, 2011.
[18] B. Song and A. Sacan, "Automated wound identification system based on image segmentation and artificial neural networks," in Proc. IEEE Int. Conf. Bioinformatics and Biomedicine, pp. 1-4, 2012.
[19] M. Kolesnik and A. Fexa, "Segmentation of wounds in the combined color-texture feature space," in Proc. SPIE Med. Imaging, vol. 5370, pp. 549-556, 2004.
[20] M. Kolesnik and A. Fexa, "How robust is the SVM wound segmentation?," in Proc. NORDIC Signal Process. Symp., pp. 50-53, 2006.
[21] M. Silveira, J. Nascimento, J. Marques, A. Marcal, T. Mendonca, S. Yamauchi, J. Maeda and J. Rozeira, "Comparison of Segmentation Methods for Melanoma Diagnosis in Dermoscopy Images," IEEE J. Sel. Top. Signal Process., vol. 3, no. 1, pp. 35-45, 2009.
[22] A. Skardal, J. Zhang, L. McCoard, X. Xu, S. Oottamasathien and G. Prestwich, "Photocrosslinkable Hyaluronan-Gelatin Hydrogels for Two-Step Bioprinting," Tissue Eng. Part A, vol. 16, no. 8, pp. 2675-2685, 2010.
[23] S. Chen, Y. Wu, F. Mi, Y. Lin, L. Yu and H. Sung, "A novel pH-sensitive hydrogel composed of N,O-carboxymethyl chitosan and alginate cross-linked by genipin for protein drug delivery," J. of Controlled Release, vol. 96, no. 2, pp. 285-300, 2004.
[24] R. Jin, C. Hiemstra, Z. Zhong and J. Feijen, "Enzyme-mediated fast in situ formation of hydrogels from dextran-tyramine conjugates," Biomaterials, vol. 28, no. 18, pp. 2791-2800, 2007.
[25] S. Sershen, S. Westcott, N. Halas and J. West, "Temperature-sensitive polymer-nanoshell composites for photothermally modulated drug delivery," J. Biomed. Materials Res., vol. 51, no. 3, pp. 293-298, 2000.
[26] Z. Yang, H. Gu, D. Fu, P. Gao, J. Lam and B. Xu, "Enzymatic Formation of Supramolecular Hydrogels," Adv. Mater., vol. 16, no. 16, pp. 1440-1444, 2004.
[27] C. Kuo and P. Ma, "Ionically crosslinked alginate hydrogels as scaffolds for tissue engineering: Part 1. Structure, gelation rate and mechanical properties," Biomaterials, vol. 22, no. 6, pp. 511-521, 2001.
[28] D. Kirchmajer, R. Gorkin III and M. in het Panhuis, "An overview of the suitability of hydrogel-forming polymers for extrusion-based 3D-printing," J. Mater. Chem. B, vol. 3, no. 20, pp. 4105-4117, 2015.
[29] J. Malda et al., "25th Anniversary Article: Engineering Hydrogels for Biofabrication," Adv. Mater., vol. 25, no. 36, pp. 5011-5028, 2013.
[30] S. Wang, J. Lee and W. Yeong, "Smart hydrogels for 3D bioprinting," Int. J. Bioprinting, 2015.
[31] A. Augst, H. Kong and D. Mooney, "Alginate Hydrogels as Biomaterials," Macromol. Biosci., vol. 6, no. 8, pp. 623-633, 2006.
[32] J. Rowley, G. Madlambayan and D. Mooney, "Alginate hydrogels as synthetic extracellular matrix materials," Biomaterials, vol. 20, no. 1, pp. 45-53, 1999.
[33] DermQuest, 2015. [Online]. Available: http://www.dermquest.com/ [Accessed: 06-Nov-2015].
[34] Dermatology Atlas, 2016. [Online]. Available: http://www.atlasdermatologico.com.br [Accessed: 12-Nov-2015].
[35] R. C. Gonzalez and R. E. Woods, Digital Image Processing, 3rd ed. Reading, MA: Addison-Wesley, 2007.
[36] T. Chan and L. Vese, "Active contours without edges," IEEE Trans. Image Process., vol. 10, no. 2, pp. 266-277, 2001.
[37] V. Estellers, D. Zosso, R. Lai, S. Osher, J. Thiran and X. Bresson, "Efficient Algorithm for Level Set Method Preserving Distance Function," IEEE Trans. Image Process., vol. 21, no. 12, pp. 4722-4734, 2012.
[38] E. N. Mortensen and W. A. Barrett, "Intelligent scissors for image composition," in SIGGRAPH '95 Proc. 22nd Annu. Conf. on Comput. Graph. and Interactive Techniques, pp. 191-198, 1995.
[39] M. Kass, A. Witkin and D. Terzopoulos, "Snakes: Active contour models," Int. J. Comput. Vision, vol. 1, no. 4, pp. 321-331, 1988.
[40] A. Amini, T. Weymouth and R. Jain, "Using dynamic programming for solving variational problems in vision," IEEE Trans. Pattern Anal. Mach. Intell., vol. 12, no. 9, pp. 855-867, 1990.
[41] N. Houhou, J.-P. Thiran and X. Bresson, "Fast texture segmentation based on semi-local region descriptor and active contour," Numerical Mathematics: Theory, Methods and Applications, pp. 445-468, 2009.
[42] T. Goldstein, X. Bresson and S. Osher, "Geometric Applications of the Split Bregman Method: Segmentation and Surface Reconstruction," J. of Scientific Computing, vol. 45, no. 1-3, pp. 272-293, 2010.
[43] P. Jaccard, "The distribution of the flora in the alpine zone," New Phytologist, vol. 11, no. 2, pp. 37-50, 1912.
[44] L. Dice, "Measures of the Amount of Ecologic Association Between Species," Ecology, vol. 26, no. 3, pp. 297-302, 1945.
[45] D. Huttenlocher, G. Klanderman and W. Rucklidge, "Comparing images using the Hausdorff distance," IEEE Trans. Pattern Anal. Mach. Intell., vol. 15, no. 9, pp. 850-863, 1993.
[46] R. Bryant and D. Nix, Acute and Chronic Wounds, 5th ed. Elsevier Health Sciences, 2015.
[47] N. Morgan, Skin and Wound Resource Manual. Wound Care Education Institute, 2006.
[48] Mach3, version 3.043.062. Livermore Falls: ArtSoft USA, 2012.
[49] Sven, "stlwrite - Write binary or ascii STL file," 2015. [Online]. Available: http://www.mathworks.com/
[50] MankatiUM, version 6.5.3. Shanghai: Mankati, 2015.
[51] X. Shu, Y. Liu, F. Palumbo and G. Prestwich, "Disulfide-crosslinked hyaluronan-gelatin hydrogel films: a covalent mimic of the extracellular matrix for in vitro cell growth," Biomaterials, vol. 24, no. 21, pp. 3825-3834, 2003.
[52] K. W. Binder, "In situ bioprinting of the skin," Ph.D. dissertation, Molecular Genetics and Genomics, 2011.
[53] H. Hatze, "High-precision three-dimensional photogrammetric calibration and object space reconstruction using a modified DLT-approach," J. of Biomechanics, vol. 21, no. 7, pp. 533-538, 1988.
[54] J. S. Boateng, K. H. Matthews, H. N. Stevens and G. M. Eccleston, "Wound healing dressings and drug delivery systems: a review," J. of Pharmaceutical Sciences, vol. 97, no. 8, pp. 2892-2923, 2008.
[55] B. Yates, Merriman's Assessment of the Lower Limb. Edinburgh: Churchill Livingstone, 2012, p. 508.
[56] A. Amir-Khalili, J. M. G. Peyrat, G. Hamarneh and R. Abugharbieh, "3D surface reconstruction of organs using patient-specific shape priors in robot-assisted laparoscopic surgery," Abdominal Imaging. Computation and Clinical Applications, pp. 184-193, 2013.
[57] S. Treuillet, B. Albouy and Y. Lucas, "Three-dimensional assessment of skin wounds using a standard digital camera," IEEE Trans. Med. Imag., vol. 28, pp. 752-762, May 2009.