Ece1001 Lecture Slide Image 2006 3Hasan

Transcript of Ece1001 Lecture Slide Image 2006 3Hasan

Slide 1/44

    Overview of

    Digital Signal/Image Processing

    Mohammed A. Hasan

    Department of Electrical & Computer Engineering

    University of Minnesota-Duluth

    Email: [email protected]

Slide 2/44

    Related ECE Courses and Software:

    1. ECE 2111: Signals and Systems

    2. ECE 5741: Digital Signal Processing

    3. ECE 8741: Digital Image Processing

    4. Matlab

    Working knowledge in Statistics, Calculus, and Differential Equations is very helpful in understanding Image Processing.

Slide 3/44

    History

    Many of the techniques of digital image processing, or digital picture processing as it was often called, were developed in the 1960s at the Jet Propulsion Laboratory, MIT, Bell Labs, the University of Maryland, and a few other places, with application to satellite imagery, wirephoto standards conversion, medical imaging, videophone, character recognition, and photo enhancement. But the cost of processing was fairly high with the computing equipment of that era.

    In the 1970s, digital image processing proliferated, when cheaper computers and dedicated hardware became available. Images could then be processed in real time for some dedicated problems such as television standards conversion. As general-purpose computers became faster, they started to take over the role of dedicated hardware for all but the most specialized and compute-intensive operations.

    With the fast computers and signal processors available in the 2000s, digital image processing has become the most common form of image processing, and is generally used because it is not only the most versatile method, but also the cheapest.

Slide 4/44

    What is an Image

    1. An image f(x, y) is a 2-dimensional light intensity function, where f measures brightness at position (x, y).

    2. A digital image is a representation of an image by a 2-D array of discrete samples.

    3. The amplitude of each sample is represented by a finite number of bits.

    4. Each element of the array is called a pixel.
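
As a minimal illustration (not from the slides), a grayscale digital image can be held as a 2-D array of 8-bit samples; each entry is one pixel and its value is the brightness at that position.

```python
import numpy as np

# A tiny 4x4 grayscale "image": a 2-D array of discrete samples.
# dtype=uint8 gives each sample a finite number of bits (8), i.e. values 0..255.
f = np.array([[  0,  64, 128, 255],
              [ 32,  96, 160, 224],
              [ 16,  80, 144, 208],
              [  8,  72, 136, 200]], dtype=np.uint8)

print(f.shape)   # (4, 4): 4 rows by 4 columns
print(f[2, 3])   # brightness of the pixel in row 2, column 3 -> 208
```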

Slide 5/44

    Terminology

    Images: An image is a two-dimensional signal whose intensity at any point is a function of two spatial variables. Examples are photographs, still video images, radar and sonar signals, and chest and dental X-rays.

    An image sequence, such as that seen on a television, is a three-dimensional signal for which the image intensity at any point is a function of three variables: two spatial variables and time.

    1. Digital image processing is a term used to describe the manipulation of image data by a computer.

    2. The process of transforming an image into a set of numbers, which a computer can utilize, is called digitization.

Slide 6/44

    3. Digitization divides an image up into several picture elements called pixels. A pixel is the smallest resolvable unit of an image which the computer handles.

    4. The value of a pixel is referred to as its gray level and can be thought of as the intensity or brightness (or darkness) of the pixel.

    5. The number of different gray levels a pixel can have varies from system to system, and is determined by the hardware that produces or displays the image.

Slide 7/44

    Why do we process images

    Images (and videos) are everywhere. This includes different imaging modalities such as visual, X-ray, ultrasound, etc. Multimedia information will be the wave of the future. Diverse applications in astronomy, biology, geology, geography, medicine, law enforcement, defense, and industrial inspection require the processing of images.

Slide 8/44

Slide 9/44

    4. Remote Sensing

    (a) Land use

    (b) Crops

    (c) Weather Forecasting

    5. Defense

    (a) Target recognition

    (b) Thermal Imaging

    6. Image Storage & Transmission

    (a) JPEG, MPEG, etc.

    (b) Efficiently store an image in a digital camera

    (c) Send an image from Mars to Earth

    7. OCR, Document Image Processing

    8. Prepare for display or printing

    (a) Adjust image size

    (b) Halftoning

    9. Enhance and restore images

    (a) Remove scratches from an old movie

    (b) Improve visibility of tumor in a radiograph

    10. Extract information from images

    (a) Read the ZIP code on a letter

    (b) Measure water pollution from aerial images

    11. Hide information in images (watermarking)

Slide 10/44

    Image Sensors/Input Devices

    1. Film Cameras

    2. Digital Cameras (CCD/CMOS)

    3. Video Cameras

    4. Scanners

    Image Output Devices

    1. Printers

    2. TV/Computer Monitors

    Image Storage

    1. CD-ROM

    2. Hard disk

    Processing Hardware

    1. PCs,

    2. Microprocessors, boards,

    3. Workstations, etc.

    Processing Software: Many commercial and non-commercial packages are available.

Slide 11/44

    Grayscale and Color Images

    1. For a grayscale image, 256 levels or 8 bits/pixel is sufficient for most applications.

    2. For a color image, each component (R, G, B) needs 256 levels or 8 bits/pixel.

    3. Storage for typical images

    (a) 512 × 512, 8-bit grayscale image: 262,144 bytes

    (b) 1024 × 768, 24-bit true color image: 2,359,296 bytes
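
As a quick check (an illustrative sketch, not from the slides), these storage figures follow directly from rows × columns × bits per pixel:

```python
def image_bytes(width, height, bits_per_pixel):
    """Uncompressed storage for an image, in bytes."""
    return width * height * bits_per_pixel // 8

print(image_bytes(512, 512, 8))     # 262144   (8-bit grayscale)
print(image_bytes(1024, 768, 24))   # 2359296  (24-bit true color)
```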

Slide 12/44

    Digital B/W Videos: XR(l, m, n)

    1. l = vertical position

    2. m = horizontal position

    3. n = frame number
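
A hedged sketch (not from the slides; the array sizes are assumptions): a black-and-white video can be stored as a 3-D array indexed by vertical position, horizontal position, and frame number.

```python
import numpy as np

rows, cols, frames = 480, 640, 24                        # illustrative sizes (assumed)
video = np.zeros((rows, cols, frames), dtype=np.uint8)   # XR(l, m, n)

l, m, n = 100, 200, 5
video[l, m, n] = 255   # brightness at row l, column m, in frame n
```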

Slide 13/44

    Grayscale Image

Slide 14/44

    Color Images

    XR(n, m), XG(n, m), XB(n, m)
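
An illustrative sketch (assumed shapes, not from the slides): a color image is three aligned 2-D arrays, one for each of the R, G, and B components.

```python
import numpy as np

M, N = 4, 6
rgb = np.random.randint(0, 256, size=(M, N, 3), dtype=np.uint8)   # a random color image

# The three component images XR(n, m), XG(n, m), XB(n, m):
XR, XG, XB = rgb[:, :, 0], rgb[:, :, 1], rgb[:, :, 2]
print(XR.shape, XG.shape, XB.shape)   # (4, 6) each
```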

Slide 15/44

    F(x, y) = F(m, n), 0 ≤ m ≤ M − 1, 0 ≤ n ≤ N − 1

    A digital image can be written as a matrix

    F = [ x(0, 0)      x(0, 1)      ...   x(0, N−1)
          x(1, 0)      x(1, 1)      ...   x(1, N−1)
          ...          ...          ...   ...
          x(M−1, 0)    x(M−1, 1)    ...   x(M−1, N−1) ]

    Popular Image Size:

    1. N = 64, M = 64 (2^6), # of pixels = 4,096.

    2. N = 512, M = 512 (2^9), # of pixels = 262,144.

    3. N = 1280, M = 1024, # of pixels = 1,310,720.

Slide 16/44

    Image Operations

    1. Point Operations

    2. Local Operations

    3. Global Operations

Slide 17/44

    Image operations can be classified as linear and non-linear operations:

    H is a linear operator if it satisfies the superposition principle:

    H(af + bg) = aH(f) + bH(g)

    for all images f and g and all constants a and b.

    1. Mean filtering: Linear

    2. Median filtering: Non-linear
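
A small numerical check (illustrative, not from the slides): a 3×3 mean filter satisfies the superposition principle H(af + bg) = aH(f) + bH(g), while a 3×3 median filter generally does not.

```python
import numpy as np
from scipy.ndimage import uniform_filter, median_filter

def mean3(x):
    return uniform_filter(x, size=3)   # 3x3 mean filtering (linear)

def med3(x):
    return median_filter(x, size=3)    # 3x3 median filtering (non-linear)

rng = np.random.default_rng(0)
f = rng.random((32, 32))
g = rng.random((32, 32))
a, b = 2.0, -3.0

print(np.allclose(mean3(a * f + b * g), a * mean3(f) + b * mean3(g)))   # True
print(np.allclose(med3(a * f + b * g), a * med3(f) + b * med3(g)))      # typically False
```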

Slide 18/44

Slide 19/44

    Feature Enhancement by Subtraction

Slide 20/44

    A Brief History of Lena (Lenna)

    Anil K Kandangath

    Anyone familiar with digital image processing will surely recognize the image of Lena. While going through some old Usenet discussions, I got to know that Lena has a history worth all the attention that has been paid to her over the years by countless image processing researchers.

    Lena Sjööblom (also spelled Lenna by many publications) was the Playboy playmate in November 1972 and rose to fame in the computer world when researchers at the University of Southern California scanned and digitized her image in June 1973. (Lena herself never knew of her fame until she was interviewed by a computer magazine in Sweden, where she lives with her husband and children.)

    According to the IEEE PCS Newsletter of May/June 2001, they were hurriedly searching for a glossy image which they could scan and use for a conference paper when someone walked in with a copy of Playboy. The engineers tore off the top third of the centerfold and scanned it with a Muirhead wirephoto scanner (a distant cry from the flatbed scanners of today) by wrapping it around the drum of the scanner. (Now you know why the image shows only a small part of the entire picture... discounting, of course, the fact that the complete picture would raise quite a few eyebrows.)

Slide 21/44

    Also, here's the poem dedicated to Lena, written by an anonymous admirer:

    "O dear Lena, your beauty is so vast
    It is hard sometimes to describe it fast.
    I thought the entire world I would impress
    If only your portrait I could compress.
    Alas! First when I tried to use VQ
    I found that your cheeks belong to only you.
    Your silky hair contains a thousand lines
    Hard to match with sums of discrete cosines.
    And for your lips, sensual and tactual
    Thirteen Crays found not the proper fractal.
    And while these setbacks are all quite severe
    I might have fixed them with hacks here or there
    But when wavelets took sparkle from your eyes
    I said, "Skip this stuff. I'll just digitize."

Slide 22/44

    Linear Stretching

    1. Enhance the dynamic range by stretching the original gray levels to the range of 0 to 255.

    2. Example

    (a) The original gray levels are [100, 150].

    (b) The target gray levels are [0, 255].

    (c) The transformation function

    g(f) = ((f − 100)/50) × 255 for 100 ≤ f ≤ 150
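
A minimal sketch of this example (illustrative, not from the slides): map gray levels in [100, 150] linearly onto [0, 255].

```python
import numpy as np

def linear_stretch(f, lo=100, hi=150, out_max=255):
    """Map gray levels in [lo, hi] linearly onto [0, out_max]."""
    f = np.clip(f, lo, hi).astype(np.float64)
    g = (f - lo) / (hi - lo) * out_max
    return g.astype(np.uint8)

img = np.array([[100, 110, 125],
                [140, 150, 150]], dtype=np.uint8)
print(linear_stretch(img))
# [[  0  51 127]
#  [204 255 255]]
```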

Slide 23/44

    Illustration of Linear Stretching

Slide 24/44

    Image/video Processing Methods

    1. Image Enhancement

    2. Image Restoration

    3. Compression

    4. Image reconstruction

    5. Morphological image processing

    6. Feature extraction and recognition, computer vision

Slide 25/44

    Other Image Operations

    Image algebra includes mathematical comparisons, altering values of pixels, thresholding, edge detection, and noise reduction.

    1. Neighborhood averaging is used to avoid extreme fluctuations in gray level from pixel to pixel. It is also a very effective tool for noise reduction.

    2. Image Scaling is a means of reducing or expanding the size of an image using existing image data.

    3. Histogram Equalization is an adjustment of gray scale based on the gray-level histogram. This is effective in enhancing the contrast of an image.

    4. Edge Detection is an operation of measuring and analyzing the features in an image by detecting and enhancing the edges of the features. The most common edge detection method is gradient detection; a small gradient-based sketch follows below.
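
A hedged sketch of gradient-based edge detection (illustrative only; the slides do not name a specific operator): approximate the horizontal and vertical derivatives with finite differences and take the gradient magnitude.

```python
import numpy as np

def gradient_edges(img):
    """Gradient-magnitude edge map using simple finite differences."""
    f = img.astype(np.float64)
    gx = np.zeros_like(f)
    gy = np.zeros_like(f)
    gx[:, :-1] = f[:, 1:] - f[:, :-1]   # horizontal derivative
    gy[:-1, :] = f[1:, :] - f[:-1, :]   # vertical derivative
    return np.hypot(gx, gy)             # gradient magnitude

# Synthetic image: dark left half, bright right half -> one vertical edge.
img = np.zeros((8, 8), dtype=np.uint8)
img[:, 4:] = 200
edges = gradient_edges(img)
print(edges.max(), int(np.argmax(edges[0])))   # strongest response at the boundary column: 200.0 3
```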

Slide 26/44

    5. Image Restoration: Given a noisy image y(m, n),

    y(m, n) = x(m, n) + v(m, n)

    where x(m, n) is the original image and v(m, n) is noise. The objective is to recover x(m, n) from y(m, n).

    Color Restoration
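
One simple illustration of the additive-noise model above (a sketch only; the slides do not prescribe a particular restoration method): add noise v to an image x and estimate x back with 3×3 neighborhood averaging, which an earlier slide lists as an effective noise-reduction tool.

```python
import numpy as np
from scipy.ndimage import uniform_filter

rng = np.random.default_rng(1)
x = np.tile(np.linspace(0, 255, 64), (64, 1))   # a smooth test image
v = rng.normal(0, 20, size=x.shape)             # additive noise
y = x + v                                       # observed image: y = x + v

x_hat = uniform_filter(y, size=3)               # 3x3 neighborhood averaging

print(np.mean((y - x) ** 2))       # error before restoration (about 400)
print(np.mean((x_hat - x) ** 2))   # noticeably smaller error after averaging
```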

Slide 27/44

    Photo Restoration

Slide 28/44

    6. Contrast Enhancement: how do we enhance the contrast of an image?

    1. Low contrast: image values are concentrated in a narrow range (mostly dark, or mostly bright, or mostly medium values).

    2. Contrast enhancement: change the image value distribution to cover a wide range.

    3. The contrast of an image can be revealed by its histogram.

    Histogram: The histogram of an image with L possible gray levels, l = 0, 1, ..., L − 1, is defined as

    P(l) = n_l / n

    where

    n_l is the number of pixels with gray level l,

    n is the total number of pixels in the image.
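
A minimal sketch of this definition (illustrative, not from the slides): count the pixels at each gray level and normalize by the total pixel count.

```python
import numpy as np

def histogram(img, L=256):
    """P(l) = n_l / n for gray levels l = 0, ..., L-1."""
    n_l = np.bincount(img.ravel(), minlength=L)   # pixels at each gray level
    return n_l / img.size                         # normalize by total pixel count n

img = np.array([[0, 0, 1, 2],
                [2, 2, 3, 3]], dtype=np.uint8)
P = histogram(img, L=4)
print(P)         # [0.25  0.125 0.375 0.25 ]
print(P.sum())   # 1.0
```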

Slide 29/44

    Examples of Histograms

Slide 30/44

    Applications

    Astronomy: Hubble Space Telescope: this telescope has limitations in resolution due to atmospheric turbulence.

    An optical problem in a telescope results in a blurred, out-of-focus image. Digital image processing is normally used to recover the desired information from these images.

Slide 31/44

    Medical Imaging: Most advanced medical imaging tools are based on DSP tools. X-ray computerized tomography (X-ray CT) is capable of generating a cross-sectional display of the body. This involves X-ray generation, detection, digitization, processing, and computer image reconstruction. Similarly, NMR-CT (nuclear magnetic resonance).

    MRI

Slide 32/44

    Ultrasound

Slide 33/44

    Biometrics and Security: Biometric recognition refers to the use of distinctive characteristics (biometric identifiers) for automatically recognizing individuals. These characteristics may be

    1. Physiological (e.g., fingerprints, face, retina, iris)

    2. Behavioral (e.g., gait, signature, keystroke)

    Biometric identifiers are actually a combination of physiological and behavioral characteristics, and they should not be exclusively assigned to either class. (For example, speech is determined partly by physiology and partly by the way a person speaks.)

Slide 34/44

    1. Signature Verification

    2. Fingerprint identification

    3. Face recognition

Slide 35/44

    Signature Verification

Slide 36/44

    Fingerprint: In 1684, an English plant morphologist published the first scientific paper reporting his systematic study of the ridge and pore structure in fingerprints.

    A fingerprint image may be classified as:

    (a) Offline: an inked impression of the fingertip on paper is scanned

    (b) Live-scan: optical sensors, capacitive sensors, ultrasound sensors, ...

Slide 37/44

    At the local level, there are different local ridge characteristics. The two most prominent ridge characteristics, called minutiae, are:

    (a) Ridge termination

    (b) Ridge bifurcation

    At the very fine level, intra-ridge details (sweat pores) can be detected. They are very distinctive; however, very high-resolution images are required.

Slide 38/44

    Face Recognition

    Face Recognition Methods

    (a) Template matching using minimum-distance classifiers

    (b) Linear discriminants

    (c) Bayesian approach

Slide 39/44


Slide 40/44

    1. Watermarking: The World Wide Web and the progress in multimedia storage and transmission technology have expanded the possibility of illegally copying and reproducing digital data. Digital watermarking represents a valid solution to this problem, since it makes it possible to identify the source, author, creator, owner, distributor, or authorized consumer of digitized images, video recordings, or audio recordings. A digital watermark is an identification code, permanently embedded into digital data, carrying information pertaining to copyright protection and data authentication.

Slide 41/44


    (a) Copyright protection and authentication

    2. Data hiding

    3. Steganography (secret communication): Steganography is the art and science of communicating in a way which hides the existence of the communication. In contrast to cryptography, where the enemy is allowed to detect, intercept, and modify messages without being able to violate certain security premises guaranteed by a cryptosystem, the goal of steganography is to hide messages inside other harmless messages in a way that does not allow any enemy to even detect that there is a second secret message present [Markus Kuhn, 1995-07-03].

Slide 42/44

    Entertainment

    1. Digital camcorders

    2. HDTV

    3. DVDs: high-quality image/video compression (MPEG-2: about 5-10 million bits/second)

    4. Digital Cinema

    (a) New compression technologies are needed

    i. Consider a 2-hour movie: 1920 × 1080 pixels × 30 bits/pixel × 24 frames/second ≈ 1.5 billion bits/second, i.e., about 1.3 terabytes per 2-hour program (a worked check follows below).
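
A quick arithmetic check of that figure (an illustrative sketch; the frame size, bit depth, and frame rate are taken from the slide):

```python
bits_per_second = 1920 * 1080 * 30 * 24   # pixels/frame x bits/pixel x frames/second
print(bits_per_second)                    # 1492992000, i.e. ~1.5 billion bits/second

seconds = 2 * 60 * 60                     # a 2-hour movie
total_bytes = bits_per_second * seconds / 8
print(total_bytes / 1e12)                 # ~1.34 terabytes per 2-hour program
```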

Slide 43/44

    Compression

    HDTV (high-resolution TV broadcast over the same existing TV channel; requires digital compression technology).

    Video-phone using existing telephone cable. The amount of information to be transmitted for video-phone is much larger than for a speech signal; requires compression technology.

    Streaming video over wireless

    1. Video is high bandwidth data

    2. Wireless, at present, has limited bandwidth

    3. Needs efficient and effective compression

    4. New coding techniques such as MPEG-4 have been developed.

    Satellite broadcasting and Satellite Imaging

    Image Compression Techniques

    1. JPEG 2000 standard is based on wavelets

    2. JPEG (original) is based on the Discrete Cosine Transform (DCT).

Slide 44/44

    An Example of Image Compression