
PREFACE

Every day, innumerable technologies are invented and developed all over the world in many fields. At the same time, many technologies have also ended in failure. One who is born has to die one day; yet during this short span of life, the thirst for new technologies and developments has not been quenched. As a result, many of the latest technologies have been introduced.

This magazine, “INFOLINE”, aims to provide basic, necessary information about the latest technologies developed, and to create awareness in those who read it.

Your comments and valuable suggestions for improvement from students, teachers and friends are warmly welcomed and will be gratefully acknowledged.

Infoline Team


ACKNOWLEDGEMENT

We wish to thank Thiru A.Venkatachalam B.Sc., Correspondent, Kongu Arts and Science College, Erode, and our Management for the support to publish this magazine. Dr. N.Raman M.B.A., M.Com., M.Phil., B.Ed., PGDCA., Ph.D., Principal, Kongu Arts and Science College, Erode, has provided considerable support to us during this effort. We proudly thank our Chief Editor, Staff Advisor, Staff Members and the students of the Department of Computer Technology and Information Technology for their guidance and suggestions in completing this magazine.


INFOLINE

TECHNOLOGY NAVIGATOR

Executive Committee

Chief Patron : Thiru A.Venkatachalam B.Sc.,

Patron : Dr. N.Raman M.B.A., M.Com., M.Phil., B.Ed., PGDCA.,Ph.D.,

Editor In Chief : S.Muruganantham M.Sc., M.Phil.,

Staff Advisor:

M.G.Annapoorani M.Sc.,

Assistant Professor, Department of CT & IT.

Staff Editor:

C.Indrani M.C.A., M.Phil.,

Assistant Professor, Department of CT & IT.

Student Editors:

Ramya.R III-B.Sc(CT)

Rameshkumar.R III-B.Sc(CT)

Ramya.B III-B.Sc(CT)

Kasthuri.H III-B.Sc(IT)

Kiruthika.S.M III-B.Sc(IT)


Organizing Members:

Senthilkumar.M II-B.Sc(CT)

Sathish.K II-B.Sc(CT)

Saranya.K II-B.Sc(CT)

Sasikumar.S II-B.Sc(CT)

Rahul Babu.B II-B.Sc(IT)

Sathiya.P II-B.Sc(IT)

Senthil kumar.V II-B.Sc(IT)

Shanmugam.P II-B.Sc(IT)


CONTENTS

Preface

Acknowledgment

Executive Committee

FREE SPACE OPTICAL COMMUNICATION

PON TOPOLOGIES

IPOD TOUCH

TURANOR PLANET SOLAR

WEB SEARCH ENGINE

INTEL PUMPS $30 MILLION INTO CLOUD'S FUTURE

GRID COMPUTING

WINDOWS OS SECURITY

WINDOWS HOME SERVER 2011


ARTICLES



Free Space Optical Communication

An 8-beam free space optics laser link, rated for 1 Gbit/s at a distance of approximately 2 km. The receptor is the large disc in the middle; the transmitters are the smaller ones. At the top and to the right is a monocular for assisting the alignment of the two heads.

Free-space optical communication (FSO) is an optical communication technology that uses light propagating in free space to transmit data for telecommunications or computer networking. "Free space" means air, outer space, vacuum, or something similar. This contrasts with using solids such as optical fiber cable or an optical transmission line. The technology is useful where physical connections are impractical due to high costs or other considerations.

History

Optical communications, in various forms, have been used for thousands of years. The Ancient Greeks polished their shields to send signals during battle. In the modern era, semaphores and wireless solar telegraphs called heliographs were developed, using coded signals to communicate with their recipients. In 1880 Alexander Graham Bell and his assistant Charles Sumner Tainter created the Photophone at Bell's newly established Volta Laboratory in Washington, DC. Bell considered it his most important invention. The device allowed for the transmission of sound on a beam of light. On June 3, 1880, Bell conducted the world's first wireless telephone transmission between two buildings, some 213 meters apart. Its first practical use came in military communication systems many decades later. Carl Zeiss Jena developed the Lichtsprechgerät 80 (direct translation: light speaking device) that the German army used in their World War II anti-aircraft defense units. The invention of lasers in the 1960s revolutionized free space optics. Military organizations were particularly interested and boosted their development. However, the technology lost market momentum when the installation of optical fiber networks for civilian uses was at its peak. Many simple and inexpensive consumer remote controls use low-speed communication with infrared (IR) light; these are known as consumer IR technologies.

Usage and technologies

Free-space point-to-point optical links can be implemented using infrared laser light, although low-data-rate communication over short distances is possible using LEDs. Infrared Data Association (IrDA) technology is a very simple form of free-space optical communication. Free-space optics is also used for communications between spacecraft. The maximum range for terrestrial links is on the order of 2 to 3 km (1.2 to 1.9 mi), but the stability and quality of the link are highly dependent on atmospheric factors such as rain, fog, dust and heat. Amateur radio operators have achieved significantly greater distances using incoherent sources of light from high-intensity LEDs; one reported 173 miles (278 km) in 2007. However, physical limitations of the equipment used limited bandwidths to about 4 kHz.

The high sensitivities required of the detector to cover such distances made the internal capacitance of the photodiode used a dominant factor in the high-impedance amplifier which followed it, thus naturally forming a low-pass filter with a cut-off frequency in the 4 kHz range. In outer space, the communication range of free-space optical communication is currently on the order of several thousand kilometers, but it has the potential to bridge interplanetary distances of millions of kilometers, using optical telescopes as beam expanders. The distance records for optical communications involve detection and emission of laser light by space probes. A two-way distance record for communication was set by the Mercury laser altimeter instrument aboard the MESSENGER spacecraft. This infrared diode neodymium laser, designed as a laser altimeter for a Mercury orbit mission, was able to communicate across a distance of 15 million miles (24 million km), as the craft neared Earth on a fly-by in May 2005. The previous record had been set with a one-way detection of laser light from Earth, by the Galileo probe, as two ground-based lasers were seen from 6 million km by the outbound probe, in 1992.
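The ~4 kHz figure follows from the standard RC low-pass cut-off formula, f_c = 1/(2πRC). A minimal sketch in Python, with resistance and capacitance values assumed purely for illustration (they are not figures from the article):

    import math

    R = 400e3    # assumed amplifier input resistance, ohms (illustrative)
    C = 100e-12  # assumed photodiode junction capacitance, farads (illustrative)

    # Cut-off of the RC low-pass filter formed by the photodiode capacitance
    # and the high-impedance amplifier input: f_c = 1 / (2 * pi * R * C)
    f_c = 1.0 / (2.0 * math.pi * R * C)
    print(f"cut-off frequency: {f_c:.0f} Hz")  # about 3979 Hz, i.e. the ~4 kHz range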

Secure free-space optical communications have been proposed using a laser N-slit interferometer where the laser signal takes the form of an interferometric pattern. Any attempt to intercept the signal causes the collapse of the interferometric pattern. This technique has been demonstrated to work over propagation distances of practical interest and, in principle, it could be applied over large distances in space.

Applications

Typical usage scenarios are:

• LAN-to-LAN connections on campuses at Fast Ethernet or Gigabit Ethernet speeds

• LAN-to-LAN connections in a city; a metropolitan area network

• To cross a public road or other barriers which the sender and receiver do not own

• Speedy service delivery of high-bandwidth access to optical fiber networks

• Converged voice-data connection

• Temporary network installation (for events or other purposes)

• Re-establishing a high-speed connection quickly (disaster recovery)

• As an alternative or upgrade add-on to existing wireless technologies

• As a safety add-on for important fiber connections (redundancy)

• For communications between spacecraft, including elements of a satellite constellation

• For inter- and intra-chip communication

The light beam can be very narrow, which makes FSO hard to intercept, improving security. In any case, it is comparatively easy to encrypt any data traveling across the FSO connection for additional security. FSO provides vastly improved electromagnetic interference (EMI) behavior compared to using microwaves.

Advantages

RONJA is a free implementation of FSO using high-intensity LEDs.

• Ease of deployment

• License-free long-range operation (in contrast with radio communication)

• High bit rates

• Low bit error rates

• Immunity to electromagnetic interference

• Full duplex operation

• Protocol transparency

• Very secure due to the high directionality and narrowness of the beam(s)

• No Fresnel zone necessary

Disadvantages

For terrestrial applications, the principal limiting factors are:

• Beam dispersion

• Atmospheric absorption

• Rain

• Fog (10 to ~100 dB/km attenuation)

• Snow

• Scintillation

• Background light

• Shadowing

• Pointing stability in wind

• Pollution / smog

• If the sun goes exactly behind the transmitter, it can swamp the signal

These factors cause an attenuated receiver signal and lead to a higher bit error ratio (BER). To overcome these issues, vendors have found solutions such as multi-beam or multi-path architectures, which use more than one sender and more than one receiver. Some state-of-the-art devices also have a larger fade margin (extra power, reserved for rain, smog and fog).

To keep an eye-safe environment, good FSO systems have a limited laser power density and support laser classes 1 or 1M. Atmospheric and fog attenuation, which are exponential in nature, limit the practical range of FSO devices to several kilometres.
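To make the attenuation point concrete, here is a rough link-budget sketch. The fade margin and the dB/km figures are illustrative assumptions (the fog values echo the 10 to ~100 dB/km span quoted above):

    # Fog attenuation is quoted in dB/km, so total path loss grows linearly in dB
    # (exponentially in power) with distance; the link fails once the loss
    # exceeds the fade margin held in reserve.
    def max_range_km(fade_margin_db, attenuation_db_per_km):
        """Distance at which atmospheric loss uses up the link's fade margin."""
        return fade_margin_db / attenuation_db_per_km

    margin_db = 30.0  # assumed fade margin (extra transmit power in reserve)
    for label, alpha in [("clear air", 0.5), ("light fog", 10.0), ("dense fog", 100.0)]:
        print(f"{label:>9}: {max_range_km(margin_db, alpha):6.2f} km")

In clear air this simple budget is optimistic, because beam dispersion rather than absorption becomes the limiting factor; that is why practical terrestrial links stay in the 2 to 3 km band mentioned earlier.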

By

SARANYA.G

III–B.Sc(IT)

PON Topologies

There are several topologies suitable for the access network: tree, ring, or bus. A PON can also be deployed in a redundant configuration as a double ring or double tree; or redundancy may be added only to a part of the PON, say the trunk of the tree. For the rest of this article, we will focus our attention on the tree topology; however, most of the conclusions made are equally relevant to other topologies.

All transmissions in a PON are performed between an Optical Line Terminal (OLT) and Optical Network Units (ONUs). Therefore, in the downstream direction (from OLT to ONUs), a PON is a point-to-multipoint (P2MP) network, and in the upstream direction it is a multipoint-to-point (MP2P) network. The OLT resides in the local exchange (central office), connecting the optical access network to an IP, ATM, or SONET backbone. The ONU is located either at the curb (FTTC solution) or at the end-user location (FTTH, FTTB solutions), and provides broadband voice, data, and video services.

The advantages of using PONs in subscriber access networks are numerous:

1. PONs allow for long reach between central offices and customer premises, operating at distances over 20 km.

2. PONs minimize fiber deployment in both the local exchange office and the local loop.

3. PONs provide higher bandwidth due to deeper fiber penetration, offering gigabit-per-second solutions.

4. Operating in the downstream as a broadcast network, PONs allow for video broadcasting as either IP video or analog video using a separate wavelength overlay.

5. PONs eliminate the necessity to install active multiplexers at splitting locations, thus relieving network operators of maintaining and powering active equipment in the field.

6. Being optically transparent end to end, PONs allow upgrades to higher bit rates or additional wavelengths.

Multiple Access

One possible way of separating the channels is to use wavelength division multiplexing (WDM), in which each ONU operates at a different wavelength. While a simple solution, it remains cost-prohibitive for an access network. A WDM solution would require either a tunable receiver or a receiver array at the OLT to receive multiple channels. An even more serious problem for network operators would be wavelength-specific ONU inventory: instead of having just one type of ONU, there would be multiple types of ONUs based on their laser wavelength. It would also be more problematic for an unqualified user to replace a defective ONU. Using tunable lasers in ONUs is too expensive at the current state of technology. For these reasons, a WDM PON is not an attractive solution in today's environment.

By

M.VIGNESH

III-B.Sc(IT)

iPod Touch

The iPod Touch (stylized and marketed as lowercase iPod touch; also colloquially referred to as the iTouch, by analogy to the iPhone) is a portable media player, personal digital assistant, handheld game console, and Wi-Fi mobile platform designed and marketed by Apple. The iPod Touch adds the multi-touch graphical user interface to the iPod line. It is the first iPod with wireless access to the iTunes Store, and also has access to Apple's App Store, enabling content to be purchased and downloaded directly on the device. As of March 2011, Apple had sold over 60 million iPod Touch units.


Software

The iPod Touch runs iOS. The first major update after the initial release was iPhone OS 2.0. This update introduced the App Store, which allowed third-party applications for the first time. iPhone OS 2.0 debuted June 29, 2008. iPhone users received the update for free, while iPod Touch users had to pay for it. The second major update to the operating system, iPhone OS 3.0, was released June 17, 2009. It added features such as cut, copy, and paste; data tethering; and push notification support. As with the previous major release, iPhone users received the update for free, while iPod Touch users had to pay for it.

iOS 4.0 was made available to the public on June 21, 2010. It was the first major iOS release to drop support for some devices; the first-generation iPod Touch and the original iPhone are not supported in iOS 4.0. The iPhone 3G and second-generation iPod Touch had limited functionality under iOS 4.0, while the iPhone 4, iPhone 3GS, third-generation iPod Touch, and fourth-generation iPod Touch had full functionality. The major features introduced in iOS 4.0 included iBooks, FaceTime, and multitasking. iOS 5.0 was previewed to the public on June 6, 2011, and is expected to be released in the fall of 2011.

The iPod Touch and the iPhone share essentially the same hardware and run the same iOS operating system. The iPod Touch lacks some of the iPhone's features and associated apps, such as access to cellular networks, GPS navigation and the built-in compass. Older models also lacked speakers and cameras. Although the SMS and Phone apps are included in the iPod Touch software, they are disabled and therefore not visible. Also, the sleep/wake button was on the opposite side until the release of the fourth-generation iPod Touch. Since it does not need GPS and cellular components, the iPod Touch is slimmer and lighter than the iPhone. Steve Jobs once referred to the iPod Touch as "training wheels for the iPhone". Another major difference is the quality of the back camera: while the iPod Touch, like the iPhone, allows for HD video recording, the iPhone's camera delivers higher-quality still photos than the iPod's.

Requirements

• iTunes 10 or later

• Mac OS X 10.5 or later, or Windows XP Home or Professional with Service Pack 3 or later

Synchronization

As supplied new, the iPod Touch must be connected to a Macintosh or Windows computer. There is no official Linux support. On either OS, the iPod Touch must be connected through a USB port. This will charge the iPod Touch and sync music, videos, pictures and more. Special cables that plug into a wall can also be bought separately but can only be used to charge the iPod Touch.

Battery charging

Starting with the second generation, the iPod Touch can only be charged from the 5 V pin of the dock connector, while most previous iPod models (including the original iPod Touch) could also be charged from the 12 V pin using FireWire power. This change dropped support for charging in vehicles equipped with a FireWire-based iPod connection. Most aftermarket manufacturers of such equipment offer cables and/or adapters which convert the vehicle's 12 V to 5 V.

Hacks

Shortly after the iPhone (and then also the iPod Touch) was released, hackers were able to "jailbreak" the device through a TIFF exploit. The application installed by this exploit enabled the user to download a selection of unofficial third-party programs. Jailbreaking the iPod Touch was the only way to get third-party programs when running 1.1.x OSes. These third-party programs could use additional functionality not supported by Apple (such as enabling multitasking, applying themes to the home screen, or enabling a battery percentage indicator). All officially released versions of iOS from 3.1.2 through 4.3.3 can be jailbroken (some with bugs), but version 4.3.1 could not at the time it was released. Recently, the 4.3.x firmware has been jailbroken untethered by @i0nic. Servicing an iPod Touch after jailbreaking or other modifications made by unofficial means is not covered by Apple's warranty (however, the jailbreaking process is easily undone by performing a restore through iTunes). Today every firmware of the iPod Touch, from 1.1.1 to 4.3.3 and even 5.0b7, can be jailbroken.

By

MADHAN KUMAR.M

II-B.sc (CT)

Tûranor PlanetSolar

Tûranor PlanetSolar, also known under the project name PlanetSolar, is an entirely solar-powered boat that was launched on 31 March 2010. It was built by Knierim Yachtbau in Kiel, Germany, and was designed by LOMOcean Design, formerly known as Craig Loomes Design Group Ltd. It is the largest solar-powered boat in the world. Similarly to Earthrace, another boat designed by LOMOcean Design, Tûranor PlanetSolar is planning to set a round-the-world record that will promote the use of sustainable energy. Instead of using biodiesel like Earthrace, Tûranor PlanetSolar plans to use solar power exclusively; in doing so, it will become the first boat ever to circumnavigate the world using solar power alone.[6] To fulfill this challenge, the boat is covered in over 500 square meters of solar panels rated at 93 kW, which in turn connect to one of the two electric motors in each hull. Its hull is capable of hosting 200 persons, and the shape of the boat allows it to reach speeds of up to fourteen knots. The boat's hull has been model-tested in wind tunnels and tank-tested to determine its hydrodynamics and aerodynamics. This 31-meter-long boat has been designed to be used as a luxury yacht after the record attempt is finished. On 27 September 2010, Tûranor PlanetSolar started its journey around the world in Monaco. With this expedition, the initiators of the project would like to focus public awareness on the importance of renewable energies for environmental protection.

The crew of six will circumnavigate the globe solely with the aid of solar power. Captain of the expedition is Frenchman Patrick Marchesseau. Other participants are Christian Ochsenbein (Bern, Switzerland) and Jens Langwasser (Kiel, Germany), as well as project initiator Raphaël Domjan (Yverdon-les-Bains, Switzerland). On the first leg across the Atlantic Ocean, technician Daniel Stahl (Kiel, Germany) and first mate Mikaela von Koskull (Finland) were part of the crew. On 27 November 2010 the solar boat reached Miami. A significant stopover was Cancún, during the United Nations World Climate Conference. At the midpoint of the world tour, the French-Canadian Captain Erwann Le Rouzic took over in New Caledonia in mid-May 2011, sharing the master's responsibility for the further circumnavigation with Captain Patrick Marchesseau. At the end of May 2011, the world's largest solar boat docked at Brisbane, where in an official ceremony the "PlanetSolar Relay for Hope" was launched, a global relay in which children and young people can take part and present their visions and hopes for a solar-energy-driven world through essays, videos, music, drawings and models. On the first solar boat expedition around the globe, the TÛRANOR PlanetSolar has already set two records: fastest crossing of the Atlantic Ocean by a solar boat, and longest distance ever covered by a solar electric vehicle. The current routing around the globe (subject to favourable weather and nautical conditions) foresees stopovers in several port cities to inform the public about the importance of sustainable and renewable energies. The boat is registered in Switzerland and was financed by a German entrepreneur. Construction cost was €12.5 million. The name Tûranor, derived from J.R.R. Tolkien's novel The Lord of the Rings, translates to "The Power of the Sun".

PlanetSolar: The sun-powered super yacht

World's largest solar boat, PlanetSolar, will silently and cleanly carry two men around the globe.

The PlanetSolar team unveiled its massive boat this week. To grasp the scale of this super yacht, compare it to the forklift on the far right or to the person working behind the windshield. This green leviathan is the world's largest solar-powered seacraft. Weighing in at 60 tons, the PlanetSolar measures 102 feet long, about 50 feet wide, and 24 feet tall. For a sense of scale, peek into its front window, pictured above, and try to spot the doll-like man working inside. Really, PlanetSolar's jumbo size is simply to accommodate the 5,300 square feet of sun-soaking panels that run along its topside. The solar array pulls in 103 kW, five times more than the boat needs to run at its average speed of 9 mph. That's not exactly jetpack speed, but PlanetSolar aims for the long haul. The boat will lift anchor in Europe around April 2011 and attempt to circle the globe, fueled by nothing but solar rays. Unlike the almost absurdly decadent Oculus and Infinitas super yachts, the interior of this boat leans toward the Spartan. Only two men will make the worldwide voyage. "Today, the boat is the most used means of transport of goods," the team writes. "It represents single-handedly almost 1.4 billion tons of carbon dioxide (in 2008), that is 6% of total carbon dioxide emissions and twice as much as air transport." Thrill-seeker Raphaël Domjan will skipper the ship. And he picked an excellent adventure buddy: Gérard d'Aboville, the first man to row across the entire Atlantic Ocean. Along the cruise from New York to San Francisco to Abu Dhabi, the world tour will share a message of environmental stewardship.

A Milestone In The Progress Of Solar Mobility

The launch of the largest solar yacht in the world, the TÛRANOR PlanetSolar, is a powerful symbol for the advancement of solar shipping. The TÛRANOR PlanetSolar, with its PV modules covering approx. 500 m², can navigate up to three days even without exposure to sunlight. The boat's task is to demonstrate that motorised shipping can work without fuel.

The long-term performance of the TÛRANOR PlanetSolar is to be tested for the first time in a circumnavigation of the globe.


The name TÛRANOR is derived from the Lord of the Rings Saga of J.R.R. Tolkien and translates into "The Power of the Sun".

By

DHIVYA.N

III-B.Sc(IT)

Web Search Engine

Search engine market share in the US, as of 2008.

A web search engine is designed to search for information on the World Wide Web and FTP servers. The search results are generally presented in a list and are often called hits. The results may consist of web pages, images, and other types of files. Some search engines also mine data available in databases or open directories. Unlike web directories, which are maintained by human editors, search engines operate algorithmically, or by a mixture of algorithmic and human input.

History

During the early development of the web, there was a list of webservers edited by Tim Berners-Lee and hosted on the CERN webserver. One historical snapshot from 1992 remains. As more webservers went online, the central list could not keep up. On the NCSA site, new servers were announced under the title "What's New!" The very first tool used for searching on the Internet was Archie.[4] The name stands for "archive" without the "v". It was created in 1990 by Alan Emtage, Bill Heelan and J. Peter Deutsch, computer science students at McGill University in Montreal. The program downloaded the directory listings of all the files located on public anonymous FTP (File Transfer Protocol) sites, creating a searchable database of file names; however, Archie did not index the contents of these sites, since the amount of data was so limited it could be readily searched manually. The rise of Gopher (created in 1991 by Mark McCahill at the University of Minnesota) led to two new search programs, Veronica and Jughead. Like Archie, they searched the file names and titles stored in Gopher index systems. Veronica (Very Easy Rodent-Oriented Net-wide Index to Computerized Archives) provided a keyword search of most Gopher menu titles in the entire Gopher listings. Jughead (Jonzy's Universal Gopher Hierarchy Excavation And Display) was a tool for obtaining menu information from specific Gopher servers. While the name of the search engine "Archie" was not a reference to the Archie comic book series, "Veronica" and "Jughead" are characters in the series, thus referencing their predecessor.

In the summer of 1993, no search engine existed yet for the web, though numerous specialized catalogues were maintained by hand. Oscar Nierstrasz at the University of Geneva wrote a series of Perl scripts that would periodically mirror these pages and rewrite them into a standard format, which formed the basis for W3Catalog, the web's first primitive search engine, released on September 2, 1993. In June 1993, Matthew Gray, then at MIT, produced what was probably the first web robot, the Perl-based World Wide Web Wanderer, and used it to generate an index called 'Wandex'. The purpose of the Wanderer was to measure the size of the World Wide Web, which it did until late 1995. The web's second search engine, Aliweb, appeared in November 1993. Aliweb did not use a web robot, but instead depended on being notified by website administrators of the existence at each site of an index file in a particular format.

JumpStation (released in December 1993) used a web robot to find web pages and to build its index, and used a web form as the interface to its query program. It was thus the first WWW resource-discovery tool to combine the three essential features of a web search engine (crawling, indexing, and searching) as described below.

Because of the limited resources available on the platform on which it ran, its indexing and hence searching were limited to the titles and headings found in the web pages the crawler encountered. One of the first "full text" crawler-based search engines was WebCrawler, which came out in 1994. Unlike its predecessors, it let users search for any word in any webpage, which has become the standard for all major search engines since.

It was also the first one to be widely known by the public. Also in 1994, Lycos (which started at Carnegie Mellon University) was launched and became a major commercial endeavor. Soon after, many search engines appeared and vied for popularity. These included Magellan (search engine), Excite, Infoseek, Inktomi, Northern Light, and AltaVista. Yahoo! was among the most popular ways for people to find web pages of interest, but its search function operated on its web directory, rather than full-text copies of web pages.

Information seekers could also browse the directory instead of doing a keyword-based search. In 1996, Netscape was looking to give a single search engine an exclusive deal to be the featured search engine on Netscape's web browser. There was so much interest that instead a deal was struck with Netscape by five of the major search engines, where for $5 million per year each search engine would be in rotation on the Netscape search engine page.

The five engines were Yahoo!, Magellan, Lycos, Infoseek, and Excite. Search engines were also known as some of the brightest stars in the Internet investing frenzy that occurred in the late 1990s. Several companies entered the market spectacularly, receiving record gains during their initial public offerings.

Some have taken down their public search engine, and are marketing enterprise-only editions, such as Northern Light. Many search engine companies were caught up in the dot-com bubble, a speculation-driven market boom that peaked in 1999 and ended in 2001. Around 2000, Google's search engine rose to prominence.

The company achieved better results for many searches with an innovation called PageRank. This iterative algorithm ranks web pages based on the number and PageRank of other web sites and pages that link there, on the premise that good or desirable pages are linked to more than others. Google also maintained a minimalist interface to its search engine.
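As a rough illustration of the iteration just described, here is a toy PageRank computation. The three-page link graph and the damping factor are assumptions for illustration only, not anything from the article:

    # Toy PageRank power iteration over a tiny link graph (illustrative only).
    links = {
        "a": ["b", "c"],   # page "a" links to pages "b" and "c"
        "b": ["c"],
        "c": ["a"],
    }
    damping = 0.85                       # commonly used damping factor
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}

    for _ in range(50):                  # iterate until the ranks settle
        new_rank = {p: (1.0 - damping) / len(pages) for p in pages}
        for page, outgoing in links.items():
            share = damping * rank[page] / len(outgoing)
            for target in outgoing:      # each page passes rank to pages it links to
                new_rank[target] += share
        rank = new_rank

    print({p: round(rank[p], 3) for p in pages})

Pages with more (and better-ranked) inbound links end up with higher scores, which is exactly the premise stated above.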

In contrast, many of its competitors embedded a search engine in a web portal. By 2000, Yahoo! was providing search services based on Inktomi's search engine. Yahoo! acquired Inktomi in 2002, and Overture (which owned AlltheWeb and AltaVista) in 2003.

Yahoo! switched to Google's search engine until 2004, when it launched its own search engine based on the combined technologies of its acquisitions. Microsoft first launched MSN Search in the fall of 1998 using search results from Inktomi. In early 1999 the site began to display listings from Looksmart blended with results from Inktomi, except for a short time in 1999 when results from AltaVista were used instead.

In 2004, Microsoft began a transition to its own search technology, powered by its own web crawler (called msnbot). Microsoft's rebranded search engine, Bing, was launched on June 1, 2009. On July 29, 2009, Yahoo! and Microsoft finalized a deal in which Yahoo! Search would be powered by Microsoft Bing technology.

How web search engines work

High-level architecture of a standard Web crawler

A search engine operates in the following order:

1. Web crawling
2. Indexing
3. Searching

Web search engines work by storing information about many web pages, which they retrieve from the HTML itself. These pages are retrieved by a Web crawler (sometimes also known as a spider) — an automated Web browser which follows every link on the site. Exclusions can be made by the use of robots.txt. The contents of each page are then analyzed to determine how it should be indexed (for example, words are extracted from the titles, headings, or special fields called meta tags). Data about web pages are stored in an index database for use in later queries.

A query can be a single word. The purpose of an index is to allow information to be found as quickly as possible. Some search engines, such as Google, store all or part of the source page (referred to as a cache) as well as information about the web pages, whereas others, such as AltaVista, store every word of every page they find. The cached page always holds the actual search text, since it is the one that was actually indexed, so it can be very useful when the content of the current page has been updated and the search terms are no longer in it.

This problem might be considered a mild form of linkrot, and Google's handling of it increases usability by satisfying user expectations that the search terms will be on the returned webpage. This satisfies the principle of least astonishment, since the user normally expects the search terms to be on the returned pages. Increased search relevance makes these cached pages very useful, even beyond the fact that they may contain data that may no longer be available elsewhere.

When a user enters a query into a search engine (typically by using key words), the engine examines its index and provides a listing of best-matching web pages according to its criteria, usually with a short summary containing the document's title and sometimes parts of the text. The index is built from the information stored with the data and the method by which the information is indexed. Unfortunately, there are currently no known public search engines that allow documents to be searched by date. Most search engines


support the use of the boolean operators AND, OR and NOT to further specify the search query. Boolean operators are for literal searches that allow the user to refine and extend the terms of the search.

The engine looks for the words or phrases exactly as entered. Some search engines provide an advanced feature called proximity search, which allows users to define the distance between keywords. There is also concept-based searching, where the research involves using statistical analysis on pages containing the words or phrases you search for. As well, natural language queries allow the user to type a question in the same form one would ask it to a human; an example of such a site is ask.com. The usefulness of a search engine depends on the relevance of the result set it gives back. While there may be millions of web pages that include a particular word or phrase, some pages may be more relevant, popular, or authoritative than others. Most search engines employ methods to rank the results to provide the "best" results first.

How a search engine decides which pages are the best matches, and what order the results should be shown in, varies widely from one engine to another. The methods also change over time as Internet usage changes and new techniques evolve. There are two main types of search engine that have evolved: one is a system of predefined and hierarchically ordered keywords that humans have programmed extensively. The other is a system that generates an "inverted index" by analyzing texts it locates. This second form relies much more heavily on the computer itself to do the bulk of the work.
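A minimal sketch of the inverted-index approach just described, including the Boolean AND lookup discussed above. The documents and the helper name are invented for illustration:

    # Build a tiny inverted index: each word maps to the set of documents containing it.
    docs = {
        1: "free space optical communication",
        2: "optical fiber communication networks",
        3: "web search engine basics",
    }

    index = {}
    for doc_id, text in docs.items():
        for word in text.split():
            index.setdefault(word, set()).add(doc_id)

    def search_and(*terms):
        """Boolean AND query: only documents containing every term match."""
        sets = [index.get(term, set()) for term in terms]
        return set.intersection(*sets) if sets else set()

    print(search_and("optical", "communication"))   # -> {1, 2}

Real engines add ranking, stemming and positional data on top, but the core lookup is this intersection of per-word document sets.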

Most Web search engines are commercial ventures supported by advertising revenue and, as a result, some employ the practice of allowing advertisers to pay money to have their listings ranked higher in search results. Those search engines which do not accept money for their search engine results make money by running search-related ads alongside the regular search engine results. The search engines make money every time someone clicks on one of these ads.

By

PERIYASAMY.M

III-B.Sc(IT)

Intel Pumps $30 Million Into Cloud's Future

Intel Labs is pumping the tires for cloud computing with a $30 million investment in a pair of new Intel Science and Technology Centers (ISTC) at Carnegie Mellon University that will focus on cloud computing and embedded computing research. The $30 million is part of Intel's five-year, $100 million program launched to accelerate innovation and increase university research. The new ISTCs, Intel said, join the already announced centers for visual and secure computing.

"These new ISTCs are expected to open amazing possibilities," said Justin Rattner, CTO, Intel, in a statement. "Imagine, for example, future cars equipped with embedded sensors and microprocessors to constantly collect and analyze traffic and weather data. That information could be shared and analyzed in the cloud so that drivers could be provided with suggestions for quicker and safer routes."

Cloud and embedded computing represent two major growth areas for Intel. In its second-quarter earnings call last month, Intel said its Data Center Group sales jumped 15 percent year-over-year, with cloud computing and enterprise servers leading the way. And Intel's Embedded & Communications Group, which includes processors like the Xeon and Atom chips that power portable and other devices, leapt a whopping 25 percent.

The new ISTCs will also build upon Intel's vision for cloud computing, dubbed Intel Cloud 2015. The Cloud 2015 vision centers around three key elements: a world of interoperable, federated clouds; automated movement of software applications and resources; and PC- and device-savvy client-aware clouds that know what processing should take place in the cloud or on a mobile device such as a laptop, smartphone or tablet. Intel said the ISTCs will add new ideas from academic researchers to extend Intel's existing cloud computing initiatives. The center will combine researchers from Carnegie Mellon University, Georgia Institute of Technology, University of California Berkeley, Princeton University, and Intel, who will explore cloud-impacting technologies like built-in application optimization; efficient and effective support of big-data analytics on large amounts of online data; and making the cloud more distributed and localized by extending cloud capabilities to the network edge and to client devices.

"In the future, these capabilities could enable a digital personal handler via a device wired into your glasses that sees what you see, to constantly pull data from the cloud and whisper information to you during the day -- telling you who people are, where to buy an item you just saw, or how to adjust your plans when something new comes up," Intel said.

A key area of research at the ISTCs will be to make it easier for devices to collect, analyze and act on data from sensors and online databases. For example, in cars, data could be used to customize entertainment options for specific passengers while also offering more tailored recommendations while traveling. "With the growing popularity of mobile real-time and personalized technology, there is a corresponding rise in demand for specialized embedded computing systems to support a broad range of new applications -- including many not yet envisioned," Intel said. On the embedded computing side, the ISTC will comprise leading researchers from Carnegie Mellon University, Cornell University, University of Illinois at Urbana-Champaign, University of Pennsylvania, Pennsylvania State University, Georgia Institute of Technology, the University of California at Berkeley and Intel, forming a collaborative community to drive research that can transform experiences in the home, in cars and in retail environments in the future.

By

BABU.R

III–B.Sc(CT)

Grid Computing

History

The term grid computing originated in the early 1990s as a metaphor for making computer power as easy to access as an electric power grid, in Ian Foster's and Carl Kesselman's seminal work, "The Grid: Blueprint for a New Computing Infrastructure" (2004). CPU scavenging and volunteer computing were popularized beginning in 1997 by distributed.net, and later in 1999 by SETI@home, to harness the power of networked PCs worldwide in order to solve CPU-intensive research problems.

The ideas of the grid (including those from distributed computing, object-oriented programming, and Web services) were brought together by Ian Foster, Carl Kesselman, and Steve Tuecke, widely regarded as the "fathers of the grid". They led the effort to create the Globus Toolkit, incorporating not just computation management but also storage management, security provisioning, data movement, monitoring, and a toolkit for developing additional services based on the same infrastructure, including agreement negotiation, notification mechanisms, trigger services, and information aggregation. While the Globus Toolkit remains the de facto standard for building grid solutions, a number of other tools have been built that answer some subset of the services needed to create an enterprise or global grid. In 2007 the term cloud computing came into popularity, which is conceptually similar to the canonical Foster definition of grid computing (in terms of computing resources being consumed as electricity is from the power grid). Indeed, grid computing is often (but not always) associated with the delivery of cloud computing systems, as exemplified by the AppLogic system from 3tera.

Grid computing (or the use of a computational grid) is applying the resources of many computers in a network to a single problem at the same time, usually a scientific or technical problem that requires a great number of computer processing cycles or access to large amounts of data. A well-known example of grid computing in the public domain is the ongoing SETI@home (Search for Extraterrestrial Intelligence) project, in which thousands of people share the unused processor cycles of their PCs in the vast search for signs of "rational" signals from outer space. According to John Patrick, IBM's vice-president for Internet strategies, "the next big thing will be grid computing."

Grid computing requires the use of software that can divide and farm out pieces of a program to as many as several thousand computers. Grid computing can be thought of as distributed, large-scale cluster computing and as a form of network-distributed parallel processing. It can be confined to the network of computer workstations within a corporation, or it can be a public collaboration (in which case it is also sometimes known as a form of peer-to-peer computing).

Grid computing is a term referring to the combination of computer resources from multiple administrative domains to reach a common goal. The grid can be thought of as a distributed system with non-interactive workloads that involve a large number of files. What distinguishes grid computing from conventional high performance computing systems such as cluster computing is that grids tend to be more loosely coupled, heterogeneous, and geographically dispersed. Although a grid can be dedicated to a specialized application, it is more common that a single grid will be used for a variety of different purposes.

Grids are often constructed with the aid of general-purpose grid software libraries known as middleware. Grid size can vary by a considerable amount. Grids are a form of distributed computing whereby a “super virtual computer” is composed of many networked, loosely coupled computers acting together to perform very large tasks. Furthermore, “distributed” or “grid” computing, in general, is a special type of parallel computing that relies on complete computers (with onboard CPUs, storage, power supplies, network interfaces, etc.) connected to a network (private, public or the Internet) by a conventional network interface, such as Ethernet. This is in contrast to the traditional notion of a supercomputer, which has many processors connected by a local high-speed computer bus.


Overview

Grid computing combines computers from multiple administrative domains to reach a common goal, to solve a single task, and may then disappear just as quickly. One of the main strategies of grid computing is to use middleware to divide and apportion pieces of a program among several computers, sometimes up to many thousands. Grid computing involves computation in a distributed fashion, which may also involve the aggregation of large-scale cluster-computing-based systems. The size of a grid may vary from small—confined to a network of computer workstations within a corporation, for example—to large, public collaborations across many companies and networks. "The notion of a confined grid may also be known as an intra-nodes cooperation whilst the notion of a larger, wider grid may thus refer to an inter-nodes cooperation". Grids are a form of distributed computing whereby a “super virtual computer” is composed of many networked, loosely coupled computers acting together to perform very large tasks. This technology has been applied to computationally intensive scientific, mathematical, and academic problems through volunteer computing, and it is used in commercial enterprises for such diverse applications as drug discovery, economic forecasting, seismic analysis, and back-office data processing in support of e-commerce and Web services.

Comparison of grids and conventional supercomputers

“Distributed” or “grid” computing in general is a special type of parallel computing that relies on complete computers (with onboard CPUs, storage, power supplies, network interfaces, etc.) connected to a network (private, public or the Internet) by a conventional network interface, such as Ethernet. This is in contrast to the traditional notion of a supercomputer, which has many processors connected by a local high-speed computer bus. The primary advantage of distributed computing is that each node can be purchased as commodity hardware which, when combined, can produce a computing resource similar to a multiprocessor supercomputer, but at a lower cost. This is due to the economies of scale of producing commodity hardware, compared to the lower efficiency of designing and constructing a small number of custom supercomputers. The primary performance disadvantage is that the various processors and local storage areas do not have high-speed connections. This arrangement is thus well-suited to applications in which multiple parallel computations can take place independently, without the need to communicate intermediate results between processors. The high-end scalability of geographically dispersed grids is generally favorable, due to the low need for connectivity between nodes relative to the capacity of the public Internet. There are also some differences in programming and deployment. It can be costly and difficult to write programs that can run in the environment of a supercomputer, which may have a custom operating system, or require the program to address concurrency issues. If a problem can be adequately parallelized, a “thin” layer of “grid” infrastructure can allow conventional, standalone programs, given a different part of the same problem, to run on multiple machines. This makes it possible to write and debug on a single conventional machine, and eliminates complications due to multiple instances of the same program running in the same shared memory and storage space at the same time.

Current Projects

Biology and medicine

Folding@Home — seeks to cure cancer, ALS, Alzheimer's and many other diseases by observing how proteins fold. Currently the fastest computing system in the world, at 8 petaFLOPS.


Docking@Home — models protein-ligand docking.

GPUGRID.net — conducts full-atom molecular biology simulations, designed for CUDA-capable graphics processing units.

Malaria Control — performs stochastic modelling of the clinical epidemiology and natural history of malaria.[1]

POEM@Home — models protein folding using Anfinsen's dogma.

Rosetta@home — tests the assembly of specific proteins, using appropriate fragments of better-known proteins.

SIMAP — compiles a database of protein similarities.

Earth sciences

Climateprediction.net — attempts to reduce the uncertainty ranges of climate models.

Quake-Catcher Network — uses sensors in, or attached to, internet-connected computers to detect earthquakes.

Physics and astronomy

AQUA@home — uses Quantum Monte Carlo to predict the performance of superconducting adiabatic quantum computers.

Einstein@Home — uses data from LIGO and GEO 600 to search for gravitational waves.

MilkyWay@Home — uses data from the Sloan Digital Sky Survey to deduce the structure of the Milky Way galaxy.

QMC@Home — uses Quantum Monte Carlo to predict molecular geometry.

SETI@home — searches cosmic radio emission data for extraterrestrial intelligence.

theSkyNet — searches data collected from radio telescopes such as ASKAP.

Mathematics

PrimeGrid — searches for various types of prime numbers.

Great Internet Mersenne Prime Search — searches for Mersenne primes.

Multi-application projects

Ibercivis — studies nuclear fusion, materials science, neurodegenerative diseases caused by amyloid accumulation, the effect of light on nanomaterials, fluid mechanics, macromolecular docking, and the function of proteins in memory and learning.

Clean Energy Project — tries to find the best organic compounds for solar cells and energy storage devices. Currently in phase 2.

Computing for Clean Water — uses the techniques of molecular dynamics to determine the fluid dynamics of water filters that are composed of nanotubes.

FightAIDS@Home — identifies candidate drugs that have the right shape and chemical characteristics to block HIV protease.

By

ARULMOZHISELVI.S

III –B.Sc(IT)


Windows OS Security

Windows Security Tools

Microsoft has been putting more effort into security, as Windows Server 2008 R2 and Windows 7 prove. They have been hardening the “out of the box” experience for some time, and with the new Firewall and User Account Control features that come preconfigured, it is no wonder that many are moving to these more powerful and secure operating systems. But although the new, and even older, Windows operating systems can be made more secure, what tools are available to help you configure your system beyond the firewall and UAC?

Microsoft Baseline Security Analyzer

MBSA has now been around for quite some time. The tool had great hopes when it first arrived on the scene, but has never developed into anything more than a tool that can be used to scan for installed patches. Yeah, MBSA does more than scan for patches, but the overall sense of the tool from nearly everyone is that it was never really all that useful.

The latest version of MBSA is v2.2, which can be downloaded from Microsoft. The updated version is not all that shocking: it now supports Windows 7 and Windows Server 2008 R2, to which everyone on the planet seems to be migrating. Other features that MBSA v2.2 brings to the table include:

• Offline mode from graphical and command-line interfaces

• Support for Windows 7 and Windows Server 2008 R2

• Updated graphical user interface

• Full support for 64-bit platforms

• Improved support for Windows XP Embedded platform

• Automatic Microsoft Update registration and agent update, from the graphical interface or the command line

• Output completed scan reports to a user-selected directory path or network share

• Windows Server Update Services 2.0 and 3.0 compatibility
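Several of these features are driven from the command line through mbsacli.exe. Below is a minimal sketch of scripting a scan from Python; the install path is the MBSA 2.2 default but may differ on your system, and the full flag set should be confirmed with mbsacli.exe /? before relying on it:

```python
import subprocess

# Default MBSA 2.2 install location (an assumption -- adjust as needed).
MBSACLI = r"C:\Program Files\Microsoft Baseline Security Analyzer 2\mbsacli.exe"

def scan_host(target: str) -> str:
    """Run an MBSA scan against a single machine and return the text report."""
    # /target selects the machine to scan; /n skips the listed check groups
    # (here the IIS and SQL checks are skipped, as an example).
    result = subprocess.run(
        [MBSACLI, "/target", target, "/n", "IIS+SQL"],
        capture_output=True, text=True, check=True,
    )
    return result.stdout

print(scan_host("127.0.0.1"))
```

The same pattern works for a range of machines: loop over the addresses and collect each report into the output directory of your choice.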

As you can see from Figure 1, the tool is easy to configure and picking the computers you want to scan is easy too. You can either scan the computer where you are running MBSA, or you can pick a range of IP addresses.

Figure 1: MBSA 2.2 configuration options before scanning.

Once a computer is scanned, the results are clearly displayed and easy to read, as shown in Figure 2.


Figure 2: MBSA 2.2 scan output and summary.

The major issues I have with MBSA are that it offers no customization and that its security scans seem arbitrary and not very extensive. I wish I could add additional Registry entries to the scan, so that it would cover all of the other security settings that need to be configured.
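Until MBSA offers that, checking an individual Registry-backed security setting yourself is straightforward to script. Below is a minimal sketch, assuming Python on the machine being checked; the value read (LimitBlankPasswordUse, which blocks network logons with blank passwords) is just one example of a setting you might want to add to a scan:

```python
import winreg  # Windows-only module in the Python standard library

def blank_passwords_blocked() -> bool:
    """Return True if network logons with blank passwords are blocked
    (LimitBlankPasswordUse = 1, the secure default)."""
    key = winreg.OpenKey(
        winreg.HKEY_LOCAL_MACHINE,
        r"SYSTEM\CurrentControlSet\Control\Lsa",
    )
    try:
        value, _type = winreg.QueryValueEx(key, "LimitBlankPasswordUse")
        return value == 1
    finally:
        winreg.CloseKey(key)

print("LimitBlankPasswordUse OK:", blank_passwords_blocked())
```

A real custom scanner would loop over a table of such (key, value, expected) triples and report any mismatches.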

Security Templates

Security templates are not new; in fact, they were first introduced in Windows NT! This security tool has been around the block, but it still provides some good centralized security options. Security templates give an administrator the opportunity to configure key security features and then deploy those settings through Group Policy via Active Directory. Since Group Policy can configure multiple computers with a single set of configurations, security templates provide a way to secure many computers with very little effort.

Security templates have been leveraged, then neglected, then leveraged again as each operating system has been updated. For example, in the Windows 2000 era there were pre-configured security templates such as basicsv.inf, hisecdc.inf, securedc.inf, and compatws.inf, which allowed an administrator to implement a security baseline without much effort. Security templates cover some of the most common security features, which can be seen in the figure below.

Figure: Security template configuration areas.

For more information on how to leverage security templates in a GPO, see Microsoft's documentation. In the same way that MBSA fails to impress me, security templates fall short of being an amazing way to deploy security, again because of the lack of customization. Yes, Group Policy is customizable, but security templates are not: what you see is what you can configure.
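One consolation is that a security template is just an INI-style text file, so even though you cannot extend what it configures, you can inspect and compare templates programmatically. A minimal sketch, assuming Python; the template body is a trimmed illustration of the real [System Access] section, not a complete template:

```python
import configparser

# Trimmed, illustrative template content. Real templates (hisecdc.inf and
# friends) carry many more sections, such as [Event Audit] and [Registry Values].
TEMPLATE = """
[Unicode]
Unicode=yes
[System Access]
MinimumPasswordLength = 8
MaximumPasswordAge = 42
LockoutBadCount = 5
[Version]
signature="$CHICAGO$"
revision=1
"""

parser = configparser.ConfigParser()
parser.read_string(TEMPLATE)

# Print the account-policy settings the template would enforce.
for name, value in parser.items("System Access"):
    print(f"{name} = {value}")
```

Parsing two templates this way and comparing the sections is an easy method of seeing exactly what a "high security" template changes relative to a baseline one.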

Security Configuration Wizard

Security Configuration Wizard (SCW) has been available for some time, going back to the Windows Server 2003 days. The tool used to be an out-of-band download, but it is now installed on every Windows Server 2008 and 2008 R2 computer, available in the Start Menu under Administrative Tools. The main points I want to make about the SCW tool are the following:

• SCW uses a security database that describes what each Windows Server role includes, including the required firewall rules

• SCW touches on some hard to reach security areas, such as LM authentication protocol, SMB signing, and firewall rules

• SCW can consume security templates, adding to the configuration baseline options

• SCW results can be ported into a GPO using command-line options.
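That last point is worth an example. Below is a minimal sketch, assuming Python, an existing SCW policy file, and the scwcmd tool that ships with SCW; the policy path and GPO name are hypothetical, and the exact syntax should be confirmed with scwcmd transform /? before use:

```python
import subprocess

# "scwcmd transform" converts a saved SCW policy (.xml) into a new GPO
# that can then be linked to an OU in Active Directory.
policy_file = r"C:\Windows\security\msscw\Policies\WebServer.xml"  # assumed path
gpo_name = "SCW Web Server Baseline"                               # hypothetical name

subprocess.run(
    ["scwcmd", "transform", f"/p:{policy_file}", f"/g:{gpo_name}"],
    check=True,  # raise if scwcmd reports an error
)
```

Once transformed, the policy behaves like any other GPO, so the SCW settings can be deployed to many servers at once.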


By

PRABHU .M

III-B.Sc(IT)

Windows Home Server 2011

Windows Home Server is an operating system designed for the small office/home office (SOHO). It is intended as a solution for homes with multiple connected PCs, offering file sharing, automated backups, and remote access. Windows Home Server 2011 was released on 6 April 2011. It follows the release of Power Pack 3 for the aging original Windows Home Server (the equivalent of a client-edition service pack), which added Windows 7 support. Windows Home Server 2011 is considered a "major release": it is built on the Windows Server 2008 R2 code base, its predecessor having been built on Windows Server 2003, and it supports only x86-64 hardware.

Features: No new features were announced by Microsoft, but the release was reported to include additional entertainment capabilities, such as web-based media functionality and an 'add-in' feature with an app store. Initial speculation by technology columnist Mary Jo Foley fueled the idea that 'Vail' would integrate with Windows Media Center. This prompted the response "Time will tell" from Microsoft Windows Home Server product planner Todd Headrick, but by the time of the public beta Microsoft had decided not to integrate Windows Media Center with 'Vail'.

Microsoft SQL Server

Microsoft SQL Server is a relational database server developed by Microsoft. Its primary function is to store and retrieve data as requested by other software applications, whether they run on the same computer or on another computer across a network (including the Internet). There are at least a dozen different editions of Microsoft SQL Server, aimed at different audiences and workloads, ranging from small applications that store and retrieve data on a single computer to millions of users and computers accessing huge amounts of data over the Internet at the same time.

Tools

SQLCMD: SQLCMD is a command-line application that comes with Microsoft SQL Server and exposes the management features of SQL Server. It allows SQL queries to be written and executed from the command prompt. It can also act as a scripting tool: a set of SQL statements can be saved as a .sql file and run as a script, either for management of databases or to create the database schema during deployment of a database. SQLCMD was introduced with SQL Server 2005 and continues with SQL Server 2008. Its predecessors in earlier versions were OSQL and ISQL, which are functionally equivalent as far as T-SQL execution is concerned, and many of the command-line parameters are identical. (A usage sketch appears at the end of this article.)

Visual Studio: Microsoft Visual Studio includes native support for data programming with Microsoft SQL Server. It can be used to write and debug code to be executed by SQL CLR. It also includes a data designer that can be used to graphically create, view, or edit database schemas. Queries can be created either visually or in code. SSMS 2008 onwards also provides IntelliSense for SQL queries.

SQL Server Management Studio: SQL Server Management Studio is a GUI tool included with SQL Server 2005 and later for configuring, managing, and administering all components within Microsoft SQL Server. The tool includes both script editors and graphical tools that work with objects and features of the server.


SQL Server Management Studio replaced Enterprise Manager as the primary management interface for Microsoft SQL Server as of SQL Server 2005. A version of SQL Server Management Studio is also available for SQL Server Express Edition, where it is known as SQL Server Management Studio Express.

A central feature of SQL Server Management Studio is the Object Explorer, which allows the user to browse, select, and act upon any of the objects within the server. It can be used to visually observe and analyze query plans and to optimize database performance, among other tasks. SQL Server Management Studio can also be used to create a new database, alter an existing database schema by adding or modifying tables and indexes, or analyze performance. It includes query windows that provide a GUI-based interface for writing and executing queries.

Business Intelligence Development Studio (BIDS) is the IDE from Microsoft for developing data analysis and business intelligence solutions built on Microsoft SQL Server Analysis Services, Reporting Services, and Integration Services. It is based on the Microsoft Visual Studio development environment, customized with SQL Server-specific extensions and project types, including tools, controls, and projects for reports (using Reporting Services), cubes, and data mining structures.
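As promised above, here is a minimal SQLCMD usage sketch, assuming Python, a local SQL Server instance, and Windows (trusted) authentication; the query is purely illustrative:

```python
import subprocess

# -S picks the server\instance, -E uses Windows (trusted) authentication,
# -Q runs a single query and exits; -i <file.sql> would run a saved script.
result = subprocess.run(
    ["sqlcmd", "-S", "localhost", "-E", "-Q", "SELECT @@VERSION;"],
    capture_output=True, text=True, check=True,
)
print(result.stdout)
```

Swapping -Q for -i and a .sql file name is how the deployment scripts mentioned above are typically run.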

By

RAMYA.R

III–B.Sc(CT)