Case Study of Os

Assignments On: Case study on os

8/6/2019 Case Study of Os
http://slidepdf.com/reader/full/case-study-of-os 1/27

CASE STUDY OF OPERATING SYSTEM

 1. MS DOS

2. MICROSOFT WINDOWS

A. XP

B.WIN VISTA

C.WIN 7

3. UNIX

4. APPLE MACINTOSH OS

A. MAC OS (1984-2001)

B. MAC OS X 

CASE STUDY OF OPERATING SYSTEM

Case study 1 – MS-DOS

Nowadays, it's impossible to talk about desktop computers without the name of Microsoft coming up sooner or later. This company has thoroughly managed to take over this world. If they told your computer to commit suicide, unless you're some kind of bearded geek who doesn't use their software, it would.

The story of Microsoft begins with the emergence of personal computers, and with two clever guys called Bill Gates and Paul Allen who loved to mess around with computers. They were popular at the time in the computing world, because they wrote useful software (things that made writing other software much easier, a spreadsheet, a word processor) for a low-cost computer called the Altair 8800 (which didn't even include a keyboard or a monitor). When the people at IBM, who ruled a big part of the computer world at the time, started taking an interest in micro-computers, building what was going to be a big hit called the IBM PC, they were attracted by this popularity and licensed some of Gates's software. And when they failed to get the major operating system of those days, CP/M, they went back to Gates and asked him whether he could provide them with an OS for it. Gates found one, bought it, and hired the man who wrote it to modify it so that it fit IBM's needs. MS-DOS was born.

The first releases of MS-DOS essentially provided the file abstraction (i.e. the ability to consider that disk space is divided into chunks of data called files, allowing one to ignore the actual structure of the floppy disk one works on), some primitive tools to manipulate internal memory and text files (i.e. files containing text and numbers), and basic configuration routines, everything being stolen from the dominant OS of those days, CP/M. One used it by typing commands on a keyboard, pressing Return, and reading a resulting dump of text on the screen, an interface that's called a command line. However, it introduced a

new commercial idea that was here to last: the idea of selling an operating system bundled with a computer, betting that users won't bother using another one if there's already something working there.

The command line interface

Subsequent releases added support for higher-capacity data storage, for directories (a kind of box which may contain files and other directories, and hence a hierarchical way to organize files), then for more languages, then for clocks (a chip whose goal is to measure time, useful not only for calendar- and watch-like applications but also for real-time applications) and even higher-capacity data storage, then for computer networks. After that, DOS 4, ten times heavier than the first release of DOS with its 110 KB memory usage (about 1/30 the size of a common MP3 file), introduced DOSSHELL, a new optional way to explore files that was a little less linear than the command line, and a direct ancestor of Windows's file explorer.

The Shell interface
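The hierarchical organization that directories introduced can be sketched with a few shell commands (a Unix-style illustration, not the DOS syntax; DOS used MD and CD for the same purpose, and the folder names below are made up):

```shell
cd "$(mktemp -d)"

# Directories are boxes that may contain files and other directories.
mkdir -p docs/letters docs/reports
echo "quarterly numbers" > docs/reports/q1.txt

# A file is then addressed by its path through the hierarchy,
# not by its physical position on the disk.
cat docs/reports/q1.txt
```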

Subsequent releases of MS-DOS, besides adding support for still higher storage space through multiple hard drive management and a new file management system, and managing more RAM, introduced three new tools: one to compress files in order to economize disk space (at the cost of slower data access), one to check the hard drive for errors (due to a bad machine shutdown, for example) and try to fix them, and a simple antivirus, MSAV.
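That compress-to-save-space trade-off is easy to reproduce today; a minimal sketch with gzip (a Unix analogue, not the DOS tool itself; file names made up):

```shell
cd "$(mktemp -d)"

# Make a highly compressible file, then write a compressed copy.
yes "the same line over and over" | head -n 1000 > log.txt
gzip -c log.txt > log.txt.gz

# The compressed copy takes far less disk space...
wc -c log.txt log.txt.gz

# ...but reading it back costs an extra decompression step.
gunzip -c log.txt.gz | head -n 1
```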

As one may figure out, at this stage DOS didn't really evolve anymore. Those last things really were secondary, questionable features that were added just to sell new releases of the operating system and make money from it, plus maintenance improvements for keeping in touch with new hardware. MS-DOS had reached a mature stage of evolution, and was left a bit behind while Microsoft was now working on its new product, Windows, initially running on top of DOS, which we're going to describe in a later article.

Now that we have described the features and evolution of MS-DOS, we can discuss them. A few points stand out, especially:

Most of the development effort on the hardware abstraction side was spent on file and storage space management: some releases of DOS took months, even years of development, just to add support for newer, higher-capacity diskettes. Improvement of OS support for other kinds of hardware was very limited. This may be partly explained by the fact that, at the time, everyone was okay with letting programs deal with the bare hardware. Hardware manufacturers made things a lot easier by letting everyone know how to interact with their hardware, and by introducing hardware that was dead easy to manipulate for someone used to assembly language. Standards were the rule rather than the exception, so if you could make your program work on one computer, you were 95% sure that it would work on other hardware without modifying it in any way. There wasn't that much viral software at the time to harm a machine given direct access to it, especially because people didn't download and run random software from the internet, for the internet didn't exist for most people. On the other hand, performance was a critical issue, and there's no program faster on a specific piece of hardware than one written specifically for it, using it directly.

Part of that wasn't true for storage hardware. First, there wasn't any kind of dominant storage medium; there was a jungle of incompatible technologies (multiple kinds of tapes, various flavors of diskettes, the first hard disk drives), and they shared the common characteristic of being awful to manipulate. Then, performance wasn't such an issue for file storage: if you store something, it's not for using it right away, and you don't spend your time reading and storing files on a diskette in your programs when you've got a main memory that's a lot faster. Last, file storage and manipulation was almost the sole thing that an average unskilled user, trying to use software rather than write it, was forced to get into, in order to find and run the software he used, or to copy the text files he wrote to disk, so the process had to be as simple as possible. The Shell, whose only purpose was to better visualize the hierarchical structure of directories and find files quicker, perfectly illustrates that.

Little to no planning, features come as needed: let's see... As DOS aged, two different file systems (ways to manage the file abstraction) were used, one after another, called FAT12 and FAT16. FAT16 was introduced to address the maximum disk capacity limit of FAT12. They are extremely close to each other, for maximum compatibility with older programs, so close that differentiating FAT12 from FAT16 is extremely difficult; yet at the same time they are structurally incompatible. This is a typical example of a hack, a modification that a developer introduces into a program when he understands that he messed up, but doesn't want to make a new design doc and other silly rigorous conception work, and just wants to fix the sole thing that doesn't work. Perhaps the most common source of bugs in software is when people forget about the hack (or don't even know about it) and push it to its limits.

At the time of DOS, they couldn't plan things, since they didn't have a clue how the computer business would move on, so they introduced gradual changes, leading to hack accumulation, and hence bug multiplication. Let's anticipate the following articles by saying that this is one of the reasons that led to the abandonment of DOS later: managing, correcting, and more generally modifying it had become too complicated, since no one could understand exactly how it worked anymore. The operating system needed a complete rewrite.

From the history of DOS, we may extract the following keys to its success and subsequent fall:

Always focus on what's important first: well, the reasons for that are pretty obvious...

...but don't neglect the OS's theoretical foundation and design: or, sooner or later, hacks will occur.

Make the doc simple, complete, and easily accessible: and you save people a lot of trouble, preventing effort duplication along the way. Hardware manufacturers knew that, at the time.

Don't neglect the average user: after all, there are a lot more users than there are developers, so you know which one accounts for lots of sales.

Cleanness is the friend of reliability: hacking always seems fine at the time, but it is the source of most of computing's lack of reliability nowadays.

Compatibility may be the enemy of cleanness if the software is poorly designed to start with: this will be even more obvious after we study the story of Windows.

The search for maximal performance may be an enemy too: there are times when only a hack will make a program faster. Though if I were you, I'd choose reliability and simplicity of design over performance, as long as said performance is sufficient.

With those last three points, one figures out that OS design is often made of dilemmas. This is not the last time I'll be saying it. Thanks for reading!

Case study 2 – Microsoft Windows

Microsoft was close to getting rid of MS-DOS and introducing a new product, Windows. We're now going to describe why they did that, how they did it, and what kind of issues they've been facing since then and are still facing now. Please note that we'll focus on Windows releases for the consumer market only.

DEATH OF THE CLI 

First, what was Windows initially, and why would Microsoft wish to introduce such a product? Well, from an average user's point of view, DOS sucked badly on one point: it was using a command-line interface

(CLI), which is, I remind you, when one types in a command and hits Return, after which the computer starts some computation and displays the results.

The command line is a preferred interface for most power users and developers because:

It's fast: text manipulation is a relatively easy task from a computing point of view, and the command line syntax is designed so that the computer may analyze commands very quickly. On the user side, typing a command is a speedy task because it doesn't require you to analyze what the computer tells you. The whole command line syntax is engraved in your brain; you just have to remember it, which is pretty quick if you use it every day. And even if you've got thousands of programs, typing on the command line isn't slowed down the tiniest bit.

It has very low hardware requirements: modern graphical interfaces have significant resource consumption. They have to track the motion of the mouse, constantly re-draw windows as needed, put flashy, real-time-rendered, animated eye-candy everywhere, and keep in memory big uncompressed images corresponding to what's displayed on the screen, along with some temporary data used for quicker and smoother drawing... By comparison, the command line only requires a few KB of memory, little CPU power, and no advanced graphics card.

It's a perfect tool for repeated tasks: suppose that there is a single thing that you want to do on twelve computers. You know what commands you have to type in order to obtain this result. Then you can just put those commands in a text file (called a batch file or a shell script depending on the computing church you belong to), one after the other, and give the text file to your operating system, which is going to automatically read and execute all the commands in it. Think of it: twelve computers to manipulate, a single batch file to carry around and run...
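The idea is the same whether it is a DOS batch file or a Unix shell script; here is a minimal shell-script sketch (the file name and its commands are made up for illustration):

```shell
cd "$(mktemp -d)"

# Put the commands you would have typed into a text file...
cat > cleanup.sh << 'EOF'
mkdir -p backup
echo "backup prepared" > backup/status.txt
EOF

# ...then hand the file to the shell, which reads and executes
# the commands in it one after the other. Carry the same file to
# twelve machines and you get the same result twelve times.
sh cleanup.sh
cat backup/status.txt
```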

It is streamable: command-line input and output may be redirected from and to various I/O devices and disk files. The former allows a very low-powered, dumb terminal computer to only read output from and send input to a central server, which was practical when computers were big and expensive because it allowed people to work on tiny, inexpensive machines, which would cost far less to replace if they died because of a misplaced cup of coffee. Though not needed anymore, this way of doing things is still loved by the paranoid and dictatorial system administrator. Streaming to a file allows one to keep a record of the huge dump of text spewed out by some command-line programs, in order to keep track of it and carefully analyze it later. Newer command-line interfaces also allow one to use the output of a program directly as input for another program. Just think of the possibilities: if you want to delete all files containing foo in their name, you just use DIR *foo* to list such files and send its output directly to the DEL program, which deletes files.
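On a Unix-style shell, the redirection and piping described above look like this (the file names are made up; the DOS equivalents would use DIR, DEL and the same > operator):

```shell
cd "$(mktemp -d)"
touch notes.txt foo-report.txt old-foo.txt readme.md

# Redirect output to a file to keep a record of it for later analysis...
ls > listing.log

# ...or send one program's output straight into another:
# list the files whose names contain "foo", then delete them.
ls | grep foo | xargs rm

ls
```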

It has some features of modern programming languages: later interfaces also introduced loops (do a task multiple times) and conditional structures (if certain things happen, do this, otherwise do that), making the command line a (relatively) simple way to make little programs that work on almost any computer, provided they use the same operating system.
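A sketch of those two control structures in a POSIX shell (DOS batch offers FOR and IF for the same purpose; the names below are made up):

```shell
cd "$(mktemp -d)"

# A loop: do a task multiple times, once per item.
for name in alpha beta gamma; do
    echo "processing $name" >> run.log
done

# A conditional: if a certain thing happened, do this, otherwise do that.
if grep -q beta run.log; then
    echo "beta was processed"
else
    echo "beta is missing"
fi
```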

It's easily extensible: any program may, under certain simple conditions, become a new command in a command-line interface, and managing thousands of programs on the command line isn't nearly as complicated as it can be with other interface paradigms.
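That extension mechanism survives unchanged in modern shells: any executable file found on the command search path becomes a command. A sketch (the greet command is invented for illustration):

```shell
cd "$(mktemp -d)"
mkdir bin

# Write a tiny program...
printf '#!/bin/sh\necho "hello from a brand new command"\n' > bin/greet
chmod +x bin/greet

# ...and placing its directory on PATH makes it a command like any
# other, with no registration step required.
PATH="$PWD/bin:$PATH"
greet
```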

However, the command line is just awful for normal people because:

Reading the manual is mandatory: you can't learn how the system works progressively, knowing only a few basic concepts. You have to immediately commit to memory several basic commands (CD, DEL, DIR, TYPE, HELP) and what they're used for, because without that you just can't do anything. Such a necessity is very frustrating for new users, who generally prefer to jump in and learn as they go as much as possible. It's also a major source of errors while one gets used to the system, which is even more frustrating.

Error prevention does not exist: suppose that you've started typing a big command, but the second word is wrong, because you were sleepy or don't know anything by heart yet. In a command line system, the computer does not analyze what you told it to do before you press Return. So it's only after you've sent the whole command that the computer says "I'm sorry, Dave. I'm afraid I can't do that" (or rather "Bad command or file name" in DOS's poetic words), and then you have to check the entire long command for errors.

Error detection and correction is poor: the basic structure of most command line systems means that several frequent and simple errors can't be properly detected and reported to the user by the system. Hence the user must do error correction all by himself. A simple example: several commands accept an unlimited number of parameters, so the number of parameters given by the user can't tell the system whether a mandatory parameter is missing. All the operating system can say is that there is an incorrect parameter somewhere. Even worse, there are (several) cases where the operating system can't detect the error at all and does nonsense, making the user think that it's buggy.

There is a huge lack of hierarchy: even though the average user is only going to use five commands or so, the operating system treats ALL commands as equal. This means that if you want to type DIR, the command for displaying the contents of directories, but only remember that it starts with a D, have lost your manual, and ask the system to display all commands starting with a D, you may get something like

DAT

DATPCK

DCHECK

DDDEFLATE

DESTOR

DIR

DRWATSON

DUCK

DWM

DZBRWZ

or, usually, something much longer, and you have to read through the entire list, looking for your command.

So while the command line satisfies old-school and professional people perfectly, it should now be obvious that it's not okay for average users, who together form, say, 95% of Microsoft's targeted audience. Clearly, something was wrong in this core concept of MS-DOS, and it had to be fixed.

A solution to this problem came from research made by the team of Doug Engelbart (Stanford Research Institute), and later by people working in a huge laboratory called Xerox PARC. It's called the GUI (Graphical User Interface), and more specifically the WIMP (Windows, Icons, Menus, Pointer) paradigm. The user interacts with programs, including the operating system, through a pointing-and-clicking device (generally a mouse, though touchscreens are gradually getting popular nowadays) that allows an on-screen cursor, the pointer, to be moved around the screen. Programs and certain functionalities are launched by clicking on icons, tiny pictures that should more or less indicate what they're used for, but which are usually helpfully associated with a tiny text explaining it. Those programs work in well-defined areas of the screen called windows (which the user can usually conveniently move or resize). The icon model isn't that great when a large set of actions must be provided, so one uses the menu system: a list of actions that is shown temporarily for interaction purposes and then disappears (just right-click on the present text and you'll get a perfect example of a menu).

WINDOWS XP

Just one year after Windows Me, Windows XP was out. It merged professional-oriented features and the superior core of Windows NT on one side with consumer-oriented features from Windows Me and 9x on the other, in one product that sold like hotcakes, including:

True application and kernel data isolation: coming from the NT branch, this neat feature makes sure that applications can't mess with each other's data and that they can't gain direct access to the hardware. This makes Windows XP the first consumer-oriented OS from Microsoft with serious security capabilities.

Multiple users management: also coming from the NT branch. One may now create several users for the OS. Those users may have more or less limited capabilities as needed, and they have their own private

folder, which other users (except administrators) cannot read, and in which any user-specific data (wallpaper used, game saves) are stored.

Relative hardware independence: NT-related too. All the hardware-specific code is limited to a subset of the kernel and can easily be changed. This allowed Windows XP to be easily ported to 64-bit processors as they came out, for the sake of improved performance.

Far increased reliability: Windows XP was by far one of the most reliable OSs ever conceived by Microsoft. Complete system crashes (aka Blue Screens of Death, or BSODs) that were quite common at the time of Windows 98, and even more so with Windows Me, became the exception rather than the rule, especially after the introduction of several Service Packs, because application bugs weren't capable of doing much harm anymore.

DOS emulation and generally good compatibility: though not perfect on that side, Windows XP made great efforts to ensure application compatibility with Windows Me, Windows 9x, and DOS. Even some Windows 1 applications can still run flawlessly on it.

Mature multimedia support: with Windows XP, multimedia features finally reached a perfectly mature state. As an example, there was excellent out-of-the-box support for scanners and cameras. Movie Maker became somewhat less laughable. All the power of Windows 9x's DirectX multimedia infrastructure was flawlessly brought to Windows XP. Folders could display little thumbnails of images in order to help manage photos. One could burn data CDs easily, since CD burners had become a common thing at the time. And so on...

More networking: the successive releases of Windows XP implemented all the newer networking devices and protocols one by one: DSL, Wi-Fi, WEP then WPA encryption protocols...

Various UI tweaks and a stupid default theme: the new Start menu now displays frequently used applications and various user folders, the file explorer displays common tasks in order to interact quickly with files, big icons make it easy to target important elements, and other delicate attentions, sadly including even more annoying popup windows filled with garbage than ever... Windows XP also introduced the ability for the user to change the appearance of windows and controls using themes, a very popular feature, especially considering the childish and hard-on-the-eyes theme bundled with Windows XP.

Bloated task manager: Windows 95 and later featured the Ctrl+Alt+Del keystroke, used to close hung programs. It displayed a list of open programs, along with a button to close one, with extreme simplicity and efficiency. However, starting with Windows XP, things wouldn't go this way anymore. The Task Manager, as it's called, lets one monitor CPU usage, memory usage, networking, users logged in, and so forth. When an application hangs, taking 100% CPU, this tool itself takes minutes to load. Hence the improvements make it quite inefficient at closing buggy applications, and one has to wonder what else it was meant to be used for in Microsoft's engineers' minds.

Definitive failure at making a simple design: though NT included more interesting features in its internals than I may describe here (in fact, Microsoft included almost every existing solution to each OS design problem in it), this came at a cost: Windows NT's internals are horribly complicated, and have a high footprint on performance and memory usage. Developing for it is a hard task, because there are, say, ten alternative approaches to each problem, with no clue as to which one is the fastest for a specific purpose.

Bloated kernel: somewhat related to the preceding issue. Windows's NT core includes a lot of abstractions and alternative ways to manage the same task, but hardware drivers also run inside it as privileged code. This makes for a lot of machine code running as maximum-privilege software, reducing the benefit of kernel data isolation by the hardware (since several unauthorized operations can occur through bugs in this huge mass of computer code), and more generally means that the system is prone to failure due to buggy software with high security privileges. This led to poor security in Windows XP, albeit still better than that of preceding consumer-oriented releases of Windows.

Slower than ever: Windows XP was quite slow at doing everyday tasks like file management or launching applications, close to Windows 98 in that respect despite running on hardware that was four times faster. Microsoft's plans to reduce boot time to 30 seconds turned out to take the form of showing the desktop a long time before it's really usable, with the operating system silently continuing to load in the background.

Product activation: in the early days of computing, one would just buy software, put the diskette or CD-ROM in the drive, install it, and use it. However, a new trend dawned by the time of Windows XP: clearly stating that the user no longer owned the software he was using. Along with several piracy concerns, this led to an activation process, in which the owner of a Windows XP licence had to call Microsoft or go to some website, with failure to do so leading to the software not functioning after 30 days. The process had to be repeated any time an important piece of hardware was changed on the PC, and it allowed Microsoft to make sure that a Windows XP product key wasn't used multiple times. (As an aside, it didn't stop piracy at all, but it made several users angry, like any copy protection measure of this type.)


Bozo the clown-oriented theming and new start menu 

Windows XP was extremely well received, notably because of its great reliability and well-implemented multimedia features. Its slowness and somewhat poor security led to some criticism, but globally there were far more supporters than haters. This led Microsoft to focus on improving Windows XP's security, rather than release a new operating system right away, perhaps one of the wisest design decisions they ever made.

WINDOWS VISTA

The Veterans Health Information Systems and Technology Architecture (VistA) is an enterprise-wide information system built around an Electronic Health Record (EHR), used throughout the United States Department of Veterans Affairs (VA) medical system, known as the Veterans Health Administration (VHA). It's a collection of about 100 integrated software modules.

By 2003, the VHA was the largest single medical system in the United States, providing care to over 4 million veterans, employing 180,000 medical personnel and operating 163 hospitals, over 800 clinics, and 135 nursing homes. About a quarter of the nation's population is potentially eligible for VA benefits and services because they are veterans, family members, or survivors of veterans.

By providing electronic health records capability, VistA is thereby one of the most widely used EHRs in the world. Nearly half of all US hospitals that have a full implementation of an EHR are VA hospitals using VistA.


Features 

The Department of Veterans Affairs (VA) has had automated data processing systems, including extensive clinical and administrative capabilities, within its medical facilities since before 1985.[7] Initially called the Decentralized Hospital Computer Program (DHCP) information system, DHCP was a recipient of the Computerworld Smithsonian Award for best use of Information Technology in Medicine in 1995.

VistA supports both ambulatory and inpatient care, and includes several significant enhancements to the original DHCP system. The most significant is a graphical user interface for clinicians known as the Computerized Patient Record System (CPRS), which was released in 1997. In addition, VistA includes computerized order entry, bar code medication administration, electronic prescribing, and clinical guidelines.

CPRS provides a client–server interface that allows health care providers to review and update a patient's electronic medical record. This includes the ability to place orders, including those for medications, special procedures, X-rays, nursing interventions, diets, and laboratory tests. CPRS provides flexibility in a wide variety of settings so that a consistent, event-driven, Windows-style interface is presented to a broad spectrum of health care workers.

Windows Vista Advantages:

1. Improved Graphical User Interface, the Aero:

There is an entirely new GUI for Windows Vista. The appearance of windows, the desktop, the Start button, the Start menu, the taskbar: everything got a new look. Windows in Windows Vista now look similar to those in Mac OS X. The Minimize, Maximize and Close buttons wear an entirely new, strange look. Windows can be made to appear in translucent, 3D modes if the new Aero theme is applied. The Start button has been modified to give it a new look: the word Start has been removed, and only the new Windows logo appears on the button. The Start menu also got a new look, and so did the desktop wallpapers. The desktop contains a sidebar, which shows a clock and some widgets. Widgets are small programs, roughly equivalent to standard shortcuts.

2. Manage your knotty kids using parental controls:

One can keep kids in check by using parental controls in Windows Vista. Parents can now deny access to certain programs, especially games. Through a separate child account, parents can keep track of all the activities done by the child. It also enables parents to control those activities by blocking certain

installed. Overall, I would say that Windows Vista eats up a lot more resources than its predecessor, Windows XP.

2. Regarding price, sucking your hard-earned money:

The price of the Windows Vista Ultimate edition seems to be too high. An ordinary user thus cannot have a look at all the features of Vista, some of which are only available in the Ultimate Edition. Microsoft seems not to be very interested in country-wise markets: the prices are set according to the US market. However, these prices are on the higher side for developing Asian countries. Microsoft should set the price of Vista as per each market. It is time for Microsoft to think over this aspect, having introduced a stricter validation process.

3. Replace the oldies, another way of extracting money:

This is a usual problem which always exists whenever a new operating system gets released and you have an old system, or some old components in the system. If one decides to install Windows Vista on an old system, one should check the compatibility of the system's components to see whether Vista supports them or not. Some manufacturers will still provide Vista support for the oldies by supplying the latest drivers for them. If a manufacturer does not provide the latest drivers, you need to purchase a new device as a replacement for the existing one. Before you purchase a new device, check for the Vista Compatible or Vista Ready logo, which is put on the device.

4. Window appearance, did you like the different look?

In Windows Vista, the window appearance underwent a lot of unwanted changes. Windows in Windows Vista appear similar to those of Mac OS X. The Minimize, Maximize, and Close buttons wear a different look: the three buttons were reduced in size, making them unclear to aged people and people with eyesight problems. The icon at the other end has disappeared, which makes the window slightly dull in appearance.

Overall, the new operating system, Windows Vista, is said to be an ideal replacement for its predecessor, Windows XP.

WINDOWS 7

This led Microsoft to be more cautious with the next release of Windows, Windows 7. They didn't give up on any idea or technology from Vista, even keeping the highly laughable Flip 3D, but didn't

introduce lots of destabilizing new features this time, and rather focused on improving performance as much as technically possible and fine-tuning every new concept from Vista. It included:

Major debugging and optimisation effort: UAC showed fewer popups. Disk footprint was reduced to around 6 GB and memory footprint to around 512 MB. CPU usage dropped. Animations became a little quieter. The system boots and shuts down faster than Vista, though a bit slower than Windows XP (but the desktop is available faster). In a word, they made software that used Vista's technology while being able to run smoothly on a netbook, and that was somewhat less annoying from an end-user perspective.

Some applications disappeared: perhaps the first time Microsoft did this. Most Windows Live-related content (Messenger/Calendar/Mail) was moved out of Windows, along with the movie-making functions. Those are now only available for download on Windows Live's website.

Some advanced options disappeared: in Windows Vista, it was possible to turn off the transparency effects of the windows, in order to see their titles better or to push laptop battery life a bit further. In Windows 7, I couldn't find that settings window anymore and suspect that it's gone. Windows 7 also introduced a new way to manage peripherals that's a lot more user-friendly than the old one, but that feels somewhat incomplete compared with its older counterpart. The advanced search window, along with its interesting options like specifying where to search and the like, is now gone, making use of the search bars mandatory.

The Sidebar disappeared: no gray translucent thing on the desktop anymore. However, its gadgets are still around. They just show up directly on the desktop now.

Focus on touch screens: Windows 7 finishes the work begun in Vista to introduce proper support for this. By proper support, I mean that one may now easily right-click, scroll in windows, zoom in and out, get an on-screen keyboard, or use some Windows features through gestures. In no way were Windows applications' displays modified to make them touch-friendly, meaning that menus, toolbars, and the like are pretty much unusable because of their small size. Only some special touch-enhanced applications are available. Still, Microsoft has time: computers with touch screens still aren't common or useful enough to make improvements like this the top priority.

Libraries: Microsoft discovers symbolic-linking technology. Libraries are a special kind of folder that only contains links to the files in it. Their link nature is hidden from normal applications, making it possible to interact with those links exactly the same way one interacts with the real file. Such technology allows one to organize one's files better, by putting them in more than one place without needing extra disk space. It's worth pointing out that this feature has been available in about all other significant OSs for ages, without the need to create a special kind of folder in order to use it.
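The underlying idea, as it has long existed on Unix-like systems, can be sketched with the `ln -s` command (on Windows Vista and 7 the rough equivalent is `mklink`). All paths and file names below are invented for the example:

```shell
# Start from a clean scratch area so the example is repeatable.
rm -rf /tmp/library_demo
mkdir -p /tmp/library_demo/documents /tmp/library_demo/projects
echo "quarterly report" > /tmp/library_demo/documents/report.txt

# The symbolic link makes the file appear in a second place
# while taking almost no disk space of its own.
ln -s /tmp/library_demo/documents/report.txt /tmp/library_demo/projects/report.txt

# Ordinary programs follow the link transparently, as if it were the real file:
cat /tmp/library_demo/projects/report.txt
```

Here the link behaves like the original file for normal applications, which is exactly the property Libraries rely on.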

Major Taskbar revamp: made bigger in order to be operated more easily on touch screens, the Taskbar now steals several ideas and design principles from Apple OS X's own taskbar-like application called the Dock (Apple having themselves stolen the taskbar idea from Microsoft, this looks somewhat fair) and introduces some new concepts. First, it's now application-oriented, meaning that instead of showing one button for each opened window, it shows one button for each opened application. Then there's only little distinction between buttons to launch favorite programs and opened windows, especially since program icons aren't accompanied by their titles anymore: there are only icons. That's somewhat bad, because it's sometimes not that easy to figure out what an icon is about, introducing the need to hover the mouse over the icon in order to know more.

When one moves the mouse on top of the icon corresponding to an opened program, a popup appears, allowing one to choose one of the opened windows (this choice being helped by hiding all other windows with a useless glassy effect, see screenshot below). Another improvement, minor at the moment but having some potential, is the introduction of Jump Lists: by right-clicking on a program's icon, one gets access to a list of commonly used options and recently opened files, provided said program is compatible with this feature.

Last but not least, the Notification area has been redone. Windows XP introduced the ability to auto-hide inactive icons, which was a feature somewhere between inefficient and useless, but Windows 7 finally introduces something grand: the ability to prevent them from showing popup balloons endlessly. Basically, if a program keeps annoying you with stupid popups such as "connection lost", "awaiting connection", or "is your antivirus up to date?", you may now tell it to shut up either definitively, or until you're ready to listen to it. That's what any long-time Windows user will call really great!

The new Taskbar

Various Start menu tweaks: the Shut Down button now really shuts down the computer, and is labelled as such to advertise this important improvement (though its function may be changed, too). Jump Lists of applications are available here too. The search box may now search Control Panel items.

Various window management tweaks: some window management can now be done using mouse gestures, by dragging windows to one side of the screen. It's now easier to have two maximized windows side by side: dragging one to the left makes it maximized and take up the left half of the screen, and dragging the other to the right does the same on the right side of the screen.


 

Microsoft Windows 7  

Okay, so now is the time for a global review of Microsoft Windows and for a word of conclusion.

CONCLUSION 

So what can be globally said about Microsoft Windows and remembered as a lesson from the past?

Microsoft is sometimes extremely good at keeping compatibility…: this is one of the best explanations for Microsoft's monopoly nowadays, since it's one of those things that allow Windows's application catalog to be so huge. Though, sadly, it's often the applications that matter (like games or working applications costing hundreds of dollars) that won't work, lots of Windows 1 applications still run on Windows Vista 32-bit, even though it uses a totally different core. In the same way, core interface concepts like the Taskbar, the window maximize/minimize/close buttons, or the file explorer, have gone a long way without significant modification, and no one would complain about them before Vista and 7 tweaked them.

…but can't prevent themselves from introducing new features and re-designing the API…: this is not always a bad thing, since it brought several innovations to the computing world. But overdoing it is dangerous. The Windows NT structure is insanely complicated, because for almost each problem encountered in operating system design it is made to support the many different solutions provided in the past. And that's just the core. Almost each new release of Windows introduces quite a revolution in the development kit, meaning that developers have to get used to a new way of doing things in order to benefit from the new features and to get compatibility. And, as you'd guess, only the latest version is totally supported by Microsoft engineers, though the older GDI, MFC, and the like still remain around. In one word, compatibility + feature-adding frenzy = a mess. From an average user's point of view, Windows used to be somewhat more stable and less surprising, but starting with Vista, it looks like those times are over.


Features aren't always well-chosen nor well-designed: who the hell needs WordPad for anything? Whatever task you may think of, this piece of software sucks at it. Vista's Explorer revamp was globally unwanted and not very well-perceived, whereas one had to wait for Windows 7 to see the cleanup and increase in performance that just about everybody was waiting for after the release of Windows XP.

Early releases introduced a good equilibrium between use of icons and text: this has somewhat been lost starting with Windows Vista, sadly.

Windows introduced the problem of managing big menus or lists: just look at the length of the Programs menu on most computers.

Windows introduced monopoly abuse, too: and how! If I were to choose just one example, it would be the integration of IE, Media Player, and C#/.NET.

Windows gets globally too much in the user's way since Windows 95: popups filled with garbage, endless "are you sure you want to do this?" questioning, lack of system access for the user since Windows Me. And let's not talk about all those Windows Update popups just waiting for you to click OK instead of just doing it themselves.

Case study 3 - UNIX

Introduction: Unix (officially trademarked as UNIX, sometimes also written as Unix in small caps) is a multitasking, multi-user computer operating system originally developed in 1969 by a group of AT&T employees at Bell Labs, including Ken Thompson, Dennis Ritchie, Brian Kernighan, Douglas McIlroy, and Joe Ossanna. The Unix operating system was first developed in assembly language, but by 1973 had been almost entirely recoded in C, greatly facilitating its further development and porting to other hardware. Today's Unix systems are split into various branches, developed over time by AT&T as well as various commercial vendors and non-profit organizations.

The Open Group, an industry standards consortium, owns the UNIX trademark. Only systems fully compliant with

and certified according to the Single UNIX Specification are qualified to use the trademark; others might be called

"Unix system-like" or "Unix-like" (though the Open Group disapproves of this term[1]). However, the term "Unix" is

often used informally to denote any operating system that closely resembles the trademarked system. 

During the late 1970s and early 1980s, the influence of Unix in academic circles led to large-scale adoption of Unix (particularly of the BSD variant, originating from the University of California, Berkeley) by commercial startups, the most notable results of which are Solaris, HP-UX, and AIX. Today, in addition to certified Unix systems such as those already mentioned, Unix-like operating systems such as Linux and BSD descendants (FreeBSD, NetBSD, and OpenBSD) are commonly encountered. The term "traditional Unix" may be used to describe a Unix or an operating system that has the characteristics of either Version 7 Unix or UNIX System V.

Features Of Unix 

multi-user 

more than one user can use the machine at a time 

supported via terminals (serial or network connection)

multi-tasking 

more than one program can be run at a time

hierarchical directory structure 

to support the organisation and maintenance of files


portability 

only the kernel (<10%) written in assembler

a wide range of support tools (debuggers, compilers)

the tools for program development

UNIX Operating System 

Consists of 

kernel 

schedules tasks 

manages data/file access and storage 

enforces security mechanisms 

performs all hardware access

shell 

presents each user with a prompt  

interprets commands typed by a user 

executes user commands 

supports a custom environment for each user

utilities 

file management (rm, cat, ls, rmdir, mkdir)

user management (passwd, chmod, chgrp)

process management (kill, ps)

printing (lpr)
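A few of the file-management utilities listed above in action. This is a minimal sketch on a throwaway directory; the file name is invented for the example:

```shell
mkdir -p /tmp/utils_demo              # mkdir: create a directory
echo "hello unix" > /tmp/utils_demo/note.txt
ls /tmp/utils_demo                    # ls: list the directory's contents
cat /tmp/utils_demo/note.txt          # cat: print a file's contents
chmod 644 /tmp/utils_demo/note.txt    # chmod: set the file's permission bits
rm /tmp/utils_demo/note.txt           # rm: remove the file
rmdir /tmp/utils_demo                 # rmdir: remove the now-empty directory
```

Each utility does one small job, which is the toolbox philosophy described later in the Advantages section.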

Unix Shell

A Unix shell is a command-line interpreter or shell that provides a traditional user interface for the Unix operating system and for Unix-like systems. Users direct the operation of the computer by entering command input as text for a command-line interpreter to execute, or by creating text scripts of one or more such commands.

The most influential Unix shells have been the Bourne shell and the C shell. The Bourne shell, sh, was written by Stephen Bourne at AT&T as the original Unix command-line interpreter; it introduced the basic features common to all the Unix shells, including piping, here documents, command substitution, variables, control structures for condition-testing and looping, and filename wildcarding. The language, including the use of a reversed keyword to mark the end of a block, was influenced by ALGOL 68.[1]
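The features just listed can be sketched in a few lines of Bourne-style sh. All file names and paths below are invented for the example:

```shell
#!/bin/sh
greeting="hello"                      # variable assignment
dir=/tmp/bourne_demo
rm -rf "$dir" && mkdir "$dir"

# Here document: feed literal lines (with variables expanded) to a
# command's standard input.
cat > "$dir/words.txt" <<EOF
$greeting world
$greeting again
EOF

# Command substitution (classic backtick form): capture a command's output.
count=`grep -c hello "$dir/words.txt"`

# Control structure for condition-testing, closed by the reversed
# keyword "fi" -- the ALGOL 68 influence mentioned above.
if [ "$count" -eq 2 ]; then
    echo "found $count greetings"
fi

# Filename wildcarding: *.txt expands to every matching name.
ls "$dir"/*.txt
```

Because the same language drives both interactive sessions and scripts, a pipeline typed at the prompt can be saved verbatim as a reusable script.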

The C shell, csh, was written by Bill Joy while a graduate student at the University of California, Berkeley. The language, including the control structures and the expression grammar, was modeled on C. The C shell also introduced a large number of features for interactive work, including the history and editing mechanisms, aliases, directory stacks, tilde notation, cdpath, job control, and path hashing.

Both shells have been used as a coding base and model for many derivative and work-alike shells with extended feature sets.


 

Concept

In its most generic sense, the term shell means any program that users employ to type commands. In the Unix operating system, users may select which shell to use for interactive sessions. When the user logs in to the system, the shell program is automatically executed. Many types of shells have been developed for this purpose. The program is called a "shell" because it hides the details of the underlying operating system behind the shell's interface. The shell manages the technical details of the operating system kernel interface, which is the lowest-level, or 'innermost', component of an operating system. Similarly, graphical user interfaces for Unix, such as GNOME, KDE, and Xfce, can be called visual shells or graphical shells.

By itself, the term shell is usually associated with the command line. In Unix, users who want to use a different syntax for typing commands can specify a different program as their shell, though in practice this usually requires administrator rights.

The Unix shell was unusual when it was first created. Since it is both an interactive command language as well as a scripting programming language, it is used by Unix as the facility to control (see shell script) the execution of the system.

Many shells created for other operating systems offer rough equivalents to Unix shell functionality.

On systems using a windowing system, some users may never use the shell directly. On Unix systems, the

shell is still the implementation language of system startup scripts, including the program that starts the

windowing system, the programs that facilitate access to the Internet, and many other essential

functions.

Many users of a Unix system still find a modern command-line shell more convenient for many tasks than any GUI application.

Due to the recent movement in favor of free and open-source software, most Unix shells have at least one version that is distributed under an open-source or free-software license.

Bourne shell


The Bourne shell was one of the major shells used in early versions of the Unix operating system and became a de facto standard. It was written by Stephen Bourne at Bell Labs and was first distributed with Version 7 Unix, circa 1977. Every Unix-like system has at least one shell compatible with the Bourne shell. The Bourne shell program name is sh and it is typically located in the Unix file system hierarchy at /bin/sh. On many systems, however, /bin/sh may be a symbolic link or hard link to a compatible, but more feature-rich, shell than the Bourne shell. The POSIX standard specifies its standard shell as a strict subset of the Korn shell. From a user's perspective, the Bourne shell was immediately recognized when active by its characteristic default command-line prompt character, the dollar sign ($).
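This can be checked directly on a live system. The listing's output varies by system (on many Linux distributions /bin/sh points at dash or bash), so this is a sketch rather than a guaranteed result:

```shell
# Show what /bin/sh actually is; on many systems the long listing
# reveals a symbolic link to another Bourne-compatible shell.
ls -l /bin/sh

# Whatever it links to, it must still accept Bourne-style syntax:
/bin/sh -c 'msg=ok; if [ "$msg" = ok ]; then echo "Bourne-compatible"; fi'
```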

C shell

The C shell was developed by Bill Joy for the Berkeley Software Distribution, a line of Unix operating systems derived from Unix

and developed at the University of California, Berkeley. It was originally derived from the 6th Edition Unix shell (Thompson

shell). Its syntax is modeled after the C programming language. It is used primarily for interactive terminal use, but less

frequently for scripting and operating system control. C shell has many interactive commands. 

 Advantages Of Unix 

Full multitasking with protected memory. Multiple users can each run multiple programs at the same time without interfering with each other or crashing the system.

Very efficient virtual memory, so many programs can run with a modest amount of physical memory.

Access controls and security. All users must be authenticated by a valid account and password to use the system at 

all. All files are owned by particular accounts. The owner can decide whether others have read or write access to his

files.
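For example, the owner of a file can restrict or grant access with chmod and inspect the resulting permission bits with ls -l. The file name below is invented for the example:

```shell
echo "private notes" > /tmp/owner_demo.txt

chmod 600 /tmp/owner_demo.txt   # owner: read+write; group and others: nothing
ls -l /tmp/owner_demo.txt       # mode column shows -rw-------

chmod 644 /tmp/owner_demo.txt   # grant read-only access to group and others
ls -l /tmp/owner_demo.txt       # mode column now shows -rw-r--r--
```

The first column of the listing encodes the owner/group/other permissions that the paragraph above describes.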

A rich set of small commands and utilities that do specific tasks well -- not cluttered up with lots of special options.

Unix is a well-stocked toolbox, not a giant do-it-all Swiss Army Knife.

A lean kernel that does the basics for you but doesn't get in the way when you try to do the unusual.

Available on a wide variety of machines - the most truly portable operating system.

Optimized for program development, and thus for the unusual circumstances that are the rule in research.

Disadvantages Of Unix 

The traditional command line shell interface is user hostile -- designed for the programmer, not the casual user.

Commands often have cryptic names and give very little response to tell the user what they are doing. Much use of 

special keyboard characters - little typos have unexpected results.

To use Unix well, you need to understand some of the main design features. Its power comes from knowing how to

make commands and programs interact with each other, not just from treating each as a fixed black box. 

Case study 4 - APPLE MAC OS

Mac OS is a series of graphical user interface-based operating systems developed by Apple Inc. (formerly Apple

Computer, Inc.) for their Macintosh line of computer systems. The Macintosh user experience is credited with

popularizing the graphical user interface. The original form of what Apple would later name the "Mac OS" was the

integral and unnamed system software first introduced in 1984 with the original Macintosh, usually referred to simply

as the System software.

Apple deliberately downplayed the existence of the operating system in the early years of the Macintosh to

help make the machine appear more user-friendly and to distance it from other operating systems such as MS-DOS,

which was more arcane and technically challenging. Much of this early system software was held in ROM, with


updates typically provided free of charge by Apple dealers on floppy disk. As increasing disk storage capacity and

performance gradually eliminated the need for storing much of the advanced GUI operating system in the ROM,

 Apple explored clones while positioning major operating system upgrades as separate revenue-generating products,

first with System 7.1 and System 7.5, then with Mac OS 7.6 in 1997.

Early versions of the Mac OS were compatible only with Motorola 68000-based Macintoshes. As Apple introduced

computers with PowerPC hardware, the OS was ported to support this architecture as well. Mac OS 8.1 was the last

version that could run on a 68000-class processor (the 68040). Mac OS X, which has superseded the "Classic" Mac

OS, is compatible with both PowerPC and Intel processors through to version 10.5 ("Leopard"). Version 10.6 ("Snow

Leopard") supports only Intel processors. 

Mac OS can be divided into two families:

The Mac OS Classic family, which was based on Apple's own code

The Mac OS X operating system, a successor to the Mac OS Classic family, developed from NeXTSTEP, which was UNIX-based.

"Classic" Mac OS (1984-2001)


Original 1984 Macintosh desktop

The "classic" Mac OS is characterized by its total lack of a command line; it is a completely graphical operating system. Noted for its ease of use and its cooperative multitasking, it was criticized for its very limited memory management, lack of protected memory, and susceptibility to conflicts among operating system "extensions" that provide additional functionality (such as networking) or support for a particular device. Some extensions may not work properly together, or may work only when loaded in a particular order. Troubleshooting Mac OS extensions could be a time-consuming process of trial and error.


The Macintosh originally used the Macintosh File System (MFS), a flat file system with only one level of 

folders. This was quickly replaced in 1985 by the Hierarchical File System (HFS), which had a

true directory tree. Both file systems are otherwise compatible.

Most file systems used with DOS, Unix, or other operating systems treat a file as simply a sequence of bytes, requiring an application to know which bytes represent what type of information. By contrast, MFS and HFS give files two different "forks". The data fork contains the same sort of information as other file systems, such as the text of a document or the bitmaps of an image file. The resource fork contains other structured data such as menu definitions, graphics, sounds, or code segments. A file might consist only of resources with an empty data fork, or only a data fork with no resource fork. A text file could contain its text in the data fork and styling information in the resource fork, so that an application that doesn't recognize the styling information can still read the raw text. On the other hand, these forks provide a challenge to interoperability with other operating systems; copying a file from a Mac to a non-Mac system strips it of its resource fork, necessitating such encoding schemes as BinHex and MacBinary.

PowerPC versions of Mac OS X up to and including Mac OS X v10.4 Tiger (support for Classic was

dropped by Apple with Leopard's release and it is no longer included) include a compatibility layer for

running older Mac applications, the Classic Environment. This runs a full copy of the older Mac OS, version

9.1 or later, in a Mac OS X process. PowerPC-based Macs shipped with Mac OS 9.2 as well as Mac OS X.

Mac OS 9.2 had to be installed by the user; it was not installed by default on hardware revisions released after the release of Mac OS X 10.4 Tiger. Most well-written "classic" applications function

properly under this environment, but compatibility is only assured if the software was written to be

unaware of the actual hardware, and to interact solely with the operating system. The Classic

Environment is not available on Intel-based Macintosh systems due to the incompatibility of  Mac OS

9 with the x86 hardware.

Users of the classic Mac OS generally upgraded to Mac OS X, but many criticized it as being more

difficult and less user-friendly than the original Mac OS, for the lack of certain features that had not

been re-implemented in the new OS, or for being slower on the same hardware (especially older

hardware), or other, sometimes serious incompatibilities with the older OS. Because drivers (for

printers, scanners, tablets, etc.) written for the older Mac OS are not compatible with Mac OS X, and

due to the lack of Mac OS X support for older Apple machines, a significant number of Macintosh users

have still continued using the older classic Mac OS.

In June 2005, Steve Jobs announced at the Worldwide Developers Conference keynote that Apple computers

would be transitioning from PowerPC to Intel processors and thus dropping compatibility on new

machines for Mac OS Classic. At the same conference, Jobs announced Developer Transition Kits that

included beta versions of Apple software including Mac OS X that developers could use to test their

applications as they ported them to run on Intel-powered Macs. In January 2006, Apple released the

first Macintosh computers with Intel processors, an iMac and the MacBook Pro, and in February 2006,

Apple released a Mac mini with an Intel Core Solo and Duo processor. On May 16, 2006, Apple released

the MacBook, before completing the Intel transition on August 7 with the Mac Pro. To ease the transition


for early buyers of the new machines, Intel-based Macs include an emulation technology called Rosetta,

which allows them to run Mac OS X software that was compiled for PowerPC-based Macintoshes.

Rosetta runs transparently, creating a user experience identical to running the software on a PowerPC

machine, though execution is typically slower than with native code.

Mac OS X

Mac OS X is the newest of Apple Inc.'s Mac OS line of operating systems. Although it is officially

designated as simply "version 10" of the Mac OS, it has a history largely independent of the earlier Mac

OS releases.

The operating system is the successor to Mac OS 9 and the "classic" Mac OS. It is a Unix operating system, based on the NeXTSTEP operating system and the Mach kernel, which Apple acquired after purchasing NeXT Computer, with its CEO Steve Jobs returning to Apple at that time. Mac OS X also makes use of the BSD code base. There have been six significant releases of the client version, the most recent being Mac OS X 10.6, referred to as Snow Leopard. At Apple's October 20, 2010 "Back to the Mac" event, Mac OS X 10.7 Lion was previewed, showing improvements and additions including a Mac App Store.

As well as the client versions, Mac OS X has also had six significant releases as a server version, called Mac OS X Server. The first of these, Mac OS X Server 1.0, was released in beta in 1999. The server versions are architecturally identical to the client versions, with the differentiation found in their inclusion of tools for server management, including tools for managing Mac OS X-based workgroups, mail servers, and web servers, amongst other tools. It is currently the default operating system for the Xserve server hardware, and an optional feature on the Mac Mini, as well as being installable on most other Macs. Unlike the client version, Mac OS X Server can be run in a virtual machine using emulation software such as Parallels Desktop.

Mac OS X is also the basis for iOS (previously iPhone OS), used on Apple's iPhone, iPod Touch, and iPad, as well as being the basis for the operating system used in the Apple TV.

Advantages of Apple Computers 

Security. Apple computers are much more secure than Windows PCs. Viruses, adware and malware designed for a Windows-based processor simply will not run on a Mac. The main reason for this is simply that there are far fewer Macs in use. People who make malicious software want it to spread to as many people as possible, and designing a virus for a Mac simply will not accomplish this.

Reliability. The people who make Mac software are the same people who make the Mac hardware. While Windows programmers have to take into account nearly infinite variations in hardware, Mac OS is designed to be used on a very limited range of computers: those built by Apple. The software is designed specifically to run on the hardware. This means that the operating system is much more stable.

Macs are convenient. Do you own an iPod? An iPhone? Do you listen to music with iTunes?

Do you appreciate how simple, elegant and easy to use these products are? Apple applies all of 

the things you like about these products to their computers.

Yes, there will be a time of adjustment to using Mac OS, but once you are used to it, you will love it. 

Advanced technology. The current generation of MacBook Pros features LED-backlit monitors, a multi-touch trackpad, and a CNC-machined aluminum case. Desktop Mac Pros feature up to 8 processor cores (yes, eight) and up to 4 TB of storage (TB = terabyte). That's four thousand (4000) GB.

A Mac can do anything a Windows PC can. With Apple computers now using Intel processors, they are able to run Windows. With programs like Boot Camp or software emulation, you are able to install and run Windows on your Mac and switch between the two OSs easily. Macs come with a Mail utility with the same functionality as Microsoft Outlook. The iWork line of software comes with Pages, Keynote, and Numbers, which can function as replacements for Word, PowerPoint, and Excel. Each of these programs can export files in a format that can be used in Windows software. Or, if you don't want to try iWork, Microsoft offers a Mac-compatible version of Office.

Disadvantages of Mac Computers 

They're only more secure because fewer people use them. If everybody used Macs, there would be a lot more viruses and malware for them. If Apple computers became more popular, they would become less secure.

Cannot be upgraded/customized. There are upgrade options when you buy a Mac, but unlike a

Windows PC, you cannot mix and match components. The easiest way to upgrade a Mac is to buy a new

one.


Price. Macs are very expensive. Even the cheapest laptop costs $999.99. Windows machines cost as

little as $500.

Playing games requires Windows. There are very few games available for the Mac. If you are a

gamer, a Mac is probably not the best choice. You could run Bootcamp, but if you are going to spend

most of your time on a Mac running Windows, you have to ask yourself if it is worth it. Another option (and

the one I have taken) is to play games on a video game console, such as a PS3 or Wii, and simply use

your computer for computing.

It requires adjustment. It's a Windows-based world. Most people are used to using Windows. Changing to

a Mac requires that you get used to a number of differences. Some people are simply not willing to make

that change.