
New method of long distance communication


Summary: A general article on how video conferencing works and how it is being implemented.

Doctors can use it to assess the best treatment for patients with life-threatening injuries who live miles away from a hospital. Teachers on all academic levels can use it to hold classes and to bring the world -- in the form of an exhibit at the Musee des Beaux Arts in Brussels, an archaeological dig in Tibet, or a class of South African third graders, to name a few possibilities -- before students' very eyes. A 12-year-old can use it to communicate with his grandmother in Indiana, and a bank executive can use it to unroll a new marketing strategy to her counterparts at the bank's Berlin, Buenos Aires, London and Singapore offices. It is video conferencing, and industry representatives project that it will be a $5 billion industry by 2002.
Video conferencing is real-time audio and visual interaction between two or more parties in distant locations. There are a number of technologies on the market now that enable such communication, and they vary greatly in cost, capability and the quality of the transmitted image.

How it works
Video conferencing systems today primarily transmit audio and visual communication one of two ways: Either through digital phone lines on an integrated services digital network (ISDN) or through a local area network (LAN) via the Internet. Henry Valentino, president of the Virginia-based Video Phone Store LLC, said that analog video phone technology, which utilizes standard telephone lines, is "basically dying."
"Everyone's been preaching about the explosion of video phone technology since AT&T introduced it in the '60s. The industry is finally taking off now because of cable modems and high-speed connection services," he added.
It is widely -- although not universally -- expected that the ISDN-based systems, which comply with industry standards known as H.320, will eventually be left in the dust of trail-blazing Internet systems, whose standards are called H.323.
James Whitlock, associate director of computing services at the University at Buffalo, said that ISDN connections will in the not-too-distant future be more problematic and limited than Internet access. "We can presume that the world will have Internet connectivity; we cannot presume that the world will have ISDN," he said.
Until those higher-speed connection services are refined, however, ISDN access will produce better image resolution for most users.
The necessary elements of any video conferencing system are:
A video capture card, or CODEC (for COmpression DECompression) card
A video camera
A microphone
Video conferencing software
And a service provider (in addition to a computer to house the system).
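The compression/decompression job that the CODEC card performs in hardware can be illustrated in miniature in software. The sketch below is only an analogy, not a real video codec: it uses Python's general-purpose zlib to show the lossless round trip and the bandwidth saving that make transmitting frames over a phone line or LAN feasible.

```python
import zlib

def encode_frame(raw_frame: bytes) -> bytes:
    """Compress a raw frame before transmission (toy stand-in for a CODEC)."""
    return zlib.compress(raw_frame)

def decode_frame(compressed: bytes) -> bytes:
    """Decompress a received frame back to its raw form."""
    return zlib.decompress(compressed)

# A fake "frame": repetitive pixel data compresses well, much as
# largely static conference video does.
frame = b"\x80\x80\x80" * 10_000
wire = encode_frame(frame)

assert decode_frame(wire) == frame  # lossless round trip
print(f"raw: {len(frame)} bytes, on the wire: {len(wire)} bytes")
```

Real conferencing codecs are lossy and exploit similarity between successive frames, which is why they achieve far better ratios than a general-purpose compressor.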
Consumers today can choose from desktop PC-based video conferencing systems and components or systems designed for group conferences in rooms specially fitted for such purposes.
Some video conferencing systems can support application sharing so that architects in different locations, for example, can simultaneously alter and collaborate on CAD drawings. Satellite downlinks are another feature available to those with greater technological resources.

Applications
Not ready to commit to such an investment just yet? You can test-drive the technology at several "public rooms" in Western New York, where consumers can rent the rooms and equipment to hold point-to-point or multipoint video conferences with anyone on the globe who can access a similar facility. Consumers can deal directly with public video conferencing rooms or with broker firms that establish affiliate locations all over the globe. Brokers like Proximity, which has more than 2,000 locations worldwide, and Affinity, with more than 800 sites, will conduct research to determine how clients can reach personal or business contacts in other locations. Most public video conferencing sites offer clients accommodations, including catering and videotaping to enhance the comfort and productivity of their sessions.
Francesca Mesiah, director of sales and client services at one such venue, the Advanced Training Center on Oak Street in Buffalo, described a conference during which doctors from around the world assembled at the center (and other international locations) to watch a video transmission of live heart surgery being performed at Buffalo General Hospital. ATC was then transformed into a medical laboratory as the doctors put the surgical techniques they had learned into practice on pigs' hearts within the facility's classrooms.
Another local video conferencing facility, recruitment firm MRI Sales Consultants of Buffalo Inc., uses the technology primarily as a medium for job interviews. General Manager Bob Artis said that corporations can save hundreds of dollars on travel and accommodation expenses for each prospective employee by conducting interviews through video conferencing.
WNED-TV also offers public video conferencing locally, as does Ronco Communications & Electronics Inc. Ronco's Kathleen Hardy said that her company serves a wide variety of clients who use video conferencing for legal applications, including depositions and expert testimony, as well as business, medical and educational purposes.
The broad potential for distance learning through video conferencing is of particular interest to instructors and technicians at the University at Buffalo. Lisa Stephens, associate director of distance learning, coordinates video conferences using the university's three classroom-based systems. The university also has at least three mobile desktop units that are used for smaller conferences, Stephens said.
The possibilities created by evolving video conferencing technologies are truly limitless, and will affect every aspect of our lives in the near future. Charles Rutstein, an analyst with Forrester Research, expects that we will see widespread use of video conferencing technologies in approximately five years. But even a self-described video conferencing "zealot" reminds us that there are situations in which old-fashioned interpersonal communication will never be obsolete.
"Sometimes people who are communicating still need to smell the fear in negotiating situations, to press the flesh," said James Whitlock, associate director of computing services at UB. "These technologies won't replace the need for travel or the need for live person-to-person communication. But they will supplement them and increase the effectiveness of our travel and our use of the telephone."

Source: © 1999 American City Business Journals Inc.

iPhone vs. OpenMoko


We have all seen the new Apple iPhone that was just announced at Macworld. Now have a look at the new OpenMoko open-source phone, which has some similarly sexy lines. The specifications sound impressive; what do you think?
Have a look at the presentation (PDF).
“Until now, mobile platforms have been proprietary and scattered. With the release of OpenMoko, which is based on the latest Linux open source efforts, developers now have an easy way to create applications and deliver services that span all users and provide a common “look and feel”. OpenMoko also offers common storage models and libraries for application developers, making writing applications for mobile phones fun and easy while guaranteeing swift proliferation of a wide range of applications for mobile phones. With such extremely high quality open frameworks, developers will be armed with exactly the tools they need to revolutionize the mobile industry.”

Future Mobile


Not sure if this Nokia concept cell phone will ever be for sale, but if it is, I will be first in line for it. See Sci Fi Tech for more details.
“The “Nokia Open” is/would be a cell phone that opens like a fan with a “scrollable touch screen,” which seems to be an essential-yet-nonexistent item that would need to be invented in order for this to work. The idea is that with the push of a button the thin phone opens up, revealing a spacious screen on which buttons and menu options appear for you to manipulate with your digits. All well and good, but a cell phone that appears before you on the wings of a magical eagle would be cool too, though I’m not expecting Nokia to start marketing it anytime soon.”

Nice Rebuild


Clock On Hard Disk

Latest AMD Release


AMD developed the Personal Internet Communicator (PIC) reference design as part of its "50x15 Initiative," which aims to equip 50 percent of the world's population with affordable Internet access and computing capability by the year 2015. It runs a "customized" version of Windows CE, and includes a minimal set of applications.
When the PIC reference design debuted in the fall of 2004, AMD touted a target end-user device price point of $185, including a keyboard, mouse, and preinstalled software for basic personal computing and internet/email access; for $249, a monitor would be included. To meet such aggressive price targets, the PIC is designed for minimal cost, much like a consumer audio/video appliance. It is not internally expandable, and includes a minimum set of interfaces, according to AMD.


AMAZING CPUs


some Amazing CPUs (YOU MUST SEE THIS)

iPhone


iPhone combines three products — a revolutionary mobile phone, a widescreen iPod with touch controls, and a breakthrough Internet communications device with desktop-class email, web browsing, maps, and searching — into one small and lightweight handheld device. iPhone also introduces an entirely new user interface based on a large multi-touch display and pioneering new software, letting you control everything with just your fingers. So it ushers in an era of software power and sophistication never before seen in a mobile device, completely redefining what you can do on a mobile phone.

Intel In Mac



Now every new Mac ships with an Intel processor. Experience delightful responsiveness from the smallest Mac mini to the most beefed-up Mac Pro. Use one of more than 3,000 universal applications that take full advantage of the Intel chip. Run programs from your PowerPC-based Mac in translation. Powered by Intel chips, your new Mac will do all those things that only Macs can do — and do so at an astonishing level of performance.

The new Mac core
Every Mac uses a chip based on Intel Core technology, the next generation in processor design from the world’s leading chip maker. The result of massive R&D effort involving thousands of engineers. An entire collection of revolutions shrunk into an unimaginably small space, consuming less energy, too. Two cores work together to share resources, and are designed to conserve power when their functions aren’t required. Whether in an ultra-sleek MacBook, or workstation class Mac Pro, Intel Core technology lets you get more power with less power.

Four on the floor
And that means pure creative exhilaration with four 64-bit cores inside the new Mac Pro. The Core-based Intel Xeon is so power-efficient that Apple engineers were able to remove the liquid cooling system from the previous PowerPC-based model. Which means you can load up the Mac Pro with more cards, more hard drives, more memory. So you can do more with Final Cut Studio, Aperture, Logic Pro, and the growing number of universal applications for creative professionals.

Dual-roar
The Intel Core Duo is actually two processors (cores) engineered onto a single chip, offering virtually twice the computational power of a traditional single processor in the same space. With two cores tightly integrated, increased L2 cache, and a host of engineering breakthroughs, the Intel Core Duo delivers higher performance for all the things you do -- from enhancing the family photos to rendering special effects for a feature film.
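The benefit of two cores only appears when software actually splits its work. A minimal sketch (not Apple's or Intel's code; the workload is invented for illustration) using Python's standard library to fan a CPU-bound job out across two worker processes, one per core:

```python
from concurrent.futures import ProcessPoolExecutor

def sum_of_squares(bounds):
    """CPU-bound work item: sum of squares over a half-open range."""
    lo, hi = bounds
    return sum(i * i for i in range(lo, hi))

if __name__ == "__main__":
    n = 1_000_000
    # Split the job in half, one chunk per worker, mirroring a dual-core chip.
    halves = [(0, n // 2), (n // 2, n)]
    with ProcessPoolExecutor(max_workers=2) as pool:
        total = sum(pool.map(sum_of_squares, halves))
    assert total == sum(i * i for i in range(n))
```

An application written as a single sequential task sees little gain from the second core; the "virtually twice" figure assumes work that divides cleanly like this.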

IT Today


As with other industrial processes, commercial IT has moved in all respects from a custom, craft-based industry, where the product was tailored to fit the customer, to multi-use components taken off the shelf to find the best fit in any situation. Mass production has greatly reduced costs, and IT is available to the smallest company, one-man band or school-kid.
LEO was hardware tailored for a single client. Today, Intel Pentium and compatible chips are standard and become parts of other components which are combined as needed. One notable change was the freeing of computers and removable storage from protected, air-filtered environments. Microsoft and IBM at various times have been influential enough to impose order on IT, and the resultant standardisations allowed specialist software to flourish.
Software is available off the shelf: apart from Microsoft Office or IBM Lotus Notes, there are also specialist packages for payroll and personnel management, account maintenance and customer management, to name a few. These are highly specialised and intricate components of larger environments, but they rely upon common conventions and interfaces.
Data storage has also standardised. Relational databases are developed by different suppliers to common formats and conventions. Common file formats can be shared by large mainframes and desktop personal computers, allowing online, real-time input and validation.
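That standardisation can be seen in miniature with SQLite, a relational engine that ships with Python; the same SQL conventions would be accepted, with minor dialect differences, by any supplier's database. The table and figures below are invented for illustration.

```python
import sqlite3

# An in-memory relational database, queried with standard SQL.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE payroll (employee TEXT, salary REAL)")
conn.executemany(
    "INSERT INTO payroll VALUES (?, ?)",
    [("Ada", 52_000.0), ("Grace", 61_500.0)],
)

# Online, real-time query -- no overnight batch run required.
(total,) = conn.execute("SELECT SUM(salary) FROM payroll").fetchone()
print(f"total payroll: {total}")  # 113500.0
conn.close()
```

The point is not SQLite itself but the convention: an off-the-shelf engine plus a common query language replaces the one-off data management utilities each organisation once wrote for itself.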
In parallel, software development has fragmented. There are still specialist technicians, but these increasingly use standardised methodologies where outcomes are predictable and accessible. At the other end of the scale, any office manager can dabble in spreadsheets or databases and obtain acceptable results (but there are risks).

History


The first commercial business computer was developed in the United Kingdom in 1951, by the Joe Lyons catering organization. This was known as the 'Lyons Electronic Office' - or LEO for short. It was developed further and used widely during the 1960s and early 1970s. (Joe Lyons formed a separate company to develop the LEO computers and this subsequently merged to form English Electric Leo Marconi and then International Computers Ltd.)
Early commercial systems were installed exclusively by large organizations. These could afford to invest the time and capital necessary to purchase hardware, hire specialist staff to develop custom software and work through the consequent (and often unexpected) organizational and cultural changes.
At first, individual organizations developed their own software, including data management utilities, themselves. Different products might also have 'one-off' custom software. This fragmented approach led to duplicated effort, and the production of management information required manual work.
High hardware costs and relatively slow processing speeds forced developers to use resources 'efficiently'. Data storage formats were heavily compacted, for example. A common example is the removal of the century from dates, which eventually led to the 'millennium bug'.
Data input required intermediate processing via punched paper tape or card and separate input to computers, usually for overnight processing. Data required validation in batches. All of this was a repetitive, labour intensive task, removed from user control and error-prone. Invalid or incorrect data needed correction and resubmission with consequences for data and account reconciliation.
Data storage was strictly serial, on paper tape and later on magnetic tape: holding data in readily accessible memory was not cost-effective.
Results would be presented to users on paper. Enquiries were delayed by whatever turn-round was available.

Definition


Information Technology (IT), also known as Information and Communication(s) Technology (ICT) and as Infocomm in Asia, is concerned with the use of technology in managing and processing information, especially in large organizations.
In particular, IT deals with the use of electronic computers and computer software to convert, store, protect, process, transmit, and retrieve information. For that reason, computer professionals are often called IT specialists/consultants or Business Process Consultants, and the division of a company or university that deals with software technology is often called the IT department. Other names for the latter are information services (IS), management information services (MIS), or managed service providers (MSP).