…must charge money, or sell advertisements. Unfortunately, the revenues from advertising on the Net have fallen dramatically in the last few years. So if you put a price tag on your content, how much should you charge? Most independent electronic publishers charge a few dollars for their titles, anywhere from $1 to about $5 or $7 per e-book. These relatively low prices reflect the desire to attract a large pool of customers. They also reflect a belief common among readers: since electronic content involves no printing or transport costs, the publisher should set a lower price…

 

Q. As you see it, is the Internet merely another content distribution channel, or is there more to it than that? The hype about synergy and collapsing barriers to entry has largely evaporated, together with the fortunes of the likes of AOL Time Warner. Is the Internet a revolution, or barely an evolution?

 

A. In the beginning, the Internet was a revolution. Email brought the people of our Earth closer together. The Net enabled telecommuting, and now as much as 10% of the world works at home via computer and Internet. The Internet makes it possible for artists to publish their own books, music, videos and websites. Video conferencing has enabled conversations without the limitations of space. The Internet has made vast amounts of information available to students and researchers at the click of a mouse. And 24/7 access and the ease of ordering products have stimulated online commerce, as well as sales at retail stores.

 

But it is not a cure-all. And now that the Net is part of our everyday lives, it is subject to the same cycles of media hype, as well as to social, emotional, and business factors. Things will never be the same, and the changes have just begun. The present generation has never known a world without computers. When they reach working age, they will be much more inclined to use the Net for the majority of their reading and entertainment needs. Then e-books will truly take hold and become ubiquitous. Between now and then, we have work to do, building the foundation of this remarkable industry.

 

WEB TECHNOLOGIES AND TRENDS

Bright Planet, Deep Web

By: Sam Vaknin

 

www.allwatchers.com and www.allreaders.com are web sites in the sense that a file is downloaded to the user’s browser when he or she surfs to these addresses. But that is where the similarity ends. These web pages are front-ends, gates to underlying databases. The databases contain records regarding the plots, themes, characters and other features of, respectively, movies and books. Every user query generates a unique web page whose contents are determined by the query parameters. The number of singular pages that can thus be generated is mind-boggling. Search engines operate on the same principle: vary the search parameters slightly and totally new pages are generated. It is a dynamic, user-responsive and chimerical sort of web.
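The mechanics are easy to sketch. The fragment below is a minimal illustration of the principle, not the actual code behind any of the sites named above: the database file, table and column names are all hypothetical, and a real site would sit behind a web server rather than a bare function.

```python
from urllib.parse import parse_qs
import sqlite3

def render_results_page(query_string: str, db_path: str = "books.db") -> str:
    """Build an HTML page on the fly from the query parameters.

    Hypothetical schema: a 'books' table with 'title', 'plot' and
    'theme' columns. Nothing here matches any real site's internals.
    """
    params = parse_qs(query_string)
    theme = params.get("theme", ["any"])[0]

    conn = sqlite3.connect(db_path)
    rows = conn.execute(
        "SELECT title, plot FROM books WHERE theme = ?", (theme,)
    ).fetchall()
    conn.close()

    # The page is assembled from database records at request time;
    # it is never stored anywhere as a file.
    items = "".join(f"<li>{title}: {plot}</li>" for title, plot in rows)
    return f"<html><body><h1>Theme: {theme}</h1><ul>{items}</ul></body></html>"
```

Calling render_results_page("theme=revenge") and render_results_page("theme=redemption") yields two different pages, neither of which existed until the moment it was requested; that is why a crawler that only follows static links will never see them.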

 

These are good examples of what www.brightplanet.com calls the “Deep Web” (previously, and inaccurately, described as the “Unknown” or “Invisible” Internet). BrightPlanet believes that the Deep Web is 500 times the size of the “Surface Internet” (a portion of which is spidered by traditional search engines). This translates to c. 7,500 terabytes of data (versus 19 terabytes in the whole known web, excluding the databases of the search engines themselves), or 550 billion documents organized in 100,000 deep web sites. By comparison, Google, the most comprehensive search engine ever, stores 1.4 billion documents in its immense caches at www.google.com. The natural inclination to dismiss these pages as mere re-arrangements of the same information is wrong. Actually, this underground ocean of covert intelligence is often more valuable than the information freely available or easily accessible on the surface. Hence the ability of c. 5% of these databases to charge their users subscription and membership fees. The average deep web site receives 50% more traffic than a typical surface site and is far more often linked to by other sites. Yet it is invisible to classic search engines and little known to the surfing public.

 

It was only a question of time before someone came up with a search technology to tap these depths (www.completeplanet.com).

 

LexiBot, in the words of its inventors, is…

 

“…the first and only search technology capable of identifying, retrieving, qualifying, classifying and organizing ‘deep’ and ‘surface’ content from the World Wide Web. The LexiBot allows searchers to dive deep and explore hidden data from multiple sources simultaneously using directed queries. Businesses, researchers and consumers now have access to the most valuable and hard-to-find information on the Web and can retrieve it with pinpoint accuracy.”

 

The LexiBot places dozens of queries, in dozens of threads, simultaneously, and spiders the results (rather as a “first generation” search engine would do). This could prove very useful with massive databases such as the human genome, weather patterns, simulations of nuclear explosions, thematic multi-featured databases, intelligent agents (e.g., shopping bots) and third-generation search engines. It could also have implications for the wireless Internet (for instance, in analysing and generating location-specific advertising) and for e-commerce (which amounts to the dynamic serving of web documents).
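The pattern itself, stripped of LexiBot’s proprietary ranking and classification, is straightforward to sketch: fan a directed query out to many database front-ends at once and collect the generated pages for spidering. The sketch below assumes a hypothetical list of front-ends and a ‘q’ query parameter; it illustrates the general pattern the paragraph describes, not LexiBot’s code.

```python
from concurrent.futures import ThreadPoolExecutor
import urllib.request
from urllib.parse import urlencode

# Hypothetical deep-web front-ends; a real crawler would load
# thousands of these from a curated source list.
SOURCES = [
    "https://movie-db.example/search",
    "https://book-db.example/search",
    "https://weather-archive.example/search",
]

def directed_query(base_url: str, term: str) -> str:
    """Send one directed query and return the page it generates;
    that page exists only as a response to this specific query."""
    url = f"{base_url}?{urlencode({'q': term})}"
    with urllib.request.urlopen(url, timeout=30) as resp:
        return resp.read().decode("utf-8", errors="replace")

def search_deep_web(term: str) -> list[str]:
    # Dozens of queries in dozens of threads, as described above; the
    # returned pages can then be spidered like ordinary documents.
    with ThreadPoolExecutor(max_workers=24) as pool:
        return list(pool.map(lambda u: directed_query(u, term), SOURCES))
```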

 

This transition from the static to the dynamic, from the given to the generated, from the one-dimensionally linked to the multi-dimensionally hyperlinked, from deterministic content to contingent, heuristically created and uncertain content, is the real revolution and the future of the web. Search engines have lost their efficacy as gateways. Portals have taken over, but most people now use internal links (within the same web site) to get from one place to another. This is where the deep web comes in. Databases are about internal links. Hitherto they existed in splendid isolation, universes closed to all but the most persistent and knowledgeable. This may be about to change. The flood of quality, relevant information this change will unleash will dwarf anything that preceded it.

 

The Seamless Internet

By: Sam Vaknin

 

http://www.enfish.com/

 

The hype over ubiquitous, or pervasive, computing (computers everywhere) has masked a potentially more momentous development: the convergence of computing-device interfaces with web (and other) content. Years ago, after Bill Gates overcame his misplaced scepticism, Microsoft introduced its “Internet-ready” applications. Its word processing software (“Word”), the other Office applications, and the Windows operating system handle both “local” documents (resident on the user’s computer) and web pages smoothly and seamlessly. The transition between the desktop or laptop interface and the web is now effortlessly transparent.

 

The introduction of e-book readers and MP3 players has blurred the anachronistic distinction between hardware and software. Common speech reflects this fact: when we say “e-book”, we mean both the device and the content we access on it. As technologies such as digital ink and printable integrated circuits mature, hardware and software will complete their inevitable merger.

 

This erasure of boundaries has led to the emergence of knowledge management solutions and personal and shared workspaces. The LOCATION of a document (one’s own computer, a colleague’s PDA, or a web page) has become irrelevant. The NATURE of the document (e-mail message, text file, video snippet, soundbite) is equally unimportant. The SOURCE of the document (its extension, which tells us with which software it was created and can be read) is increasingly meaningless. Universal languages (such as Java) allow devices and applications to talk to each other. What matters is accessibility, and logical, user-friendly workflows.

 

Enter Enfish. In its own words, it provides:

 

“…Personalized portal solution linking personal and corporate knowledge with relevant information from the Internet, …live-in desktop environment providing co-branding and customization opportunities on and offline, a unique, private communication channel to users that can be used also for eBusiness solutions, …Knowledge Management solution that requires no user set-up or configuration.”

 

The principle is simple enough, but the experience is liberating (try their online Flash demo). Suddenly, instead of juggling dozens of windows, a single interface gives the tortured user (that’s me) access to all his applications: e-mail, contacts, documents, the company’s intranet or network, the web, and OPCs (other people’s computers, other networks, other intranets). There is only a single screen, and it is dynamically and automatically updated to respond to the changing information needs of the user.

 

“The power underlying Enfish Onespace is its patented DEX ‘engine.’ This technology creates a master, cross-referenced index of the contents of a user’s email, documents and Internet information.

The Enfish engine then uses this master index as a basis to understand what is relevant to a user, and to provide them with appropriate information. In this manner Enfish Onespace ‘personalizes’ the Internet for each user, automatically connecting relevant information and services from the Internet with the user’s desktop information.

 

As an example, by clicking on a person or company, Enfish Onespace automatically assembles a page that brings together related emails, documents, contact information, appointments, news and relevant news headlines from the Internet. This is accomplished without the user working to find and organize this information. By having everything in one place and in context, our users are more informed and better prepared to perform tasks such as handling a phone call or preparing for a business meeting. This results in … benefits in productivity and efficiency.”
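The quoted description boils down to a familiar data structure: a cross-referenced index keyed by entity (person or company) rather than by keyword. The toy below merely illustrates that idea; it is not Enfish’s patented DEX engine, and the Item fields and substring-match “entity extraction” are stand-ins for what a real product would do far more cleverly.

```python
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class Item:
    kind: str    # "email", "document", "appointment", "news"
    source: str  # where it lives: local path, intranet URL, web URL
    text: str

def build_index(items: list[Item], entities: list[str]) -> dict[str, list[Item]]:
    """Cross-reference every item under each person or company it
    mentions. Entity extraction is reduced to a substring match here."""
    index: dict[str, list[Item]] = defaultdict(list)
    for item in items:
        for entity in entities:
            if entity.lower() in item.text.lower():
                index[entity].append(item)
    return index

def assemble_page(index: dict[str, list[Item]], entity: str) -> str:
    """Gather everything about one entity, regardless of where it is
    stored or what kind of document it is (the 'one place' effect)."""
    lines = [f"== {entity} =="]
    for item in index.get(entity, []):
        lines.append(f"[{item.kind}] ({item.source}) {item.text[:60]}")
    return "\n".join(lines)
```

Clicking on a contact in such a system amounts to calling assemble_page with that contact’s name: the email, the meeting and the news story surface together because they share an index key, not because the user filed them together.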

 

It is, indeed, addictive. The inevitable advent of transparent computing (smart houses, smart cards, smart clothes, smart appliances, wireless Internet), coupled with the single-GUI (graphical user interface) approach, could spell a revolution in our habits. Information will be available to us anywhere, through an identical screen, communicated instantly and accurately from device to device, from one appliance to another, and from one location to the next as we move. The underlying software and hardware will become as arcane and mysterious as ASCII and assembly language are to the average computer user today. It will be a real partnership of biological and artificial intelligence on the move.

 

The Polyglottal Internet

By: Sam Vaknin

 

http://www.everymail.com/

The Internet started off as a purely American phenomenon and seemed to perpetuate the fast-emerging dominance of the English language. A negligible minority of web sites were in other languages. Software applications were chauvinistically ill-prepared (and still are) to deal with anything but English. And the vast majority of net users were residents of the two North American colossi, chiefly the USA.

All this started to change rapidly about two years ago. Early this year, the number of American users of the Net was surpassed by the swelling tide of European and Japanese ones. Non-English web sites are proliferating as well. The advent of the wireless Internet, more widespread outside the USA, is likely to strengthen this unmistakable trend. By 2005, some analysts expect non-English speakers to make up as much as 70% of all netizens. This fragmentation of a hitherto unprecedentedly homogeneous market presents both opportunities and costs. It is much more expensive to market in ten languages than in one. Everything, from e-mail to supply chains, has to be re-tooled or customized.

It is easy to translate text in cyberspace. Various automated, web-based, free applications (such as Babylon or Travlang) cater to the needs of the casual user who is not too particular about the quality of the end result. Virtually every search engine, portal and directory offers access to these or similar services.

But straightforward translation is only one kind of solution to the Tower of Babel that the Internet is bound to become. Enter WordWalla. A while back I used its multilingual email application. It converted text I typed on a virtual keyboard into images of the characters, so my addressees received the message in any language I selected, with no fonts or language support needed on their end. It was more than cool. It was liberating. A minimal sketch of the idea follows.
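The technique described above, rendering a message as an image so the recipient needs nothing installed to read it, is easy to approximate today with Pillow. This is an illustration of the general idea, not WordWalla’s implementation; the font path and the sizes are assumptions.

```python
from PIL import Image, ImageDraw, ImageFont

def message_as_image(text: str, font_path: str, size: int = 24) -> Image.Image:
    """Render a message as a bitmap. Only the SENDER needs the font;
    any mail client that can show an image can display the message."""
    font = ImageFont.truetype(font_path, size)  # assumed local font file
    # Measure the rendered text to size the canvas.
    probe = ImageDraw.Draw(Image.new("RGB", (1, 1)))
    left, top, right, bottom = probe.textbbox((0, 0), text, font=font)
    img = Image.new("RGB", (right - left + 20, bottom - top + 20), "white")
    ImageDraw.Draw(img).text((10 - left, 10 - top), text, font=font, fill="black")
    return img

# e.g. message_as_image("こんにちは! שלום!", "NotoSans-Regular.ttf").save("msg.png")
```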

In the same vein, WordWalla’s software allows application and content developers to work in 66 languages. In its own words:

“WordWalla allows device manufacturers…
