Friday, July 31, 2009

Barney Google and Snuffy Smith

Sunday page with Billy DeBeck strips from September 6, 1936

Barney Google and Snuffy Smith, originally Barney Google, is a long-running American comic strip created in 1919 by Billy DeBeck. The strip inspired the popular 1923 song, "Barney Google (With the Goo-Goo-Googly Eyes)," with lyrics by Billy Rose.
Contents

* 1 Characters and story
* 2 Etymology of "Google"
* 3 Films
o 3.1 1920s
o 3.2 1930s
o 3.3 1940s
o 3.4 1960s
o 3.5 DVD
* 4 References
* 5 External links

Characters and story

When the strip began, its title character, a little fellow with big eyes, was a sportsman involved in horse racing and boxing. In 1922, the strip took a huge turn in popularity with the addition of a race horse named Spark Plug, a nag who seldom raced and was typically seen almost totally covered by his horse blanket.

In 1934, an even greater change took place when Barney and the horse visited the North Carolina mountains and met a moonshiner named Snuffy Smith. The strip increasingly focused on stereotypical humor about the hillbillies of Southern Appalachia with Snuffy as the main character. Locals in the strip are extremely suspicious of any outsiders, referred to as "flatlanders" or even worse, "revenooers" (Federal Revenue agents). Snuffy was so popular that his name was added to the strip's title in the late 1930s, and Barney Google himself virtually disappeared after the 1950s.

The strip first appeared in the sports section of the Chicago Herald and Examiner as Take Barney Google, F'rinstance. By October 1919, the strip was syndicated by King Features and was published in newspapers across the country. Fred Lasswell, DeBeck's longtime assistant, took over Barney Google in 1942. Lasswell drew the strip until his death on March 3, 2001. John Rose, who inked the strip for Lasswell, draws the comic today. In 1963 Lasswell won both the National Cartoonists Society's Humor Comic Strip Award and its Reuben Award.

Barney Google appears in 21 countries and 11 languages. It is credited with introducing several slang phrases, including "sweet mama," "horsefeathers," "heebie-jeebies" and "hotsy-totsy" (meaning just right, perfect). In 1995, the strip was one of 20 included in the Comic Strip Classics series of commemorative US postage stamps.

Peanuts artist Charles M. Schulz was known to his friends as Sparky, a nickname given to him at birth by his uncle as a diminutive of Barney Google's Spark Plug.[1]

Etymology of "Google"

Although it is often stated that the comic strip was not the inspiration for the name of the Google search engine, a linkage is evident and can be traced in a simple fashion: Following "The Goo-Goo Song" (1900), the word "Google" was introduced in 1913 in The Google Book, a children's book about the Google and other fanciful creatures who live in Googleland.[2] Thus aware of the word's appeal, DeBeck launched his comic strip six years later, and the "goo-goo-googly" lyrics in the 1923 song "Barney Google" focused attention on the novelty of the word. When the mathematician and Columbia University professor Edward Kasner was challenged in the late 1930s to devise a name for a very large number, he asked his nine-year-old nephew, Milton Sirotta, to suggest a word. The youth told Kasner to use "Google" at a time when the comic strip was at a peak of popularity. Kasner agreed and in 1940 he introduced the words "googol" and "googolplex" in his book, Mathematics and the Imagination. Milton Sirotta died in 1980.[3][4] This is the term that Larry Page and Sergey Brin had in mind when they named their company in 1998, but they misspelled "googol" as "google," bringing it full circle right back to Billy DeBeck.[5][6]
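For reference, the two quantities Kasner named can be written in modern notation as:

    \[
      \mathrm{googol} = 10^{100},
      \qquad
      \mathrm{googolplex} = 10^{\mathrm{googol}} = 10^{10^{100}}
    \]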
Barney Google (February 5, 1931)

Films

1920s

Beginning with Horsefeathers (1928), Barney Hellum portrayed Barney Google in a series of live-action short films, featuring Philip Davis as Spark Plug. There were at least 11 films in the series, ending with Slide, Sparky, Slide (1929).

1930s

An animated Barney Google film series was produced in the mid-1930s by the Charles Mintz Screen Gems studio. Mintz made only four Barney Google cartoons, two released in 1935 and two in 1936.

1940s

Two live-action feature films with actor Bud Duncan portraying Snuffy Smith were made in 1942: Private Snuffy Smith and Hillbilly Blitzkrieg. Cliff Nazarro appeared as Barney in Hillbilly Blitzkrieg.[7]

1960s

In 1963-64 Paramount created 50 six-minute episodes of Snuffy Smith and Barney Google based on the comic strip.

DVD

In 2006, BCI Eclipse released 20 episodes of the 1963 cartoon as part of the two-disc DVD set Animated All Stars (BCI 46952).[8]
Barney Google (left) and Snuffy Smith

* The Master
* Snuffy Runs the Gamut
* Pie in the Sky
* Barney's Blarney
* Barney Deals the Cars
* The Country Club Smiths
* Ain't it the Tooth
* Jughaid the Magician
* Glove thy Neighbor
* Jughaid's Jumpin' Frog
* Getting Snuffy's Goat
* Rip Van Snuffy
* Just Plain Kinfolk
* Keeping up with the Joneses
* Settin' and a-Frettin'
* Jughaid for President
* Little Red Jughaid
* My Kingdom for a Horse
* The Tourist Trap
* It's Better to Give (Christmas Show)

All 50 episodes are available on the fourth DVD of the Advantage Cartoon Mega Pack.

Usenet

A diagram of some Usenet servers and clients. The blue, green, and red dots on the servers represent which groups they carry. Arrows between servers indicate that the servers are sharing the articles from the groups. Arrows between computers and servers indicate that the user is subscribed to a certain group, and uploads and downloads articles to and from that server.

Usenet, a portmanteau of "user" and "network", is a worldwide distributed Internet discussion system. It evolved from the general-purpose UUCP dial-up network architecture.

Duke University graduate students Tom Truscott and Jim Ellis conceived the idea in 1979.[1] Users read and post public messages (called articles or posts, and collectively termed news) to one or more categories, known as newsgroups. Usenet resembles a bulletin board system (BBS) in many respects and is a precursor to the Internet forums widely used today; it can be regarded, superficially, as a hybrid between e-mail and web forums. As with web forums and BBSes, modern newsreader software presents discussions as threads, although posts are stored on the server sequentially.

One notable difference between a BBS or web forum and Usenet is the absence of a central server and dedicated administrator. Usenet is distributed among a large, constantly changing conglomeration of servers that store and forward messages to one another, loosely connected in a variable mesh. The arrangement resembles a city's street network: there are multiple routes between any two points, so if one route is blocked, traffic simply takes another. In this way the User Network, or Usenet, allows newsgroup postings to reach their many destinations quickly and reliably. Individual users may read messages from and post messages to a local server operated by their Internet service provider, university or employer. The servers then exchange the messages among one another, making them available to readers beyond the original server. Users can read Usenet newsgroups through websites such as Google Groups or with dedicated client programs called newsreaders.
Contents

* 1 Introduction
* 2 ISPs, news servers, and newsfeeds
o 2.1 Newsreader clients
o 2.2 Moderated and unmoderated newsgroups
o 2.3 Technical details
o 2.4 Organization
o 2.5 Binary content
+ 2.5.1 Binary retention time
+ 2.5.2 Legal issues
* 3 History
o 3.1 Internet jargon and history
o 3.2 Archives and Web interfaces
+ 3.2.1 Google Groups / DejaNews
o 3.3 Public USENET Servers
* 4 See also
o 4.1 Usenet terms
o 4.2 Usenet history
o 4.3 Usenet administrators
o 4.4 Usenet celebrities
* 5 References
o 5.1 Further reading
* 6 External links

Introduction

Usenet is one of the oldest computer network communications systems still in widespread use. It was conceived in 1979 and publicly established in 1980 at the University of North Carolina at Chapel Hill and Duke University,[1] over a decade before the World Wide Web was developed and the general public got access to the Internet. It was originally built on the "poor man's ARPANET," employing UUCP as its transport protocol to offer mail and file transfers, as well as announcements through the newly developed news software. The name USENET emphasized its creators' hope that the USENIX organization would take an active role in its operation (Daniel et al., 1980).

The articles that users post to Usenet are organized into topical categories called newsgroups, which are themselves logically organized into hierarchies of subjects. For instance, sci.math and sci.physics are within the sci hierarchy, for science. When a user subscribes to a newsgroup, the news client software keeps track of which articles that user has read.[citation needed]

In most newsgroups, the majority of the articles are responses to some other article. The set of articles which can be traced to one single non-reply article is called a thread. Most modern newsreaders display the articles arranged into threads and subthreads, making it easy to follow a single discussion in a high-volume newsgroup.[citation needed]
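As an illustration of how a newsreader can rebuild threads, the sketch below groups articles by the last message-ID in each article's References header (its immediate parent). This is a minimal Python example with made-up message-IDs, not code from any particular newsreader.

    # Minimal sketch: group articles into threads using the last entry of each
    # article's References header (the immediate parent). Sample data is invented.
    from collections import defaultdict

    articles = {
        # message-id: (subject, references listed oldest ancestor first)
        "<1@example.org>": ("Installing INN", []),
        "<2@example.org>": ("Re: Installing INN", ["<1@example.org>"]),
        "<3@example.org>": ("Re: Installing INN", ["<1@example.org>", "<2@example.org>"]),
    }

    children = defaultdict(list)
    roots = []
    for msg_id, (subject, refs) in articles.items():
        if refs and refs[-1] in articles:
            children[refs[-1]].append(msg_id)   # a reply: attach to its parent
        else:
            roots.append(msg_id)                # a non-reply article starts a thread

    def show(msg_id, depth=0):
        print("  " * depth + articles[msg_id][0])
        for child in children[msg_id]:
            show(child, depth + 1)

    for root in roots:
        show(root)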

When a user posts an article, it is initially only available on that user's news server. Each news server, however, talks to one or more other servers (its "newsfeeds") and exchanges articles with them. In this fashion, the article is copied from server to server and (if all goes well) eventually reaches every server in the network. The later peer-to-peer networks operate on a similar principle; but for Usenet it is normally the sender, rather than the receiver, who initiates transfers. Some have noted that this seems an inefficient protocol in the era of abundant high-speed network access. Usenet was designed for a time when networks were much slower, and not always available. Many sites on the original Usenet network would connect only once or twice a day to batch-transfer messages in and out.[citation needed]

Usenet has significant cultural importance in the networked world, having given rise to, or popularized, many widely recognized concepts and terms such as "FAQ" and "spam".[2]

Today, almost all Usenet traffic is carried over the Internet. The current format and transmission of Usenet articles is very similar to that of Internet e-mail messages. However, Usenet articles are posted for general consumption; any Usenet user has access to all newsgroups, unlike email, which requires a list of known recipients.[citation needed]

Today, Usenet has diminished in importance with respect to Internet forums, blogs and mailing lists. It differs in that it requires no personal registration with the group concerned; information need not be stored on a remote server; archives are always available; and reading the messages requires a news client (included in many modern e-mail clients) rather than a mail or web client.[citation needed]

ISPs, news servers, and newsfeeds

Many Internet service providers, and many other Internet sites, operate news servers for their users to access. ISPs that do not operate their own servers directly will often offer their users an account from another provider that specifically operates newsfeeds. Usually the ISP will get a kickback for referring the customer to the Usenet provider. In early news implementations, the server and newsreader were a single program suite, running on the same system. Today, one uses separate newsreader client software, a program that resembles an email client but accesses Usenet servers instead.

Not all ISPs run news servers. A news server is one of the most difficult Internet services to administer well because of the large amount of data involved, small customer base (compared to mainstream Internet services such as email and web access), and a disproportionately high volume of customer support incidents (frequently complaining of missing news articles that are not the ISP's fault). Some ISPs outsource news operation to specialist sites, which will usually appear to a user as though the ISP ran the server itself. Many sites carry a restricted newsfeed, with a limited number of newsgroups. Commonly omitted from such a newsfeed are foreign-language newsgroups and the alt.binaries hierarchy which largely carries software, music, videos and images, and accounts for over 99 percent of article data.

For those who have access to the Internet but not to a news server, Google Groups allows reading and posting of text newsgroups via the World Wide Web. Though these "news-to-Web gateways" are not always as easy to use as specialized newsreader software, especially when threads get long, they are often much easier to search. Users who lack access to an ISP news server can use Google Groups to access the alt.free.newsservers newsgroup, which has information about open news servers.

There are also Usenet providers that specialize in offering service to users whose ISPs do not carry news, or that carry a restricted feed.

See also news server operation for an overview of how news systems are implemented.

Newsreader clients

Newsreader clients are available for all major operating systems and come in all shapes and sizes. Mail clients or "communication suites" also now commonly have an integrated newsreader. Often, however, these integrated clients are of low quality, e.g., incorrectly implementing Usenet protocols, standards and conventions. Many of these integrated clients, for example the one in Microsoft's Outlook Express, are disliked by purists because of their misbehavior.[3]

Newsgroups are typically accessed with special client software that connects to a news server. With the rise of the World Wide Web, web front ends have become more common. They have made Usenet more accessible by lowering the technical entry barrier: a user needs only a web browser and no Usenet server account. Google Groups[4] is one of the most popular web-based front ends, and browsers such as Firefox can access Google Groups directly via news: protocol links.[5] Numerous other websites now offer web-based gateways to Usenet groups, although some people have begun filtering messages made through some of the web interfaces for one reason or another.[6][7]

Moderated and unmoderated newsgroups

A minority of newsgroups are moderated: messages submitted by readers are not distributed to Usenet directly, but instead are emailed to the moderators of the newsgroup for approval. Each moderated newsgroup has a set of rules called a charter, and the moderators' job is to ensure that the messages readers see in the newsgroup conform to it. Typically, moderators are appointed in the proposal for the newsgroup, and changes of moderators follow a succession plan.

The job of the moderator is to receive submitted articles, review them, and inject approved articles so that they can be properly propagated worldwide. Such articles must bear the Approved: header line.

Unmoderated newsgroups form the majority of Usenet newsgroups, and messages submitted by readers for unmoderated newsgroups are immediately propagated for everyone to see.

Creation of moderated newsgroups often becomes a hot subject of controversy, raising issues regarding censorship and the desire of a subset of users to form an intentional community.

Technical details

Usenet is a set of protocols for generating, storing and retrieving news "articles" (which resemble Internet mail messages) and for exchanging them among a readership which is potentially widely distributed. These protocols most commonly use a flooding algorithm which propagates copies throughout a network of participating servers. Whenever a message reaches a server, that server forwards the message to all its network neighbors that haven't yet seen the article. Only one copy of a message is stored per server, and each server makes it available on demand to the (typically local) readers able to access that server. The collection of Usenet servers thus has a certain peer-to-peer character in that they share resources by exchanging them; however, the granularity of exchange is on a different scale than in a modern peer-to-peer system, and this characteristic excludes the actual users of the system, who connect to the news servers with a typical client-server application, much like an email reader.
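The following toy Python sketch illustrates the flooding idea: each server keeps one copy of an article, indexed by message-ID, and forwards it only to neighbours that have not yet seen it. The server names and topology are invented for the example and do not correspond to any real news software.

    # Toy flooding simulation: propagate one article through a mesh of peers.
    peers = {
        "news.alpha.example": ["news.beta.example", "news.gamma.example"],
        "news.beta.example":  ["news.alpha.example", "news.gamma.example"],
        "news.gamma.example": ["news.alpha.example", "news.beta.example", "news.delta.example"],
        "news.delta.example": ["news.gamma.example"],
    }
    spool = {name: set() for name in peers}          # message-IDs stored per server

    def flood(server, message_id):
        if message_id in spool[server]:              # already seen: do not forward again
            return
        spool[server].add(message_id)                # keep exactly one local copy
        for neighbour in peers[server]:
            flood(neighbour, message_id)             # offer it to every peer

    flood("news.alpha.example", "<post-1@news.alpha.example>")
    print(sum("<post-1@news.alpha.example>" in s for s in spool.values()))  # all 4 servers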

RFC 850 was the first formal specification of the messages exchanged by Usenet servers. It was superseded by RFC 1036.

In cases where unsuitable content has been posted, Usenet has support for automated removal of a posting from the whole network by creating a cancel message, although due to a lack of authentication and resultant abuse, this capability is frequently disabled. Copyright holders may still request the manual deletion of infringing material using the provisions of World Intellectual Property Organization treaty implementations, such as the U.S. Online Copyright Infringement Liability Limitation Act.

On the Internet, Usenet is transported via the Network News Transfer Protocol (NNTP) on TCP port 119 for standard, unprotected connections and on TCP port 563 for SSL-encrypted connections, which are offered by only a few sites.
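On the client side, the protocol can be exercised with Python's standard nntplib module (present in older Python releases; removed in 3.13). The host name below is a placeholder, and the group must exist on whichever server is used; this is a sketch, not a description of any particular provider.

    # Minimal NNTP client sketch using Python's standard nntplib module.
    # "news.example.com" is a placeholder; substitute a server you have access to.
    import nntplib

    with nntplib.NNTP("news.example.com", 119) as news:    # port 563 via nntplib.NNTP_SSL
        resp, count, first, last, name = news.group("comp.lang.python")
        print(f"{name}: {count} articles ({first}-{last})")
        # Fetch overview lines (subject, author, date) for the ten newest articles.
        resp, overviews = news.over((last - 10, last))
        for number, fields in overviews:
            print(number, fields.get("subject", ""))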

Organization
The "Big Nine" hierarchies of Usenet.

The major set of worldwide newsgroups is contained within nine hierarchies, eight of which are operated under consensual guidelines that govern their administration and naming. The current "Big Eight" are:

* comp.*: computer-related discussions (comp.software, comp.sys.amiga)
* humanities.*: Fine arts, literature, and philosophy (humanities.classics, humanities.design.misc)
* misc.*: Miscellaneous topics (misc.education, misc.forsale, misc.kids)
* news.*: Discussions and announcements about news (meaning Usenet, not current events) (news.groups, news.admin)
* rec.*: Recreation and entertainment (rec.music, rec.arts.movies)
* sci.*: Science related discussions (sci.psychology, sci.research)
* soc.*: Social discussions (soc.college.org, soc.culture.african)
* talk.*: Talk about various controversial topics (talk.religion, talk.politics, talk.origins)

(Note: the asterisks are wildmat wildcard patterns; example groups follow in parentheses.)
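Wildmat patterns behave much like shell globs, so as a rough illustration, Python's fnmatch module (similar to, though not identical with, wildmat) can match group names against a hierarchy pattern:

    # Rough illustration of hierarchy wildcards using fnmatch, whose glob syntax
    # is similar (though not identical) to NNTP's wildmat patterns.
    from fnmatch import fnmatch

    groups = ["comp.software", "comp.sys.amiga", "rec.arts.movies", "sci.physics"]
    print([g for g in groups if fnmatch(g, "comp.*")])   # ['comp.software', 'comp.sys.amiga']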

See also the Great Renaming.

The alt.* hierarchy is not subject to the procedures controlling groups in the Big Eight, and as a result it is less organized. However, groups in the alt.* hierarchy tend to be more specialized or specific; for example, there might be a newsgroup under the Big Eight that covers children's books in general, while a group in the alt hierarchy may be dedicated to one specific author of children's books. Binaries are posted in alt.binaries.*, making it the largest of all the hierarchies.

Many other hierarchies of newsgroups are distributed alongside these. Regional and language-specific hierarchies such as japan.*, malta.* and ne.* serve specific regions such as Japan, Malta and New England. Companies such as Microsoft administer their own hierarchies to discuss their products and offer community technical support. Some users prefer to use the term "Usenet" to refer only to the Big Eight hierarchies; others include alt as well. The more general term "netnews" incorporates the entire medium, including private organizational news systems.

Binary content
A visual example of the many complex steps required to prepare data to be uploaded to usenet newsgroups. These steps must be done again in reverse to download data from usenet.

Usenet was originally created to distribute text content encoded in the 7-bit ASCII character set. With the help of programs that encode 8-bit values into ASCII, it became practical to distribute binary files as content. Binary posts, due to their size and often-dubious copyright status, were in time restricted to specific newsgroups, making it easier for administrators to allow or disallow the traffic.

The oldest widely used encoding method is uuencode, from the Unix UUCP package. In the late 1980s, Usenet articles were often limited to 60,000 characters, and larger hard limits exist today. Files are therefore commonly split into sections that require reassembly by the reader.
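As a small sketch of the classic encoding, Python's standard binascii module still exposes the uuencode line format. The file name and splitting policy here are illustrative; real posting agents differ in how they label and split parts across articles.

    # Sketch: uuencode a small binary payload into the classic 45-byte-per-line format.
    import binascii

    payload = bytes(range(256)) * 4                  # stand-in for a binary file
    lines = ["begin 644 example.bin"]
    for offset in range(0, len(payload), 45):        # uuencode works on 45-byte chunks
        chunk = payload[offset:offset + 45]
        lines.append(binascii.b2a_uu(chunk).decode("ascii").rstrip("\n"))
    lines += ["`", "end"]                            # zero-length line, then the trailer
    print("\n".join(lines[:3]))                      # header plus the first two encoded lines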

With MIME header extensions and the Base64 and Quoted-Printable encodings, there came a new generation of binary transport. In practice, MIME has seen increased adoption in text messages, but it is avoided for most binary attachments. Some operating systems that attach metadata to files use specialized encoding formats. For Mac OS, both BinHex and special MIME types are used.

Other lesser known encoding systems that may have been used at one time were BTOA, XX encoding, BOO, and USR encoding.

In an attempt to reduce file transfer times, an informal file encoding known as yEnc was introduced in 2001. It achieves about a 30% reduction in data transferred by assuming that most 8-bit characters can safely be transferred across the network without first encoding into the 7-bit ASCII space.
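A minimal Python sketch of the core yEnc byte transformation described above; it omits the =ybegin/=yend wrapper lines, line wrapping, and the CRC that real yEnc posts include.

    # Core yEnc byte transformation: add 42 modulo 256, escaping only the few
    # characters that are unsafe in an article body (NUL, LF, CR and '=').
    CRITICAL = {0x00, 0x0A, 0x0D, 0x3D}

    def yenc_encode(data: bytes) -> bytes:
        out = bytearray()
        for byte in data:
            value = (byte + 42) % 256
            if value in CRITICAL:
                out += b"=" + bytes([(value + 64) % 256])   # escape with '=' and +64
            else:
                out.append(value)
        return bytes(out)

    print(len(yenc_encode(bytes(range(256)))))   # only a handful of bytes get escaped

Because only four byte values need escaping, the encoded output is only slightly larger than the input, which is where the roughly 30% saving over uuencode and Base64 comes from.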

The standard method of uploading binary content to Usenet is to first archive the files into RAR archives (for large files usually in 15 MB, 50 MB or 100 MB parts) then create Parchive files. Parity files are used to recreate missing data. This is needed often, as not every part of the files reaches a server. These are all then encoded into yEnc and uploaded to the selected binary groups.

Binary retention time
This is a list of the 30 biggest groups on Giganews on March 3rd, 2008, and is an example of the massive retention capabilities of a commercial usenet server.

Each newsgroup is generally allocated a certain amount of storage space for post content. When this storage has been filled, each time a new post arrives, old posts are deleted to make room for the new content. If the network bandwidth available to a server is high but the storage allocation is small, it is possible for a huge flood of incoming content to overflow the allocation and push out everything that was in the group before it. If the flood is large enough, the beginning of the flood will begin to be deleted even before the last part of the flood has been posted.

Binary newsgroups are only able to function reliably if there is sufficient storage allocated to a group to allow readers enough time to download all parts of a binary posting before it is flushed out of the group's storage allocation. This was at one time how posting of undesired content was countered: the newsgroup would be flooded with random garbage posts in sufficient quantity to push out all the content to be suppressed. Providers have since compensated by allocating enough storage to retain everything posted each day, including such spam floods, without deleting anything.

The average length of time that posts are able to stay in the group before being deleted is commonly called the retention time. Generally the larger usenet servers have enough capacity to archive several weeks of binary content even when flooded with new data at the maximum daily speed available. A good binaries service provider must not only accommodate users of fast connections (3 megabit) but also users of slow connections (256 kilobit or less) who need more time to download content over a period of several days or weeks.
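The retention arithmetic is simple division. The sketch below uses invented figures (a 500 TB binary spool and a 3 TB/day feed, roughly the daily volumes reported for 2007-2008 later in this article) together with the 256 kbit/s reader mentioned above.

    # Back-of-the-envelope retention and download-time estimates.
    # The spool size and feed rate below are illustrative, not real provider figures.
    spool_tb = 500            # storage allocated to binaries (assumed)
    feed_tb_per_day = 3.0     # incoming binary feed (roughly 2007-08 volumes)
    retention_days = spool_tb / feed_tb_per_day
    print(f"retention: about {retention_days:.0f} days")

    # Time for a slow reader to pull a 4.5 GB posting at 256 kbit/s.
    posting_bits = 4.5 * 8 * 10**9
    seconds = posting_bits / 256e3
    print(f"download at 256 kbit/s: about {seconds / 3600:.0f} hours")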

Legal issues

While binary newsgroups can be used to distribute completely legal user-created works, open-source software, and public domain material, some binary groups are used to illegally distribute vast quantities of commercial software, copyrighted media, and pornography, the last of which has its own legal implications in some countries.[citation needed]

For example, some binary groups such as alt.binaries.warez.* exist solely for the illegal distribution of commercial software.[8]

ISP-operated usenet servers frequently block access to all alt.binaries.* groups to both reduce their network traffic and to avoid all the related legal issues. Commercial usenet service providers claim to operate as a telecommunications service, and assert that they are not responsible for the user-posted binary content transferred via their equipment. In the United States, usenet providers can qualify for protection under the DMCA Safe Harbor regulations, provided that they establish a mechanism to comply with and respond to takedown notices from copyright holders.[9]

Removal of copyrighted content from the entire usenet network is a nearly impossible task, due to the rapid propagation between servers and the retention done by each server. Petitioning a usenet provider for removal only removes it from that one server's retention cache, but not any others. It is possible for a special post cancellation message to be distributed to remove it from all servers, but many providers ignore cancel messages by standard policy, because they can be easily falsified and submitted by anyone.[10][11] For a takedown petition to be most effective across the whole network, it would have to be issued to the origin server while the content has been posted there but not yet propagated to other servers. Removal of the content at this early stage would prevent further propagation, but with modern high speed links, content can be propagated as fast as it arrives, allowing no time for content review and takedown issuance by copyright holders.[citation needed]

Establishing the identity of the person posting illegal content is equally difficult due to the trust-based design of the network. Like SMTP email, servers generally assume the header and origin information in a post is true and accurate. However, as in SMTP email, usenet post headers are easily falsified so as to obscure the true identity and location of the message source.[12] In this manner, usenet is significantly different from modern P2P services; most P2P users distributing content are typically immediately identifiable to all other users by their network address, but the origin information for a usenet posting can be completely obscured and unobtainable once it has propagated past the origin server.[citation needed]

Also unlike modern P2P services, the identity of the downloaders is hidden from view. On P2P services a downloader is identifiable to all others by their network address. On usenet, the downloader connects directly to a server, and only the server knows the address of who is connecting to it. Some Usenet providers do keep usage logs, but this logging information is not casually available to outside parties like the RIAA.[citation needed]

History

UUCP/USENET Logical Map — June 1, 1981 / mods by S. McGeady 11/19/81

(ucbvax)
+=+===================================+==+
| | | |
| | wivax | |
| | | | |
| | microsoft| uiucdcs | |
| | genradbo | | | | | | (Tektronix)
| | | | | | | purdue | |
| decvax+===+=+====+=+=+ | | | |
| | | | | | | pur-phy | | tekmdp
| | | | | | | | | | |
+@@@@@@cca | | | | | | | | |
| | | | +=pur-ee=+=+=====+===+ | |
| csin | | | | | |
| | +==o===+===================+==+========+=======+====teklabs=+
| | | |
| | | pdp phs grumpy wolfvax |
| | | | | | | |
| | cincy unc=+===+======+========+ |
| | | bio | |
| | | (Misc) | | (Misc) |
| | | sii reed | dukgeri duke34 utzoo |
| | | | | | | | | |
| +====+=+=+==+====++======+==++===duke=+===+=======+==+=========+ |
| | | | | | | | | | u1100s
| bmd70 ucf-cs ucf | andiron | | | | |
| | | | | | |
| red | | | | | pyuxh
| | | | zeppo | | | |
| psupdp---psuvax | | | | | | |
| | | | alice | whuxlb | utah-cs | | houxf
| allegra | | | | | | | | | |
| | | | | | | | | +--chico---+
| +===+=mhtsa====research | /=+=======harpo=+==+ | |
| | | | | | / | | |
| hocsr | | +=+=============+=/ cbosg---+ | |
| ucbopt | | | | | esquire |
| : | | | cbosgd | |
| : | | | | |
| ucbcory | | eagle==+=====+=====+=====+=====+ | |
| : | | | | | | | | | +-uwvax--+
| : | | | mhuxa mhuxh mhuxj mhuxm mhuxv | |
| : | | | | |
| : | | | +----------------------------o--+
| : | | | | |
| ucbcad | | | ihpss mh135a |
| : | | | | | |
| : \--o--o------ihnss----vax135----cornell |
| : | | | | |
+=+==ucbvax==========+===+==+=+======+=======+=+========+=========+
(UCB) : | | | | (Silicon Valley)
ucbarpa cmevax | | menlo70--hao
: | | | |
ucbonyx | | | sri-unix
| ucsfcgl |
| | |
Legend: | | sytek====+========+
------- | | | |
- | / \ + = Uucp sdcsvax=+=======+=+======+ intelqa zehntel
= "Bus" | | |
o jumps sdcarl phonlab sdcattb
: Berknet
@ Arpanet

UUCP/USENET Logical Map, original by Steven McGeady. Copyright© 1981, 1996
Bruce Jones, Henry Spencer, David Wiseman. Copied with permission from
The Usenet Oldnews Archive: Compilation.[13]

Newsgroup experiments first occurred in 1979. Tom Truscott and Jim Ellis of Duke University came up with the idea as a replacement for a local announcement program, and established a link with nearby University of North Carolina using Bourne shell scripts written by Steve Bellovin. The public release of news was in the form of conventional compiled software, written by Steve Daniel and Truscott.[citation needed]

UUCP networks spread quickly due to the lower costs involved, and the ability to use existing leased lines, X.25 links or even ARPANET connections. By 1983, the number of UUCP hosts had grown to 550, nearly doubling to 940 in 1984.[citation needed]

As the mesh of UUCP hosts rapidly expanded, it became desirable to distinguish the Usenet subset from the overall network. A vote was taken at the 1982 USENIX conference to choose a new name. The name Usenet was retained, but it was established that it only applied to news.[14] The name UUCPNET became the common name for the overall network.

In addition to UUCP, early Usenet traffic was also exchanged with Fidonet and other dial-up BBS networks. Widespread use of Usenet by the BBS community was facilitated by the introduction of UUCP feeds made possible by MS-DOS implementations of UUCP such as UFGATE (UUCP to FidoNet Gateway), FSUUCP and UUPC. The Network News Transfer Protocol, or NNTP, was introduced in 1985 to distribute Usenet articles over TCP/IP as a more flexible alternative to informal Internet transfers of UUCP traffic. Since the Internet boom of the 1990s, almost all Usenet distribution is over NNTP.[citation needed]

Early versions of Usenet used Duke's A News software. At Berkeley an improved version called B News was produced by Matt Glickman and Mark Horton. With a message format that offered compatibility with Internet mail and improved performance, it became the dominant server software. C News, developed by Geoff Collyer and Henry Spencer at the University of Toronto, was comparable to B News in features but offered considerably faster processing. In the early 1990s, InterNetNews by Rich Salz was developed to take advantage of the continuous message flow made possible by NNTP versus the batched store-and-forward design of UUCP. Since that time INN development has continued, and other news server software has also been developed.[citation needed]

Usenet was the initial Internet community and the place for many of the most important public developments in the commercial Internet. It was the place where Tim Berners-Lee announced the launch of the World Wide Web,[15] where Linus Torvalds announced the Linux project,[16] and where Marc Andreessen announced the creation of the Mosaic browser and the introduction of the image tag,[17] which revolutionized the World Wide Web by turning it into a graphical medium.

Web-based archiving of Usenet posts began in 1995 at Deja News with a very large, searchable database. In 2001, this database was acquired by Google.[citation needed]

AOL announced that it would discontinue its integrated Usenet service in early 2005, citing the growing popularity of weblogs, chat forums and on-line conferencing.[18] The AOL community had a tremendous role in popularizing Usenet some 11 years earlier, with all of its positive and negative aspects. This change marked the end of the legendary Eternal September. Others, however, feel that Google Groups, especially with its new user interface, has picked up the torch that AOL has dropped—and that the so-called Eternal September has yet to end.[citation needed]

Over time, the amount of Usenet traffic has steadily increased. Much of this traffic increase reflects not an increase in discrete users or newsgroup discussions, but instead the combination of massive automated spamming and an increase in the use of .binaries newsgroups in which large files (frequently pornography or pirated media) are often posted publicly. A small sampling of the change (measured in feed size per day) follows:[citation needed]
Daily Volume Date Source
4.5 GB 1996-12 Altopia.com
9 GB 1997-07 Altopia.com
12 GB 1998-01 Altopia.com
26 GB 1999-01 Altopia.com
82 GB 2000-01 Altopia.com
181 GB 2001-01 Altopia.com
257 GB 2002-01 Altopia.com
492 GB 2003-01 Altopia.com
969 GB 2004-01 Altopia.com
1.30 TB 2004-09-30 Octanews.net
1.27 TB 2004-11-30 Octanews.net
1.38 TB 2004-12-31 Octanews.net
1.52 TB 2005-01 Altopia.com
1.34 TB 2005-01-01 Octanews.net
1.30 TB 2005-01-01 Newsreader.com
1.67 TB 2005-01-31 Octanews.net
1.63 TB 2005-02-01 Newsreader.com
1.81 TB 2005-02-28 Octanews.net
1.87 TB 2005-03-08 Newsreader.com
2.00 TB 2005-03-11 Various sources
2.27 TB 2006-01 Altopia.com
2.95 TB 2007-01 Altopia.com
3.12 TB 2007-04-21 Usenetserver.com
3.07 TB 2008-01 Altopia.com
3.80 TB 2008-04-16 Newsdemon.com
4.60 TB 2008-11-01 Giganews.com
4.65 TB 2009-01 Altopia.com
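For a sense of scale, the compound annual growth implied by the first and last Altopia.com figures in the table can be computed directly (a rough estimate; the month alignment is approximate):

    # Compound annual growth implied by the Altopia.com figures in the table above.
    start_gb, end_gb = 4.5, 4650          # 1996-12 vs 2009-01 daily volume (GB)
    years = 2009 + 0/12 - (1996 + 11/12)  # roughly 12.1 years between samples
    cagr = (end_gb / start_gb) ** (1 / years) - 1
    print(f"about {cagr:.0%} growth per year")   # on the order of 75-80% per year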

In 2008, Verizon Communications, Time Warner Cable and Sprint Nextel signed an agreement with Attorney General of New York Andrew Cuomo to shut down access to sources of child pornography.[19] Time Warner Cable stopped offering access to Usenet. Verizon reduced its access to the "Big 8" hierarchies. Sprint stopped access to the alt.* hierarchies. AT&T stopped access to the alt.binaries.* hierarchies. Cuomo never specifically named Usenet in his anti-child pornography campaign. David DeJean of PC World said that some worry that the ISPs used Cuomo's campaign as an excuse to end portions of Usenet access, as it is costly for the internet service providers. In 2008 AOL, which no longer offered Usenet access, and the four providers that responded to the Cuomo campaign were the five largest internet service providers in the United States; they had more than 50% of the U.S. ISP marketshare.[20] On June 8, 2009, AT&T announced that it would no longer provide access to the Usenet service as of July 15, 2009.[21]

Internet jargon and history

Many terms now in common use on the Internet—so-called "jargon"—originated or were popularized on Usenet. Likewise, many conflicts which later spread to the rest of the Internet, such as the ongoing difficulties over spamming, began on Usenet.

Archives and Web interfaces

Google Groups / DejaNews
Main article: Google Groups

Google Groups hosts an archive of Usenet posts dating back to May 1981. The earliest posts, dating from 1981 to 1991, were donated to Google by the University of Western Ontario and were originally archived by Henry Spencer.[22] The archive of posts from the 1990s was started by the company DejaNews (later Deja), which was purchased by Google in February 2001. Already during the DejaNews era the archive had become a fixture of Usenet culture, and it remains so today.

The archiving of Usenet led to a fear of loss of privacy.[23] An archive simplifies ways to profile people. This has partly been countered with the introduction of the X-No-Archive: Yes header, which is itself seen as controversial.[citation needed]

Google Groups also allows users to create groups that can only be accessed from Google's own interface, but which look like Usenet groups in search results.

Public USENET Servers

Public USENET servers are NNTP hosts that deliberately accept incoming connections from any IP address free of charge, without requiring any kind of authentication. These sites impose various access limits on their users in order to keep their spam ratio low, but they also strictly protect their clients' privacy.

* Aioe.org - nntp.aioe.org
* Eternal September (previously known as Motzarella) - news.eternal-september.org (no binary groups, requires users to complete a free registration)
* Mixmin.net - drooper.mixmin.net (Posting allowed only through TLS or SSL encrypted connections)
* Bananasplit.info - news.bananasplit.info (Only privacy related groups)

See also

* Comparison of Usenet newsreaders
* List of Usenet newsreaders
* Usenet II

Usenet terms

* Breidbart Index
* Crossposting
* FAQ
* Flaming and flame war
* Flood aka flooders and flooding
* FWAK
* Godwin's Law
* It's always September
* kill file
* list of newsgroups
* MSTing
* scorefile
* Sockpuppet (Internet)
* sporgery
* Troll (Internet)
* Usenet Death Penalty
* Usenet cabal
* Wackyparsing
* X-No-Archive


Usenet history

* Eternal September
* Great Renaming
* Legion of Net. Heroes
* Scientology versus the Internet
* Serdar Argic

Usenet administrators

There are no Usenet "administrators" per se; each server administrator is free to do whatever pleases him or her as long as the end users and peer servers tolerate and accept it. Nevertheless, there are a few famous administrators:

* Chris Lewis
* Gene (Spaf) Spafford
* Henry Spencer
* Kai Puolamäki
* Mark Horton

Usenet celebrities
Main article: Usenet celebrity

References

Mahalo.com

Mahalo.com, Inc.
Type Internet
Genre Search Engine
Founded 2007
Founder Jason Calacanis
Headquarters Santa Monica, California, USA
Key people Jason Calacanis (Founding CEO), Elliot C.R. Cook (COO), Mark Jeffrey (CTO)
Revenue Unknown, Startup May 2007
Employees 20
Slogan "We're here to help."
Website Mahalo.com
Alexa rank ~3,000
Type of site Web directory
Advertising Google AdSense
Available in English
Launched May 30, 2007
Current status beta test

Mahalo.com is a web directory (or human search engine) launched in alpha test in May 2007 by Jason Calacanis. As of January 2008, the project is in beta test. It differentiates itself from algorithmic search engines like Google and Ask.com, as well as from directory sites like DMOZ and Yahoo, by tracking and building hand-crafted result sets for many of the currently popular search terms.[1][2] Mahalo means "thank you" in Hawaiian.
Contents

* 1 Directory
* 2 Search results quality
* 3 Mahalo Daily Video Show
* 4 Mahalo Answers
* 5 Critics
* 6 Ownership and funding
* 7 Traffic and growth
* 8 Competitors
* 9 Notes
* 10 External links

Directory

Mahalo's directory employs human editors to review websites and write search engine results pages that include text listings, as well as other media, such as photos and video. Each Mahalo search results page includes links to the top seven sites, as well as other categorized information, and additional web pages from Google.[2] The company also pays freelancers to create pages for piecework compensation in the Mahalo Greenhouse; the pages are approved by a full-time staff member before appearing in the main index.[3][4]

Mahalo's approach is similar to that employed by Ask.com in 1998. At that time, both Ask.com and Google were up-and-coming search engines.

Mahalo started with the top 4,000 search terms in popular categories like travel, entertainment, cars, food, health care and sports, adding about 500 more terms per week with a goal of covering the top 10,000 by the end of 2007.[1][2] This goal was exceeded when, in December 2007, Mahalo announced that its index had reached 25,000 pages, a year earlier than expected.[5][6]

Mahalo also offers "how to" guides with instructions on popular topics, written in an editorial fashion. For less popular searches, Mahalo delivers results from Google.

Search results quality

Mahalo's goal is to improve search results by eliminating search spam from low-quality websites, such as those that carry excessive advertising, distribute malware, or engage in phishing scams.[7] Webmasters have a vested interest in seeing their sites listed, and Calacanis has said that algorithmic search engines, like Google and Yahoo, suffer from manipulation by search engine optimization practitioners. Mahalo's reliance on human editors is intended to avoid this problem, producing search results that are more relevant to the user.[1]

Mahalo Daily Video Show

Veronica Belmont was hired by Mahalo.com to produce a daily video show for the site. Her first video was an interview with Leeroy Jenkins. Belmont left Mahalo Daily in 2008 to co-host the Revision3 series Tekzilla.

After a month-long search, Belmont's replacement was announced on June 5, 2008. Former cable sports show host Leah D'Emilio won Mahalo Vlog Idol and co-hosted the show with Mahalo.com employee Lon Harris until leaving the show in March 2009.

Mahalo Daily produces a segment every Friday titled "This Week in YouTube." Since D'Emilio's departure, Lon Harris has hosted the show with guest Shira Lazar.

Mahalo Answers

On December 15, 2008, Mahalo launched a new service called Mahalo Answers.[8] The service is similar to Yahoo! Answers in that it allows users to pose questions on a wide variety of subjects, which are then answered by other users. A key difference between the two services is that Mahalo Answers allows questioners to give a monetary reward (called a "tip" on the site) to the user who provides the most helpful response.[9] Tips are paid in "Mahalo dollars," which are bought using PayPal and, once earned, can either be used to tip other users or be cashed in at a 75% exchange rate.

Critics

Jim Lanzone, CEO of Ask.com said, "Just like a lot of people who watch movies think they can be scriptwriters, there are a lot of people who use search engines who think they can build a search engine." Lanzone cited the fact that about 60% of search inquiries to Ask are unique as just one of the challenges of running a search engine.[1] Google claims that 20% to 25% of its search inquiries have never been used before.[2]

At the SMX Conference in June 2007, Google software engineer Matt Cutts explained that while he supports different approaches to search, like Mahalo, it is untrue that humans have nothing to do with Google's search results. As examples of human involvement he cited Google's use of hyperlink analysis, toolbar voting, and user reporting of spam. Cutts suggested that Google would evolve to take advantage of social media.[10]

Ownership and funding

Lead investors in Mahalo.com include Sequoia Capital's Michael Moritz, an early investor in both Google and Yahoo; Elon Musk, founder of PayPal; and News Corporation.[11][12] Other disclosed investors include Dallas Mavericks owner Mark Cuban and AOL chairman Ted Leonsis.[13] Jason Calacanis has said that he has enough funding to run Mahalo for four or five years without making a profit. Mahalo eventually hopes to make a profit by selling ads next to search results.[1]

Traffic and growth

Mahalo has experienced moderate growth since it was launched in May 2007. Mahalo.com traffic increased from roughly ten thousand visitors a month in July 2007 to two million visitors a month in January 2008.[14] In the three-month period ending February 23, 2008, the number of global internet users who visited Mahalo.com rose by fifty percent, and the site is currently ranked by Alexa among the 3,000 most visited websites.[15] On October 22, 2008, Calacanis announced that he was laying off 10 percent of Mahalo's employees (two people) due to the economic downturn. Conflicting reports suggest that the proportion let go was much higher, as much as a third of the staff.[16]

Competitors

Other companies that have created a human-powered search alternative include:

* Wikipedia
* About.com
* FindingDulcinea
* Organized Wisdom

MyDD

MyDD
Created by Jerome Armstrong
Political affiliation progressive/liberal
Website mydd.com

MyDD is a collaborative, politically progressive American politics blog. It was established by Jerome Armstrong in 2001. Its name was originally short for "My Due Diligence." In January 2006, the name was changed to "My Direct Democracy" as part of a site redesign, with the new tagline "Direct Democracy for People-Powered Politics." All of the blog's authors, who currently include Charles Lemos, Jonathan Singer, Todd Beeton and Josh Orton, support the Democratic Party.

History

The first Dean grassroots web site[1] was created at MyDD in April 2002.[2] In early 2003, Joe Trippi learned of Meetup through Armstrong and MyDD.[3] Armstrong shut down MyDD in 2003 to work on Howard Dean's presidential campaign. After lying dormant for a year, MyDD was re-launched on the Scoop blogging platform in March 2004, with blogger Chris Bowers. MyDD was instrumental in the online campaigning and grassroots organizing that elected Howard Dean as Chairman of the Democratic National Committee in January 2005; one account described "the pro-Dean site MyDD.com, which served as a key clearinghouse of information about the race."[4]


Several early contributors to MyDD became prominent in politics on the Internet. Markos Moulitsas Zúniga, founder of the most-visited political blog in the world, Daily Kos, began commenting on MyDD before starting his own blog in May 2002, and refers to MyDD as his "blogfather".

Armstrong attended the California State Democratic convention in Sacramento in March, 2003 with Markos Moulitsas of Daily Kos. According to Instapundit, they may have been the first bloggers to be officially accredited at a political convention.

Mathew Gross, creator of the blog on Howard Dean’s web-site, was another contributor to MyDD. Joe Trippi, former campaign manager for Howard Dean, met and hired Gross based on Gross' involvement with MyDD.[5] "One day, soon after we'd moved to a larger quarters in a South Burlington office park, I looked up to see this tall young guy with an earring and a nearly shaved head wandering around the office. Security had just grabbed him and was hauling him away when he yelled out to me: 'Wait! I blog on MyDD.com!' This was, of course, the political Web site where I'd first heard about Meetup.com. 'You're hired!' I yelled."

Other notable bloggers on MyDD who went on to work with campaigns in 2005-06 include Matt Stoller with Jon Corzine, Scott Shields with Bob Menendez, and Tim Tagaris with Ned Lamont and Sherrod Brown.

The site garnered a great deal of attention during the 2004 U.S. Presidential Election for being the first source to break the exit polls.[6]

MyDD was profiled in late 2005 as part of the article "Blogging Down the Money Trail" in Campaigns and Elections magazine. The article focused on the special election in Ohio's second congressional district and the ability of blogs like MyDD, Daily Kos, and Swing State Project to raise funds for Democratic candidates and draw national attention to local races. The magazine credits MyDD with being "the first major liberal blog."

In October 2006, before the midterm congressional elections, MyDD's Chris Bowers launched two campaigns on MyDD.com. The first, "Use It or Lose It," prompted safe Democrats to give 30 percent of their campaign funds to other Democratic causes; the second was a Google bomb campaign to raise the search listings of negative news articles about a set of Republican incumbents.

The "Use It or Lose It" campaign calling on bloggers and Democratic activists to pressure Democrats in safe seats to ask them to give at least 30 percent of their campaign accounts to the Democratic Congressional Campaign Committee or directly to Democrats in competitive congressional races (subject to FEC limits)[7]. "Safe" Democrats was defined as those who were either running unopposed by a Republican, or whose Republican opponent raised less than $10,000 (and thus were not considered serious opponents). The lists of such Democrats were pulled from FEC filings. The campaign drew media attention[8] and also brought MoveOn on board with their own page promoting the campaign[9].

Almost immediately after starting the "Use It or Lose It" campaign, Bowers began a Google bomb campaign to raise the search ranking of a set of negative articles about endangered Republican congressional incumbents.[10] The idea was to reach less-informed voters who might use Google to search for information on candidates, most often by simply entering the person's name. By taking advantage of Google's indexing algorithm, having many people link to these articles using the candidates' names would raise the articles' prominence in the search results.

The articles chosen were to be from non-partisan news sources and factually negative about the chosen Republicans.[citation needed] Local news sources were preferred over national news.[citation needed] The chosen list consisted mainly of such sources, but also included some Wikipedia pages.[11] The candidates were culled from an initial list (chosen by Bowers) of 70 down to 52; those cut were ones for whom a suitably credible and negative article could not be found.

The concept drew criticism from conservative bloggers,[12] although the right had used the same tactic against John Kerry in the 2004 election.[13][14] In June 2007, front-pagers Matt Stoller and Chris Bowers left MyDD to found a new political blog, OpenLeft, which went online on July 9, 2007.

Founder Jerome Armstrong was known in the blogosphere for his criticisms of Democratic presidential candidate Barack Obama. "I was rooting that it would come down to Edwards and Clinton -- that to me represents a battle of Democratic values and ideas," said Armstrong. "Obama's candidacy is really just personality-driven, wrapped with quasi-religious overtures."[15] As a result, many supporters of rival candidate Hillary Clinton migrated to MyDD. However, MyDD was one of the few blogs to let frontpage bloggers from all the campaigns post on the front page. Longtime editor Jonathan Singer supported the Illinois senator, and in June 2008, Josh Orton, the Netroots Nation political director and former Online Outreach director for Barack Obama, joined MyDD as a frontpage blogger.[16] Peter Jukes, aka "brit" on MyDD, wrote a recap of the Obama-Clinton primary war on MyDD, titled "My Story: Flaming for Obama," in September 2008 for Prospect Magazine, detailing the combative primary on MyDD.[17]

HTC Dream

HTC Dream
Slogan Amazing Possibilities. Exceptional Experiences.
Manufacturer HTC
Carrier T-Mobile
Available October 22, 2008 (US)
Screen 3.2 in (81 mm) HVGA (480×320) (180 ppi) 65K color capacitive touchscreen
Camera 3.2 megapixel with auto focus
Operating system Android 1.5[1][2]
Input Capacitive touchscreen, sliding QWERTY Keyboard, Trackball
CPU Qualcomm MSM7201A ARM11 @ 528MHz
Default ringtone G1 Mixtape
Memory 192 MB DDR SDRAM 256 MB Flash
Memory card microSD
Networks Quad band GSM / GPRS / EDGE: GSM 850 / 900 / 1800 / 1900
Dual band UMTS / HSDPA / HSUPA: UMTS 1700 / 2100 (US/Europe) (7.2/2 Mbit/s)
Connectivity Bluetooth 2.0, IEEE 802.11 b/g, ExtUSB
Battery 1150 mAh
Physical size 117.7 mm x 55.7 mm x 17.1 mm (4.60 in x 2.16 in x 0.62 in)
Weight 158g w/ battery
Series A Series
Successor HTC Magic

The HTC Dream (also marketed as the T-Mobile G1 in Europe and the US) is an Internet-enabled 3G smartphone with an operating system designed by Google and hardware designed by HTC. It was the first phone on the market to use the Android mobile device platform.[3] The phone is part of an open standards effort of the Open Handset Alliance.[4]

It was released in the US on 22 October 2008; in the UK on 30 October 2008;[5] and became available in other European countries including Austria, Netherlands, and the Czech Republic in early 2009.[6] It was released in Germany in February 2009 with a QWERTZ keyboard.[7] On 10 March 2009, it became available in Poland as Era G1 under a local mobile brand affiliated with T-Mobile.[8]

In the US, it is priced starting at $149.99 for new and existing T-Mobile customers if purchased with a two-year T-Mobile voice and data plan, and $399 without a contract.[9] Contrary to claims made by T-Mobile representatives, the handset does not need the data plan to work; however, the Access Point Name (APN) settings need to be changed to make the Multimedia Messaging Service (MMS-Picture Messages) work.[10] The Dream comes in black, bronze (except in the UK), or white.[11]

On 23 April 2009, T-Mobile USA announced it had sold one million G1s since the device's launch.[12]

On 5 February 2009, the phone was released through Optus in Australia as the HTC Dream.[13] On 21 February 2009, Singapore became the first country in Asia to introduce the phone; it was sold by SingTel for between $25 and $159 under various contracts.[14][15] Telefónica also launched a version of the phone in Spain on 20 April 2009[16][17] with slightly modified control buttons.[18]

On June 2, 2009, the phone was released through Rogers Wireless in Canada as the HTC Dream. This edition includes the UMTS 850 / 1900 bands for use on Rogers' 3G network.[19]
Contents

* 1 Hardware
* 2 Software
o 2.1 Google Maps
o 2.2 Location information
o 2.3 Viewing and editing text
o 2.4 Clipboard
o 2.5 Navigation within a large webpage
* 3 Native code
* 4 Developer edition
* 5 Homebrew
* 6 References
* 7 Further reading
* 8 See also
* 9 External links

Hardware

* Display: 3.2 in (8.1 cm) TFT-LCD flat glass touch-sensitive HVGA screen with 480×320 pixel resolution. The capacitive touchscreen makes it impossible to use a standard stylus. The display switches from portrait to landscape mode when the keyboard is opened. Users can interact to bring up or move content with a finger touch, tapping or touch-drag motion.[20] The touchscreen hardware is capable of multitouch gestures,[21] but Android does not currently support them.
* CPU: The MSM7201A is an ARM-based, dual-core CPU/GPU from Qualcomm and contains many built-in features, including 3G and a GPU capable of up to 4 million triangles/sec. It has hardware acceleration for Java,[22] but this does not accelerate execution of Android applications, as they are targeted to the Dalvik VM, not the Java VM.
* Keyboard: The HTC Dream has a sliding full 5 row QWERTY keyboard. It also comes with a set of 6 navigation buttons:
o phone (green, black in UK) – make outbound calls, receive incoming calls, or open the dialer.
o home (black) – displays home screen with shortcut icons for some applications and a drawer containing all applications on the phone.
o trackball – navigate among items on the screen or scroll in text fields.
o back (black) – return to the previous screen.
o phone (red, black in UK) – end currently active call or put phone into sleep mode.
o menu (black) – display the contextual menu for the current screen.
o In addition to the physical keys, an on-screen touch keyboard is available as of Android 1.5.
* Side controls: A pair of volume buttons is located on the left side of the phone, and a camera button on the right side.
* Audio: In place of a headphone jack, the Dream (like many HTC smartphones) has a mini-USB-compatible ExtUSB jack[23][24] that carries audio signals alongside the regular USB signals and can be converted with a dongle (now shipped with the phone) to support any standard 3.5 mm headphone. The standard headset includes a clip-on microphone and call answer/hangup button. The Dream supports audio files in MP3, AAC, AAC+, WMA, MPEG4, WAV, MIDI, and Ogg formats.[25]
* Camera: The HTC Dream has a 3.2-megapixel camera with autofocus functionality.[26]
* Video: The Dream can play H.264, streaming, 3GPP, MPEG4, and 3GP files.[25] There is no light ("flash") for the camera in low-light conditions. Video recording and uploading to YouTube are available as of Android 1.5; recorded video is 352 × 288 pixels, encoded as H.263 in a 3GP container with mono sound at 8 kHz.
* Storage: The HTC Dream has a microSD card slot and comes with a 1 GB memory card (2 GB in the UK and Canada). Cards up to 16 GB have been confirmed to work, and larger capacities may also work.[27] When the phone is connected to a computer over USB, the computer can access the card without removing it from the phone. The phone can access media files arranged in folders, but the folders have to be created from the computer.
* Battery: The HTC Dream has a user-replaceable, 3.7 V, 1150 mAh (4.25 Wh) rechargeable lithium-ion battery, advertised to offer up to 130 hours of standby time.
* Orientation and location: The HTC Dream provides an accelerometer for sensing movement and device orientation, plus a digital compass, together giving full orientation data (see the sensor sketch after this list). The Dream also includes a GPS receiver for fine-grained positioning and can use cellular or Wi-Fi networks for coarse-grained positioning.
* Case: Three different colors are available: black, bronze, white.
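The orientation hardware above is exposed to applications through the standard Android sensor API. The following minimal sketch is illustrative only (the class and variable names are invented for this example); it shows how an application of that era would read the accelerometer and compass via SensorManager.

    // Minimal sketch: reading the accelerometer and compass via SensorManager.
    import android.app.Activity;
    import android.hardware.Sensor;
    import android.hardware.SensorEvent;
    import android.hardware.SensorEventListener;
    import android.hardware.SensorManager;
    import android.os.Bundle;

    public class OrientationDemo extends Activity implements SensorEventListener {
        private SensorManager sensorManager;

        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            sensorManager = (SensorManager) getSystemService(SENSOR_SERVICE);
        }

        @Override
        protected void onResume() {
            super.onResume();
            // Listen to both the accelerometer and the compass-based orientation sensor.
            sensorManager.registerListener(this,
                    sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER),
                    SensorManager.SENSOR_DELAY_NORMAL);
            sensorManager.registerListener(this,
                    sensorManager.getDefaultSensor(Sensor.TYPE_ORIENTATION),
                    SensorManager.SENSOR_DELAY_NORMAL);
        }

        @Override
        protected void onPause() {
            super.onPause();
            sensorManager.unregisterListener(this); // stop listening to save battery
        }

        public void onSensorChanged(SensorEvent event) {
            if (event.sensor.getType() == Sensor.TYPE_ORIENTATION) {
                float azimuth = event.values[0]; // degrees from magnetic north
                // ... rotate a map or compass needle here ...
            }
        }

        public void onAccuracyChanged(Sensor sensor, int accuracy) {
            // not needed for this sketch
        }
    }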

[edit] Software

The HTC Dream runs the Android operating system. Most Android applications are written in Java, but because Android does not run Java bytecode directly, the compiled classes must first be converted into Dalvik bytecode before they can be executed on an Android-powered phone.
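As a rough illustration of that toolchain, the trivial class below is ordinary Java source; with the Android SDK of the time it would be compiled to Java .class files with javac and then converted by the SDK's dx tool into a classes.dex file of Dalvik bytecode, which is what actually ships inside the installable .apk. The class name and the build commands shown in the comments are illustrative, not prescribed.

    // Illustrative only: an ordinary Java class destined for the Dalvik VM.
    // Approximate build flow with the Android SDK of the time:
    //   javac HelloDream.java                 -> HelloDream.class (Java bytecode)
    //   dx --dex --output=classes.dex bin/    -> classes.dex      (Dalvik bytecode)
    // The classes.dex file, not the .class files, is what ships inside the .apk.
    import android.app.Activity;
    import android.os.Bundle;
    import android.widget.TextView;

    public class HelloDream extends Activity {
        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            TextView view = new TextView(this);
            view.setText("Hello, Dream"); // executed as Dalvik bytecode on the phone
            setContentView(view);
        }
    }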

The Home screen allows the user to place icons for applications, contacts, and other items on three virtual desktops. It also supports widgets, but until version 1.5 of the operating system was released, third-party applications could not install their own widgets.[28] Since the release of Android 1.5, third-party widgets are supported.
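A third-party widget of the kind enabled by Android 1.5 is, in essence, a component that hands the home screen a set of views to display. The sketch below uses the standard AppWidgetProvider API; the layout and view identifiers (R.layout.simple_widget, R.id.widget_text) are hypothetical resources a developer would define, and the widget would additionally need to be declared in the application's manifest (omitted here).

    // Sketch of a third-party home-screen widget using the AppWidgetProvider API.
    import android.appwidget.AppWidgetManager;
    import android.appwidget.AppWidgetProvider;
    import android.content.Context;
    import android.widget.RemoteViews;

    public class SimpleWidget extends AppWidgetProvider {
        @Override
        public void onUpdate(Context context, AppWidgetManager appWidgetManager,
                             int[] appWidgetIds) {
            // Build the widget's views from a layout and push them to the home screen.
            RemoteViews views = new RemoteViews(context.getPackageName(),
                    R.layout.simple_widget);          // hypothetical layout resource
            views.setTextViewText(R.id.widget_text,   // hypothetical view id
                    "Hello from the home screen");
            appWidgetManager.updateAppWidget(appWidgetIds, views);
        }
    }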

It comes with a web browser powered by the WebKit rendering engine, the same one used by Safari and Chrome.

Pre-installed software applications provide access to many Google services, including Gmail, Google Calendar, Google Maps, Google Talk, and a YouTube video player.[25] In the United States, the carrier-subsidized firmware for the G1 also comes with an application for accessing the Amazon MP3 music store, which allows users to browse and legally purchase DRM-free songs; this application is not included in developer firmware.[29]

Also included with the device is the Android Market application, which allows users to download new software applications from third-party developers and to post publicly viewable ratings and comments.

[edit] Google Maps

The Google Maps application supports map, satellite, traffic, and Street View modes. Street View uses the accelerometer and digital compass to align the on-screen view with the actual orientation of the phone. The application also includes Google Latitude, a location-based service that lets friends who also use Google Maps see each other's locations, post status updates, chat, and so on. Google Maps has recently been updated to provide a more robust search experience: searching for a business brings up the usual results along with ratings, reviews, and, where available, the business's website. The update also adds directions for additional modes of transportation, including walking and public transit, and is available from the Android Market.

[edit] Location information

The HTC Dream offers two sources of location information for applications such as Google Maps: a GPS receiver built into the chipset, and radio-tower positioning based on a database of mobile phone tower locations. In addition, the Dream includes a digital compass, which lets the user physically rotate the phone to align it with the local map (the map itself is not rotated on screen; north is always up).
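Applications select between these two sources through the standard LocationManager API, requesting updates from the GPS provider, the network (cell-tower/Wi-Fi) provider, or both. The sketch below is illustrative only; the class name and update intervals are invented for this example, and a real application would also need the corresponding location permissions in its manifest.

    // Sketch: requesting fixes from both the GPS and network location providers.
    import android.app.Activity;
    import android.content.Context;
    import android.location.Location;
    import android.location.LocationListener;
    import android.location.LocationManager;
    import android.os.Bundle;

    public class LocationDemo extends Activity implements LocationListener {
        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            LocationManager lm =
                    (LocationManager) getSystemService(Context.LOCATION_SERVICE);
            // Fine-grained fixes from the built-in GPS receiver...
            lm.requestLocationUpdates(LocationManager.GPS_PROVIDER, 60000, 10, this);
            // ...and coarse-grained fixes from the cell-tower / Wi-Fi database.
            lm.requestLocationUpdates(LocationManager.NETWORK_PROVIDER, 60000, 100, this);
        }

        public void onLocationChanged(Location location) {
            double lat = location.getLatitude();
            double lon = location.getLongitude();
            // ... recentre a map here ...
        }

        // Remaining LocationListener callbacks, unused in this sketch.
        public void onStatusChanged(String provider, int status, Bundle extras) {}
        public void onProviderEnabled(String provider) {}
        public void onProviderDisabled(String provider) {}
    }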

[edit] Viewing and editing text

To open a sideloaded text file, additional software has to be installed, for example the free ASTRO File Manager. Alternatively, text can be stored in Gmail as a draft message; a text can then be modified while preserving the old version by first copying it (see below) into another draft message.

Documents in Google Docs can be viewed, but not edited. However, spreadsheets in Google Docs (including the texts in them) can be edited.[30][31]

[edit] Clipboard

A clipboard allows text to be copied, cut, and pasted (a short sketch of the programming interface follows).
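Applications reach this clipboard through the SDK's clipboard service. The small helper below is a sketch using the text-only clipboard API of that era (android.text.ClipboardManager); the helper class itself is invented for this example.

    // Sketch: copying to and pasting from the system text clipboard.
    import android.content.Context;
    import android.text.ClipboardManager;

    public class ClipboardHelper {
        // Copy a piece of text onto the system clipboard.
        public static void copy(Context context, CharSequence text) {
            ClipboardManager clipboard =
                    (ClipboardManager) context.getSystemService(Context.CLIPBOARD_SERVICE);
            clipboard.setText(text);
        }

        // Paste: return whatever text is currently on the clipboard, or null.
        public static CharSequence paste(Context context) {
            ClipboardManager clipboard =
                    (ClipboardManager) context.getSystemService(Context.CLIPBOARD_SERVICE);
            return clipboard.hasText() ? clipboard.getText() : null;
        }
    }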

[edit] Navigation within a large webpage

For web pages, a magnifying window provides simultaneous two-dimensional scrolling, allowing quick, approximate access to any part of even a very long (or wide) page. However, this does not work for scrollable text inside a subwindow, such as a long Gmail message.

[edit] Native code

Native code can be executed using the ADB debugger, which is run as a background daemon on the HTC Dream.[32] The shell will run with the user ID of the "shell" user rather than root. When the Dream was first released, it was quickly discovered that the telnet daemon on the phone is given a uid of 0 (root) when it runs, giving the end user complete access to the device. This security hole has since been fixed in build RC30 of Android and was pushed to all devices via an OTA update.[33] However, it is still possible to downgrade to the old firmware in order to exploit the bug and gain root access to the Dream.[34]

The Dream firmware can be updated by flashing from an image stored on the microSD card.[35] These images are cryptographically signed by either the phone manufacturer or network carrier.[36]

The Android Dev Phone 1 allows native code and custom kernels to be run without any special hacks.

Following the disclosure of a root exploit, Jay Freeman released details of how to run Android and ARM Debian Linux together on the Dream.[37]

[edit] Developer edition

On December 5, 2008, Google announced the Android Dev Phone 1, a hardware-unlocked version of the HTC Dream. With this version, the user is not only able to use any GSM/UMTS carrier, but also has complete superuser access to the device, which is not available on the retail version. The advantage of this version is that it gives full access to the internal files of the phone, in particular allowing the bootloader and operating system to be changed and re-flashed.[38] This version also has pre-installed utilities to aid in the development of Android apps. It is sold for US$399 and is only available to registered members of the Android developer community, which is open to all developers for a US$25 fee.[39] Depending on the country, the additional shipping charges (which include tariff and tax) can amount to a substantial fraction of the base price; for example, shipping charges to the United Kingdom are US$128.25, to Germany US$134.31, and to Poland US$162.

The phone comes pre-configured for access to T-Mobile's data networks worldwide. If a T-Mobile SIM card is not readily available, details of another operator's network must be entered before the initial setup of the phone can be completed. When prompted for Google account details, the APN settings can be accessed by pressing the Menu key. The existing T-Mobile entries in the list should not be modified in any way. Instead, a new entry can be created by pressing the Menu key again and choosing Add. All settings should remain at their default values except those listed below. When the entry is complete, it should be saved by pressing the Menu key and choosing Save; from there, one can return to the initial setup by pressing the Back key.
Example APN entries (all other fields remain at their defaults):

UK networks
 o O2 PAYG: APN payandgo.o2.co.uk, username vertigo, password password
 o O2 Contract: APN mobile.o2.co.uk, username o2web, password password
 o Orange: APN orangeinternet, port 9201, MNC 33
 o Three Mobile (3): APN three.co.uk, MMSC http://mms.um.three.co.uk:10021/mmsc, MMS proxy mms.three.co.uk, MMS port 8799, MNC 20
 o Vodafone: APN internet, username web, password web
Canada
 o Rogers Wireless: APN internet.com, username wapuser1, password wap

[edit] Homebrew

Upon the launch of the T-Mobile G1, one concern among developers was that limitations were present in its build of Android that blocked superuser access to the phone. However, a severe vulnerability was soon discovered in early versions of the firmware — everything typed into the phone's keyboard was being interpreted as commands in a root shell.[40] By using telnetd to exploit this, users could flash a modified image with root access. This process, dubbed "rooting" by the community,[41] allows users to gain superuser access and perform actions previously impossible without root access, such as installing custom builds of Android, running Debian[42], installing custom themes, and enabling ad-hoc Wi-Fi tethering. Although Google and T-Mobile were quick to patch this vulnerability with update RC30, a ROM from HTC was later leaked allowing users to downgrade to an older firmware with the bug.[43] The Android Dev Phone 1 comes with superuser access officially integrated into its firmware.[44]

Rooting also allows modified system images to run on the G1 via the original vulnerability. For example, a leaked HTC Magic (Android 1.5) build was modified to run on the device. Before the official Android 1.5 build for the HTC Dream (which included these features) was released, this enabled functionality such as video recording, stereo Bluetooth, and an on-screen keyboard.

Webmail

Webmail (or Web-based e-mail) is an e-mail service intended to be primarily accessed via a web browser, as opposed to through a desktop e-mail client (such as Microsoft Outlook, Mozilla's Thunderbird, or Apple Inc.'s Mail). Very popular webmail providers include Gmail, Yahoo! Mail, Hotmail, and AOL.[1]

A major advantage of web-based e-mail over application-based e-mail is that a user can access their inbox from any Internet-connected computer. However, the need for Internet access is also a drawback: old messages cannot be read while offline.

In 1997, before its acquisition by Microsoft, Hotmail introduced its service, which became one of the first popular web-based e-mail offerings. Following Hotmail's success, Google's introduction of Gmail in 2004 sparked a period of rapid development in webmail, due to Gmail's new features such as JavaScript menus, text-based ads, and bigger storage.

[edit] History

The first webmail software was called simply WebMail and was developed in Perl by Luca Manunza[2][3] while he was working at CRS4 in Sardinia. The first working demo[4] was released on March 10, 1995; the source code[5] followed (with registration required) on March 30, 1995.

[edit] Software packages

There are also software packages that allow organizations to offer e-mail through the web for their users. Some are open-source software, such as SquirrelMail, BlueMamba, RoundCube, and IlohaMail, while others are closed source, like the Outlook Web Access module for Microsoft Exchange. Conversely, there are programs that simulate a web browser in order to access webmail as if it were stored in a POP3 or IMAP account; these are susceptible to changes in the web service's user interface, since there is no standard interface.

Some providers offer web access to e-mail servers run by others. This allows web access to mailboxes whose mail server does not offer a web interface, or where an alternative interface is desired.

[edit] Rendering and compatibility

There are important differences in rendering capabilities for many popular webmail services such as Yahoo Mail, Gmail, and Windows Live Hotmail. Due to the various treatment of HTML tags, such as