Saturday, February 23, 2013

Is historical claim behind the mystery group of (armed?) Filipinos in Borneo?

Malaysian troops are negotiating with about 100 men from the Philippines who have identified themselves as the 'royal army' of the Sulu Sultanate, which has a historic land claim to the area, say police.

By Simon Roughneen, Correspondent / February 15, 2013

Malaysian policemen check a vehicle along the main road near Lahad Datu in Malaysia's eastern Sabah state Thursday. Malaysian security forces in Borneo surrounded armed intruders believed to be from the southern Philippines and sought to persuade them to leave peacefully Thursday, authorities said.

Bernama News Agency/AP

It's around an hour by speedboat from Sulu in the southern Philippines to Sabah in the Malaysian part of Borneo, a route often plied by fishermen, traders, and migrants. The maritime route goes from what is the poorest part of the Philippines to eastern Malaysia, and many make the journey in search of work.

But when around 100 men arrived in batches on Tuesday to camp out in -- or, depending on what account you read, occupy -- a village called Lahad Datu, it soon became clear these weren't the usual fishermen or migrant workers.

What exactly is going on is unclear, but it has both countries on high alert. Malaysian security forces have sealed off the village, which is 300 miles from Sabah's regional capital Kota Kinabalu, a two-hour flight from Malaysia's main city Kuala Lumpur.

On Thursday, Malaysia's Home Affairs Minister Hishamuddin Hussein said that Malaysian security forces had cornered the group, said to be armed. By Friday, however, the Sabah police chief was reportedly negotiating with the men, some of whom were claiming to be descendants of the Sultan of Sulu and therefore, they said, entitled to land in this part of Malaysia.

What is the Sultanate, anyway?

The sultanate, or the territory the sultan governed, existed from the late 15th century until the late 19th century, governing Muslims spanning parts of Sulu and northern Borneo.

Though the sultanate is no longer internationally recognized as a governing entity, Malaysia still pays a token "rental fee" to the heirs of the last sultan.

The claims could put the Philippines in an awkward position, embroiled in an unwanted territorial dispute, given that the men camped out in Lahad Datu are Filipino nationals.

Who are these men?

Though it's unclear who this "royal army" is, analysts are eyeing three southern Filipino militias. Militants from the southern Philippines have a history of crossing the narrow stretches of water to Borneo.

Some speculated at first that the group's appearance had something to do with deadly clashes in early February between the Moro National Liberation Front (MNLF) and Abu Sayyaf, two Muslim armed groups from Mindanao, in the southern part of the Philippines.

Some Filipino media reports suggested that at least some of the men who crossed the waters to Sabah are MNLF fighters. But that has not been confirmed.

The MNLF signed a peace deal with the Manila government in 1996, while the Moro Islamic Liberation Front (MILF), an MNLF splinter, recently forged its own tentative peace agreement with the Filipino government (with the aid of Malaysia).

By far the smallest of the three groups, Abu Sayyaf opposes the agreements because they grant autonomy to parts of Muslim Mindanao; Abu Sayyaf has said it wants an Islamic state in the southern Philippines.

And Abu Sayyaf has been known to make the same crossing to Malaysia as these self-described descendants of the Sultan of Sulu, much more frequently than other groups, as it has been pursued on and off by the Filipino Army.

Abu Sayyaf has long been linked to Al Qaeda. It's known for hosting the likes of Khalid Sheikh Mohammed, a central figure in the 9/11 attacks. And it is also known for taking 20 people, mainly tourists, hostage in 2000 in Malaysia.

These days, though, the group seems more like a criminal gang than a politically motivated terror cell. It is currently holding, by some estimates, six foreign hostages whom it likely wants to exchange for ransom, a money-making tactic Abu Sayyaf has used in the past.

MNLF leaders spun a recent attack on Abu Sayyaf as an attempt to crush the group, end such hostage-taking, and thus widen the appeal of the impoverished southern Philippines to tourism.

If this group of self-described descendants is linked to either the MILF or MNLF, Manila will hardly be happy that groups with which it signed peace deals crossed to Malaysia and faced off with Malaysian soldiers. If they're linked to Abu Sayyaf, it would highlight the inability of US-trained Filipino troops to rein in the group.

Source: http://rss.csmonitor.com/~r/csmonitor/globalnews/~3/bqZeg8Tyb-I/Is-historical-claim-behind-the-mystery-group-of-armed-Filipinos-in-Borneo



Friday, February 22, 2013

Scroogled: Microsoft escalates Google battle

Escalating one of tech's biggest rivalries, Microsoft is accusing Google (GOOG) of compromising the privacy of Gmail users -- leveling the charge in an unusual, in-your-face ad campaign that it hopes will resonate with consumers even if some analysts call it alarmist and irresponsible.

The public attacks -- in print, television and billboard messages that warn consumers about the supposed dangers of being "Scroogled," or mistreated by Google -- mark a strategic shift in a clash of Internet titans, under the guidance of a bare-knuckle political campaign strategist.

An image from Microsoft's "Scroogled" ad campaign against Google.

Despite spending billions of dollars to build its own Bing search engine and online advertising service, Microsoft has failed to put much of a dent in Google's dominance of the Internet ad business. It has also gained little traction with a behind-the-scenes effort to convince government officials that Google's business is anti-competitive.

Now the Redmond, Wash., software giant is waging a high-profile, election-style blitz against its Mountain View rival -- using public opinion polls, for example, to shape rapid-fire attacks -- with the help of Mark Penn, a veteran public relations executive and former campaign adviser to former President Bill Clinton and his wife, former Secretary of State Hillary Rodham Clinton.

Penn, who previously consulted for Microsoft, was hired full time last year.

Penn has been advising Microsoft on how to "take a fairly esoteric and complex issue and make it accessible to people who don't live in technology all day," according to Stefan Weitz, who oversees Microsoft's online businesses.

Public attacks on competitors aren't the norm for most tech companies, but there is precedent. Oracle (ORCL) has blasted Hewlett-Packard (HPQ) and others. Two years ago, Google accused Bing of copying its search results.

But some industry experts are critical of Microsoft's latest volley, which suggests Google is invading users' privacy by delivering ads tailored to keywords in consumers' email messages. Analysts say the practice, which relies on automated software, has been accepted for nearly a decade.

"The idea that one company is better than the other is disingenuous and deceptive," said

consumer advocate Jeff Chester of the Center for Digital Democracy, adding that all major Internet companies collect user data. "If Microsoft was as successful as Google in the search business, you would not hear a peep out of them on privacy."

Another privacy advocate, however, said he's happy to see a big Internet company treat privacy as a competitive feature. "I think it's very healthy," said John Simpson of the nonprofit Consumer Watchdog, "even if they're using it to make a buck."

Microsoft first showed its new strategy in the fall with a campaign urging consumers to compare Bing and Google search results. It followed up with ads that criticized a recent change in Google's shopping search service, which had begun showing only results from merchants who pay to be included.

An image from Microsoft's "Scroogled" ad campaign against Google.

Google says its new shopping search policy means consumers get the most accurate listings, but Microsoft and other critics accused Google of quietly abandoning objective results. The attack drew a backlash, however, after the influential blog Search Engine Land reported Bing's shopping service was also requiring merchants to pay for listings.

While Microsoft says it now accepts free listings, blogger Danny Sullivan, the founding editor of Search Engine Land who sharply criticized Google's shopping listings, concluded that "Bing itself does the same things it accuses Google of."

Microsoft launched another round of ads this month that attack Google's widely used email program, the free, Web-based service Gmail.

"Google looks for keywords in your personal email and uses them to target you with paid ads," Microsoft charged. In an interview, Weitz cited a Microsoft poll that found people overwhelmingly disapprove of that practice, which he said consumers view as "creepy."

The poll itself, and the Microsoft ads, don't specify how emails are scanned. In a statement, Google said "no humans read your email" and said Gmail ads are chosen by "an automated algorithm" similar to programs that screen out spam.

Google has operated Gmail on the same model since 2004, using software that automatically looks for keywords, such as names of consumer products or travel destinations, which trigger the display of relevant ads.
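
The mechanism described is simple enough to sketch. The short Python fragment below is purely illustrative -- the keyword table, sample message, and function names are invented for this example and are not Google's actual system -- but it shows how keyword spotting can drive ad selection without any human reading the mail.

# Illustrative sketch only: toy keyword-triggered ad selection.
# The keyword-to-ad table and the message are made up for the example.
AD_INVENTORY = {
    "flight": "Cheap flights to Europe",
    "hotel": "Hotel deals this weekend",
    "camera": "Spring sale on digital cameras",
}

def pick_ads(email_body: str, max_ads: int = 2) -> list[str]:
    """Return ads whose trigger keyword appears in the email text."""
    words = email_body.lower().split()
    return [ad for keyword, ad in AD_INVENTORY.items() if keyword in words][:max_ads]

message = "Booked my flight to Rome, still need a hotel near the station."
print(pick_ads(message))   # ['Cheap flights to Europe', 'Hotel deals this weekend']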

"Nobody from Google or its advertisers has the ability to come to your house or call you on the phone as a result" of the Gmail program, said Rebecca Lieb, digital ad analyst for the Altimeter Group. "I think it's irresponsible to raise alarms about things that aren't adequately explained."

Microsoft's own Web-mail service, Outlook.com, doesn't tailor ads to the content of emails, but it does show ads based on age, gender and other details that users provide when they open an account.

Weitz conceded Microsoft's campaign may not resonate with tech-savvy industry insiders. But citing comments posted on a Microsoft website, he added, "A number of folks in the mainstream are saying there's a point here. Their response is uniformly outraged."

Contact Brandon Bailey at 408-920-5022. Follow him at Twitter.com/BrandonBailey.

Internet clash

Google handled 67 percent of Internet search requests last month; Microsoft had 16.5 percent and Yahoo had 12.1 percent, according to comScore.
Google had 74.5 percent of U.S. search advertising revenue last year; Microsoft had 8 percent and Yahoo had 6.2 percent, according to eMarketer.
Google says its Gmail service has 425 million active users; Microsoft says its Web-based Outlook.com has 25 million users.

Sources: comScore, eMarketer, Google, Microsoft.

Source: http://www.siliconvalley.com/ci_22602877/scroogled-microsoft-escalates-google-battle?source=rss_viewed


NASA and JPL contribute to European Jupiter mission

Feb. 21, 2013 -- NASA has selected key contributions to a 2022 European Space Agency (ESA) mission that will study Jupiter and three of its largest moons in unprecedented detail. The moons are thought to harbor vast water oceans beneath their icy surfaces.

NASA's contribution will consist of one U.S.-led science instrument and hardware for two European instruments to fly on ESA's Jupiter Icy Moons Explorer (JUICE) mission. Jeffrey Plaut of NASA's Jet Propulsion Laboratory in Pasadena, Calif., will be the U.S. lead for the Radar for Icy Moon Exploration experiment. The radar experiment's principal investigator is Lorenzo Bruzzone of Universita degli Studi di Trento in Italy.

Under the lead of Bruzzone and the Italian Space Agency, JPL will provide the transmitter and receiver hardware for a radar sounder designed to penetrate the icy crust of Jupiter's moons Europa, Ganymede and Callisto to a depth of about 5 miles (9 kilometers). This will allow scientists to see for the first time the underground structure of these tectonically complex and unique icy worlds.

JUICE will carry 11 experiments developed by scientific teams from 15 European countries, the United States and Japan.

The spacecraft will orbit Jupiter for three years and travel past Callisto and Europa multiple times, then orbit Ganymede, a moon larger than the planet Mercury. JUICE will conduct the first thorough exploration of Jupiter since NASA's Galileo mission from 1989-2003.

By studying the Jupiter system, JUICE will look to learn more about the formation and evolution of potentially habitable worlds in our solar system and beyond.

"NASA is thrilled to collaborate with ESA on this exciting mission to explore Jupiter and its icy moons," said John Grunsfeld, NASA's associate administrator for science in Washington. "Working together with ESA and our other international partners is key to enabling future scientific progress in our quest to understand the cosmos."

The solar-powered spacecraft will carry cameras and spectrometers, a laser altimeter and an ice-penetrating radar instrument. The mission also will carry a magnetometer, plasma and particle monitors, and radio science hardware. The spacecraft is scheduled to arrive at the Jupiter system in 2030.

"The selection of JUICE's instruments is a key milestone in ESA's flagship mission to the outer solar system, which represents an unprecedented opportunity to showcase leading European technological and scientific expertise," said Alvaro Gimenez Canete, ESA's director of science and robotic exploration.

NASA invited researchers in 2012 to submit proposals for NASA-provided instruments for the mission. Nine were reviewed, with one selected to fly. NASA agreed to provide critical hardware for two of the 10 selected European-led instruments. NASA's total contribution to the JUICE mission is $100 million for design, development and operation of the instruments through 2033.

In addition to the radar team and instrument, the NASA contributions are:

-- Ultraviolet Spectrometer: The principal investigator is Randy Gladstone of Southwest Research Institute in San Antonio. This spectrometer will acquire images to explore the surfaces and atmospheres of Jupiter's icy moons and how they interact with the Jupiter environment. The instrument also will determine how Jupiter's upper atmosphere interacts with its lower atmosphere below, and the ionosphere and magnetosphere above. The instrument will provide images of the aurora on Jupiter and Ganymede.

-- Particle Environment Package: The principal investigator is Stas Barabash of the Swedish Institute of Space Physics. The U.S. lead is Pontus Brandt of the Johns Hopkins University Applied Physics Laboratory (APL) in Laurel, Md. Under the lead of Barabash and the Swedish National Space Board, APL will provide instruments to this suite to measure the neutral material and plasma that are accelerated and heated to extreme levels in Jupiter's fierce and complex magnetic environment.

NASA's Science Mission Directorate conducts a wide variety of research and scientific exploration programs for Earth studies, space weather, the solar system and the universe. The New Frontiers Program Office at NASA's Marshall Space Flight Center in Huntsville, Ala., will manage the NASA contributions. JUICE is the first large-class mission in ESA's Cosmic Vision 2015-2025 Program.

For more information on NASA planetary programs, visit: http://www.nasa.gov .

For more information about the JUICE mission, visit: http://sci.esa.int/juice .



Story Source:

The above story is reprinted from materials provided by NASA/Jet Propulsion Laboratory.


Source: http://feeds.sciencedaily.com/~r/sciencedaily/space_time/nasa/~3/vfUjp6hAClM/130221214216.htm


Petition asking Obama to take on SIM unlocking hits signature threshold

Some good news for the fight against tyrannical cell phone regulation. The official White House petition calling for the Librarian of Congress to rescind the decision that ended the DMCA exemption allowing entities other than the operators to SIM unlock phones ... take a deep breath -- that was a mouthful ... has eclipsed 100,000 signatures, meaning we'll likely see some sort of response from the Obama administration at some point.

That doesn't guarantee action. That doesn't mean it no longer will be illegal to SIM-unlock your phone on your own, without operator approval. And it still doesn't change the fact that it's not like there were goon squads going around, scooping up hordes of unruly unlockers. This is mainly a tool to be used in extreme cases of litigation. Chances are if you've finished paying off your subsidized handset, and your account is in good standing, your operator will give you the SIM unlock code.

But that doesn't mean that any of this is a good use of DMCA legislation. Thus, the petition. "If a petition gets enough support," the petitions.whitehouse.gov site reads, "White House staff will review it, ensure it’s sent to the appropriate policy experts, and issue an official response."

So maybe we'll see something change, and maybe we won't. But for sure, the position that operators shouldn't be the sole decider on SIM-unlocking phones will be heard.

Source: http://feedproxy.google.com/~r/androidcentral/~3/hvJ_u1uR7Eo/story01.htm


Thursday, February 21, 2013

Chris Bosh's Wife Not Phased By Lil Wayne's NBA All-Star Diss (@liltunechi @MrsAdrienneBosh)

Feb 19, 2013 | 11:00 PM | Written By: Mike Hughes


Lil Wayne threw some heavy fire at Chris Bosh on Sunday night during an NBA All-Star Weekend performance. Wayne has been upset ever since he was asked to vacate a recent Miami Heat game, and took his anger out by specifically blasting Miami Heat players during his performance in front of a large crowd of people. He went a step further by throwing a major low-blow at Chris Bosh, claiming that he slept with his wife, Adrienne Williams Bosh.

"F--k LeBron, f--k She-Wade, f--k Chris Bosh, f--k all them ni--as, man. ?And, and, and, I f--ked Chris Bosh wife."

Wayne shouted the above disses to the crowd, and according to TMZ, he actually tried to get the crowd involved by screaming, "When I say f--k, you say nba.. When I say f--k, you say the Miami heat."

Chris Bosh's wife Adrienne hasn't directly addressed the hurtful claims spit out by Lil Wayne, and it's unclear if he said them with the intention of actually revealing some truth or just sending a grimy shot at Chris Bosh. However, she did re-tweet several tweets containing messages about positivity and truth and disregarding hatred from others. Gabrielle Union, Dwyane Wade's girlfriend, even sent out a tweet that, while not directly aimed at Lil Wayne, carried the tone that she was speaking in defense of Wade, the Miami Heat and Adrienne as well.

Check out all of the tweets above. How do you feel about Lil Wayne's negative Miami Heat antics?

Source: rapfix.mtv.com

Source: http://www.vladtv.com/blog/155988/chris-boshs-wife-not-phased-by-lil-waynes-nba-all-star-diss/


Smartphone petition breaks 100,000 signatures, forces White House response

Phone unlocking petition breaks 100,000 signatures, White House now obliged to respond

The petition to reinstate the DMCA protection of smartphone unlockers has reached 100,000 signatories on We The People. As it's now hit the golden limit, the White House will have to issue an official response explaining its stance on the matter. The petition also asked that if the Librarian of Congress wouldn't bring back the exemption, a bill should be passed that enshrines the right into law. As soon as we see the message from the West Wing, we'll let you know what they say.

Filed under:

Comments

Via: The Next Web

Source: The White House

Source: http://feeds.engadget.com/~r/weblogsinc/engadget/~3/PUOq6042kTc/


St. Louis City, County Plan To Join Forces On Job Creation

St. Louis Post-Dispatch:

For decades now, Skinker Boulevard has been a dividing line in the St. Louis region.

East of Skinker, you're in St. Louis city, dealing with one local government. West of Skinker, it's St. Louis County, and a whole different government.

Now both governments are moving to erase that line, at least when it comes to efforts to grow the economy.

Read the whole story at St. Louis Post-Dispatch

"; var coords = [-5, -72]; // display fb-bubble FloatingPrompt.embed(this, html, undefined, 'top', {fp_intersects:1, timeout_remove:2000,ignore_arrow: true, width:236, add_xy:coords, class_name: 'clear-overlay'}); });

Source: http://www.huffingtonpost.com/2013/02/21/st-louis-city-county-plan_n_2733810.html


Separated bike lanes, slower vehicle speeds greatly reduce bicycle injuries

Feb. 20, 2013 -- Using your bicycle to commute to work has numerous health and environmental benefits. Yet, the largest Canadian study on cycling injuries, led by Ryerson University, suggests cyclists are at risk of injury due to the lack of cycling infrastructure in large urban centres.

"Previous studies have focused on the measures such as helmets that reduce harm after a crash occurs," says Anne Harris, lead author of the study, who is an avid cyclist herself and an assistant professor with Ryerson's School of Occupational and Public Health. "Our study is one of the first to take a comprehensive look at how route infrastructure, particularly at intersections and major roadways, might influence the risk of cyclist injury in Canada."

North American cyclists are eight to 30 times more likely to be seriously injured while cycling than their counterparts in Germany, Denmark and The Netherlands. Harris says one explanation could be the availability of segregated bike lanes in those countries.

The major findings of the Ryerson's School of Occupational and Public Health study are:

Intersections -- safer

  • Intersections at residential streets
  • Vehicle speeds of 30 km/h and under, which cut the risk of injuring a cyclist by half

Intersections -- less safe

  • Traffic circles: designed as a traffic-calming measure, they actually increase the risk of cyclist injuries. In the study, 19 out of 690 accidents occurred at Vancouver intersections with traffic circles
  • Grade: roads that slope downhill are more dangerous than uphill roads
  • Arriving at the intersection in the opposite direction of vehicular traffic

Non-intersections -- safer

  • Separated bike lanes along major streets
  • Bike routes with traffic diversion on local streets
  • Bike-only paths separated from traffic

Non-intersections -- less safe

  • Streetcar tracks
  • Downhill grade
  • Construction at site
  • Shared bike lanes or single bike lanes with parked cars present

The researchers also found that painted and shared bike lanes commonly found in Toronto offered no significant protection for cyclists.

"Our research demonstrates that transportation planners really need to segregate cyclists from motor vehicle traffic just as we use sidewalks to separate pedestrians," says Harris. "If people see cycling as a safer activity, they would be more encouraged to commute by bike, which makes them more active and healthy citizens."

Research Methodology

Harris and her co-authors, also cyclists, interviewed 690 cyclists injured in downtown Toronto and Vancouver between May 2008 and November 2009. According to Census data, nearly two percent of people in Toronto and almost four percent in Vancouver commute by bicycle.

In the study, all of the cyclists sought medical attention at emergency departments with injuries ranging from minor to serious. Of the total number of cyclists, 211 were injured at intersections and 479 injured along roads or paths.

The researchers gathered two sets of data. First, they asked all of the cyclists to map the route on which they were injured and describe the details of their trip and their injury. Next, an observer visited one or two randomly selected locations along each route, matched to the injury site on whether or not it was at an intersection. Specific details about each site were gathered, such as the presence and type of bike lanes, grade of the road and traffic volume. Finally, the researchers performed statistical analyses to look at the relationship between route infrastructure and relative safety.
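
As a rough illustration of the kind of matched comparison such an analysis rests on -- a toy calculation with invented counts, not the study's actual data or statistical model -- one can tally how often a given feature turns up at injury sites versus the matched control sites:

# Toy matched-pair comparison with invented counts -- not the study's data or model.
# Each pair records whether a feature (say, a separated bike lane) was present
# at the injury site and at the matched control site from the same trip.
def discordant_pair_odds_ratio(pairs):
    """Odds ratio from discordant pairs: >1 means the feature shows up more often
    at injury sites; <1 means it shows up more often at the control sites."""
    injury_only = sum(1 for at_injury, at_control in pairs if at_injury and not at_control)
    control_only = sum(1 for at_injury, at_control in pairs if at_control and not at_injury)
    return injury_only / control_only if control_only else float("inf")

pairs = ([(True, False)] * 10 + [(False, True)] * 40 +
         [(True, True)] * 25 + [(False, False)] * 25)
print(discordant_pair_odds_ratio(pairs))   # 0.25: feature associated with fewer injuries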

At the time of the study, both Toronto and Vancouver had cycling infrastructure typical of North American cities: shared bike lanes, local street bike paths, off-street paths and a few kilometres of physically separated bicycle routes beside major roads. Some of the features unique to Toronto were streetcar tracks on major streets, whereas traffic circles are common in Vancouver residential areas. Vancouver also has a helmet law for all ages, whereas helmets are optional for adult cyclists in Toronto.

The study's co-authors are: University of British Columbia and University of Minnesota -- Conor Reynolds; University of British Columbia -- Kay Teschke , Peter Cripton, Hui Shen, Shelina Babul, Jeffrey Brubacher, Garth Hunte, Melody Monro; Simon Fraser University -- Meghan Winters; University of Toronto -- Mary Chipman, Lee Vernich; St. Michael's Hospital and University of Toronto -- Michael Cusimano; and University of Toronto and University Health Network -- Steven Friedman.

The study, published in the journal Injury Prevention, was funded by the Heart and Stroke Foundation of Canada and the Canadian Institutes of Health Research.



Story Source:

The above story is reprinted from materials provided by Ryerson University.


Source: http://feeds.sciencedaily.com/~r/sciencedaily/living_well/~3/UhcfOHBgVYQ/130220131744.htm


Snow closes Highway 58 in Southern California, but Grapevine on I-5 reopens

LOS ANGELES -- A winter storm brought California much-needed rain and even a rare tornado, but the breadth and severity of snowfall in much of the state caught drivers by surprise and left hundreds stranded on mountain highways.

A late barrage of snow forced the shutdown of a 35-mile stretch of Highway 58 between Mojave and Bakersfield.

The snow brought hundreds of vehicles to a stop and sent the California Highway Patrol and road crews scrambling to the scene, where they were helping drivers to slowly exit near Tehachapi. No injuries were reported.

The highway remained closed early Wednesday, and officials didn't know when it would reopen.

"We've had no reports of people being stuck overnight. Between us and Caltrans, we've been patrolling all night," said Steve Dutcher, a public safety dispatcher for the California Highway Patrol.

A similar scene played out about 300 miles to the north, where dozens of cars were either stuck in the snow or involved in accidents near the rural community of Sonora. About 50 to 75 vehicles became stranded or were in collisions on Highway 49 and nearby roadways when it started snowing heavily in the Sierra Nevada foothills, CHP Lt. Scott Clamp said.

"Travelers were just not prepared," Clamp said. There were a handful of minor injuries, but no major injuries, he said.

Interstate 5 in the Grapevine area north of Los Angeles was closed for about two hours early Wednesday, and Interstate 15 through Cajon Pass in San Bernardino County was open despite sporadic snow.

The storm from the Gulf of Alaska brought the first significant rainfall to the region in several weeks, the National Weather Service said.

Periodic showers, including hail, hit the San Francisco Bay area in time for the morning commute, while new snow fell in the Sierra Nevada, where ski resorts around Lake Tahoe were expecting up to 8 inches.

Flurries were spotted high in the hills in Oakland and in neighboring Berkeley, said Rick Canepa, a weather service meteorologist in Monterey. The top of Mt. Hamilton near San Jose and the tips of Mt. Diablo in the East Bay also got a dusting.

In the Sacramento area, a tornado with wind speeds between 40 mph and 70 mph was spotted north of Red Bluff shortly after 1:30 p.m. Tuesday, according to the weather service. It caused little or no damage.

Even though San Francisco saw highs in the 70s last week, California has had a colder-than-normal winter overall.

"We went from about 10 degrees above normal this past weekend to 10 degrees below today," said Austin Cross, another weather service meteorologist based in Monterey. "We're usually somewhere in the 60s, temperature-wise, at this time of year."

San Francisco has gotten nearly 14 inches of rain since October, or about 85 percent of its normal rainfall during the fall-winter season, Cross said. Oakland received 83 percent and San Jose had about 80 percent, he added.

The far-reaching storm brought snow and colder-than-average temperatures to Arizona. Flagstaff public schools and Northern Arizona University were closed Wednesday because of the winter storm.

The metropolitan Phoenix area was expected to get up to a half-inch of rain as the storm moves inland, and temperatures were forecast to drop to the mid-50s, about 15 degrees below average for this time of year.

The storm's eastward march also was expected to drop nearly a foot of snow as it moves into parts of Nevada, Colorado and Utah.

------

Associated Press writers Shaya Tayefe Mohajer in Los Angeles and Terry Collins and John S. Marshall in San Francisco contributed to this report.

Source: http://www.contracostatimes.com/breaking-news/ci_22628014/snow-closes-southern-california-highways-including-grapevine-i?source=rss


PlayStation Evolution video reminds us about video games | Joystiq


With Sony taking care to refresh our memories on all its PlayStation hardware over the last few days, it has finally brought it all home with what really matters: Games. The video above highlights a few examples that stand out from the usual gaming crowd, including Gran Turismo, Flower, Journey, Ico, Shadow of the Colossus, The Unfinished Swan and LittleBigPlanet.

Presumably, we'll quickly forget about all of these games -- perhaps even grow to hate them for their hilariously antiquated technologies -- after Sony's big PlayStation 2013 event tomorrow night, where the company is expected to unveil its next PlayStation.

Source: http://www.joystiq.com/2013/02/19/playstation-evolution-video-reminds-us-about-video-games/


Wednesday, February 20, 2013

NEW Power Course: Wisconsin Genealogy | FamilyTreeUniversity ...

YOU'LL LOVE THIS IF:

  • You want to learn about key resources for Wisconsin genealogy
  • You've hit a brick wall in searching for your Badger State ancestors and need help to proceed
  • You have roots in Milwaukee and want to know where to find records in the city

From Milwaukee to Madison to Green Bay, roots in this Northern state span from early Native American tribes to French explorers and fur trappers who crossed the border from Canada. Churn up your Cheese Head ancestors in America's Dairyland and fill in your Badger State family tree in this course.

Family Tree University Power Courses are designed to provide a concentrated dose of genealogy education on a single topic. Students have a week to go through approximately two hours of course material. In less time than it would take to churn a bucket of butter, you'll boost your research and leave with new tips and tricks to help you find your ancestors. This course is self-led and does not contain any graded exercises, but instructors are available to answer any and every question you may post on the Discussion Board.

WHAT YOU'LL LEARN:

  • Brief history of the state, including immigration and details regarding ethnic groups
  • Unique repositories and top websites for Wisconsin
  • Where to find vital, military, land and other records
  • A breakdown of Wisconsin counties, including info on historical maps
  • Records and resources specific to Milwaukee

WHAT YOU'LL RECEIVE:

  • One-hour Wisconsin Genealogy Crash Course video
  • PDF guide on Wisconsin Genealogy explaining how-tos, fast facts, key websites and resources, and a timeline of state history
  • In-depth PDF Sourcebook with a historical overview, record highlights, reference map with county boundaries, bibliography of Wisconsin state resources and directory of archives, libraries and societies -- plus a breakdown of what records are available in each county and where to get them
  • PDF guide on finding your ancestors in Milwaukee

WHAT YOU'LL NEED:

  • This Power Course assumes you understand the basic principles of genealogy. The first course in our First Steps series, Discover Your Family Tree, will provide a helpful foundation for genealogy beginners.

Register


Source: http://www.familytreeuniversity.com/new-power-course-wisconsin-genealogy


Tuesday, February 19, 2013

Accidental Empires, Chapter 4 -- Amateur Hour - I, Cringely

ACCIDENTAL EMPIRES

CHAPTER FOUR

AMATEUR HOUR

You have to wonder what it was we were doing before we had all these computers in our lives. Same stuff, pretty much. Down at the auto parts store, the counterman had to get a ladder and climb way the heck up to reach some top shelf, where he'd feel around in a little box and find out that the muffler clamps were all gone. Today he uses a computer, which tells him that there are three muffler clamps sitting in that same little box on the top shelf. But he still has to get the ladder and climb up to get them, and, worse still, sometimes the computer lies, and there are no muffler clamps at all, spoiling the digital perfection of the auto parts world as we have come to know it.

What we're often looking for when we add the extra overhead of building a computer into our businesses and our lives is certainty. We want something to believe in, something that will take from our shoulders the burden of knowing when to reorder muffler clamps. In the twelfth century, before there even were muffler clamps, such certainty came in the form of a belief in God, made tangible through the building of cathedrals -- places where God could be accessed. For lots of us today, the belief is more in the sanctity of those digital zeros and ones, and our cathedral is the personal computer. In a way, we're replacing God with Bill Gates.

Uh-oh.

The problem, of course, is with those zeros and ones. Yes or no, right or wrong, is what those digital bits seem to signify, looking so clean and unconnected that we forget for a moment about that time in the eighth grade when Miss Schwerko humiliated us all with a true-false test. The truth is that for all the apparent precision of computers, and despite the fact that our mothers and Tom Peters would still like to believe that perfection is attainable in this life, computer and software companies are still remarkably imprecise places, and their products reflect it. And why shouldn't they, since we're still at the fumbling stage, where good and bad developments seem to happen at random.

Look at Intel, for example. Up to this point in the story, Intel comes off pretty much as high-tech heaven on earth. As the semiconductor company that most directly begat the personal computer business, Intel invented the microprocessor and memory technologies used in PCs and acted as an example of how a high-tech company should be organized and managed. But that doesn't mean that Bob Noyce's crew didn't screw up occasionally.

There was a time in the early 1980s when Intel suffered terrible quality problems. It was building microprocessors and other parts by the millions and by the millions these parts tested bad. The problem was caused by dust, the major enemy of computer chip makers. When your business relies on printing metallic traces that are only a millionth of an inch wide, having a dust mote ten times that size come rolling across a silicon wafer means that some traces won't be printed correctly and some parts won't work at all. A few bad parts are to be expected, since there are dozens, sometimes hundreds, printed on a single wafer, which is later cut into individual components. But Intel was suddenly getting as many bad parts as good, and that was bad for business.

Semiconductor companies fight dust by building their components in expensive clean rooms, where technicians wear surgical masks, paper booties, rubber gloves, and special suits and where the air is specially filtered. Intel had plenty of clean rooms, but it still had a big dust problem, so the engineers cleverly decided that the wafers were probably dusty before they ever arrived at Intel. The wafers were made in the East by Monsanto. Suddenly it was Monsanto's dust problem.

Monsanto engineers spent months and millions trying to eliminate every last speck of dust from their silicon wafer production facility in South Carolina. They made what they thought was terrific progress, too, though it didn't show in Intel's production yields, which were still terrible. The funny thing was that Monsanto's other customers weren't complaining. IBM, for example, wasn't complaining, and IBM was a very picky customer, always asking for wafers that were extra big or extra small or triangular instead of round. IBM was having no dust problems.

If Monsanto was clean and Intel was clean, the only remaining possibility was that the wafers somehow got dusty on their trip between the two companies, so the Monsanto engineers hired a private investigator to tail the next shipment of wafers to Intel. Their private eye uncovered an Intel shipping clerk who was opening incoming boxes of super-clean silicon wafers and then counting out the wafers by hand into piles on a super-unclean desktop, just to make sure that Bob Noyce was getting every silicon wafer he was paying for.

The point of this story goes far beyond the undeification of Intel to a fundamental characteristic of most high-tech businesses. There is a business axiom that management gurus spout and that bigshot industrialists repeat to themselves as a mantra if they want to sleep well at night. The axiom says that when a business grows past $1 billion in annual sales, it becomes too large for any one individual to have a significant impact. Alas, this is not true when it's a $1 billion high-tech business, where too often the critical path goes right through the head of one particular programmer or engineer or even through the head of a well-meaning clerk down in the shipping department. Remember that Intel was already a $1 billion company when it was brought to its knees by desk dust.

The reason that there are so many points at which a chip, a computer, or a program is dependent on just one person is that the companies lack depth. Like any other new industry, this is one staffed mainly by pioneers, who are, by definition, a small minority. People in critical positions in these organizations don't usually have backup, so when they make a mistake, the whole company makes a mistake.

My estimate, in fact, is that there are only about twenty-five real people in the entire personal computer industry -- this shipping clerk at Intel and around twenty-four others. Sure, Apple Computer has 10,000 workers, or says it does, and IBM claims nearly 400,000 workers worldwide, but has to be lying. Those workers must be temps or maybe androids because I keep running into the same two dozen people at every company I visit. Maybe it's a tax dodge. Finish this book and you'll see; the companies keep changing, but the names are always the same.

**********

Intel begat the microprocessor and the dynamic random access memory chip, which made possible MITS, the first of many personal computer companies with a stupid name. And MITS, in turn, made possible Microsoft, because computer hardware must exist, or at least be claimed to exist, before programmers can even envision software for it. Just as cave dwellers didn't squat with their flint tools chipping out parking brake assemblies for 1967 Buicks, so programmers don't write software that has no computer upon which to run. Hardware nearly always leads software, enabling new development, which is why Bill Gates's conversion from minicomputers to microcomputers did not come (could not come) until 1974, when he was a sophomore at Harvard University and the appearance of the MITS Altair 8800 computer made personal computer software finally possible.

Like the Buddha, Gates's enlightenment came in a flash. Walking across Harvard Yard while Paul Allen waved in his face the January 1975 issue of Popular Electronics announcing the Altair 8800 microcomputer from MITS, they both saw instantly that there would really be a personal computer industry and that the industry would need programming languages. Although there were no microcomputer software companies yet, 19-year-old Bill's first concern was that they were already too late. "We realized that the revolution might happen without us," Gates said. "After we saw that article, there was no question of where our life would focus."

"Our life!" What the heck does Gates mean here -- that he and Paul Allen were joined at the frontal lobe, sharing a single life, a single set of experiences? In those days, the answer was "yes." Drawn together by the idea of starting a pioneering software company and each convinced that he couldn't succeed alone, they committed to sharing a single life -- a life unlike that of most other PC pioneers because it was devoted as much to doing business as to doing technology.

Gates was a businessman from the start; otherwise, why would he have been worried about being passed by? There was plenty of room for high-level computer languages to be developed for the fledgling platforms, but there was only room for one first high-level language. Anyone could participate in a movement, but only those with the right timing could control it. Gates knew that the first language -- the one resold by MITS, maker of the Altair -- would become the standard for the whole industry. Those who seek to establish such de facto standards in any industry do so for business reasons.

"This is a very personal business, but success comes from appealing to groups," Gates says. "Money is made by setting de facto standards."

The Altair was not much of a consumer product. It came typically as an unassembled $350 kit, clearly targeting only the electronic hobbyist market. There was no software for the machine, so, while it may have existed, it sure didn't compute. There wasn't even a keyboard. The only way of programming the computer at first was through entering strings of hexadecimal code by flicking a row of switches on the front panel. There was no display other than some blinking lights. The Altair was limited in its appeal to those who could solder (which eliminated most good programmers) and to those who could program in machine language (which eliminated most good solderers).

BASIC was generally recognized as the easiest programming language to learn in 1975. It automatically converted simple English-like commands to machine language, effectively removing the programming limitation and at least doubling the number of prospective Altair customers.

Since they didn't have an Altair 8800 computer (nobody did yet), Gates and Allen wrote a program that made a PDP-10 minicomputer at the Harvard Computation Center simulate the Altair's Intel 8080 microprocessor. In six weeks, they wrote a version of the BASIC programming language that would run on the phantom Altair synthesized in the minicomputer. They hoped it would run on a real Altair equipped with at least 4096 bytes of random access memory. The first time they tried to run the language on a real microcomputer was when Paul Allen demonstrated the product to MITS founder Ed Roberts at the company's headquarters in Albuquerque. To their surprise and relief, it worked.
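
The trick of standing in for one machine on another comes down to a loop that fetches each opcode of the target processor and mimics its effect on simulated registers. The fragment below is a deliberately tiny, hypothetical illustration in Python -- it handles only three real 8080 opcodes and ignores flags, most registers, I/O, and timing -- and is in no way Gates and Allen's actual emulator.

# Minimal sketch of processor emulation: fetch an opcode, decode it, update
# simulated state. Only MVI A, INR A, and HLT are handled; everything else a
# real 8080 emulator needs (flags, full register set, I/O) is omitted.
def run_8080(memory: list[int]) -> int:
    a, pc = 0, 0                       # accumulator and program counter
    while True:
        op = memory[pc]
        if op == 0x3E:                 # MVI A, d8 -- load the next byte into A
            a = memory[pc + 1]
            pc += 2
        elif op == 0x3C:               # INR A -- increment A (flags ignored here)
            a = (a + 1) & 0xFF
            pc += 1
        elif op == 0x76:               # HLT -- stop and hand back the accumulator
            return a
        else:
            raise NotImplementedError(f"opcode {op:#04x} is not part of this sketch")

program = [0x3E, 0x41, 0x3C, 0x76]     # MVI A,0x41 ; INR A ; HLT
print(hex(run_8080(program)))          # 0x42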

MITS BASIC, as it came to be called, gave substance to the microcomputer. Big computers ran BASIC. Real programs had been written in the language and were performing business, educational, and scientific functions in the real world. While the Altair was a computer of limited power, the fact that Allen and Gates were able to make a high-level language like BASIC run on the platform meant that potential users could imagine running these same sorts of applications now on a desktop rather than on a mainframe.

MITS BASIC was dramatic in its memory efficiency and made the bold move of adding commands that allowed programmers to control the computer memory directly. MITS BASIC wasn't perfect. The authors of the original BASIC, John Kemeny and Thomas Kurtz, both of Dartmouth College, were concerned that Gates and Allen's version deviated from the language they had designed and placed into the public domain a decade before. Kemeny and Kurtz might have been unimpressed, but the hobbyist world was euphoric.

I've got to point out here that for many years Kemeny was president of Dartmouth, a school that didn't accept me when I was applying to colleges. Later, toward the end of the Age of Jimmy Carter, I found myself working for Kemeny, who was then head of the presidential commission investigating the Three Mile Island nuclear accident. One day I told him how Dartmouth had rejected me, and he said, "College admissions are never perfect, though in your case I'm sure we did the right thing." After that I felt a certain affection for Bill Gates.

Gates dropped out of Harvard, Allen left his programming job at Honeywell, and both moved to New Mexico to be close to their customer, in the best Tom Peters style. Hobbyists don?t move across country to maintain business relationships, but businessmen do. They camped out in the Sundowner Motel on Route 66 in a neighborhood noted for all-night coffee shops, hookers, and drug dealers.

Gates and Allen did not limit their interest to MITS. They wrote versions of BASIC for other microcomputers as they came to market, leveraging their core technology. The two eventually had a falling out with Ed Roberts of MITS, who claimed that he owned MITS BASIC and its derivatives; they fought and won, something that hackers rarely bothered to do. Capitalists to the bone, they railed against software piracy before it even had a name, writing whining letters to early PC publications.

Gates and Allen started Microsoft with a stated mission of putting "a computer on every desk and in every home, running Microsoft software." Although it seemed ludicrous at the time, they meant it.

While Allen and Gates deliberately went about creating an industry and then controlling it, they were important exceptions to the general trend of PC entrepreneurism. Most of their eventual competitors were people who managed to be in just the right place at the right time and more or less fell into business. These people were mainly enthusiasts who at first developed computer languages and operating systems for their own use. It was worth the effort if only one person -- the developer himself -- used their product. Often they couldn't even imagine why anyone else would be interested.

Gary Kildall, for example, invented the first microcomputer operating system because he was tired of driving to work. In the early 1970s, Kildall taught computer science at the Naval Postgraduate School in Monterey, California, where his specialty was compiler design. Compilers are software tools that take entire programs written in a high-level language like FORTRAN or Pascal and translate them into assembly language, which can be read directly by the computer. High-level languages are easier to learn than Assembler, so compilers allowed programs to be completed faster and with more features, although the final code was usually longer than if the program had been written directly in the internal language of the microprocessor. Compilers translate, or compile, large sections of code into Assembler at one time, as opposed to interpreters, which translate commands one at a time.
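
The difference is easy to show in miniature. The toy below -- invented for illustration, with a made-up two-command "language", and nothing to do with Kildall's actual tools -- interprets a program one command at a time, and also "compiles" the whole program up front into a list of ready-to-run operations that stand in for assembly output.

# Toy interpreter vs. toy compiler for a made-up language of lines like "ADD 5" or "MUL 3".
def interpret(source: str) -> int:
    """Interpreter: translate and execute one command at a time."""
    total = 0
    for line in source.splitlines():
        op, arg = line.split()
        total = total + int(arg) if op == "ADD" else total * int(arg)
    return total

def compile_program(source: str):
    """'Compiler': translate the whole program up front into a list of
    operations (standing in for assembly), to be executed later in one go."""
    ops = []
    for line in source.splitlines():
        op, arg = line.split()
        n = int(arg)
        ops.append((lambda t, n=n: t + n) if op == "ADD" else (lambda t, n=n: t * n))
    return ops

program = "ADD 2\nMUL 10\nADD 1"
print(interpret(program))              # 21
total = 0
for step in compile_program(program):  # run the "compiled" program
    total = step(total)
print(total)                           # 21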

By 1974, Intel had added the 8008 and 8080 to its family of microprocessors and had hired Gary Kildall as a consultant to write software to emulate the 8080 on a DEC time-sharing system, much as Gates and Allen would shortly do at Harvard. Since there were no microcomputers yet, Intel realized that the best way for companies to develop software for microprocessor-based devices was by using such an emulator on a larger system.

Kildall's job was to write the emulator, called Interp/80, followed by a high-level language called PL/M, which was planned as a microcomputer equivalent of the XPL language developed for mainframe computers at Stanford University. Nothing so mundane (and usable by mere mortals) as BASIC for Gary Kildall, who had a Ph.D. in compiler design.

What bothered Kildall was not the difficulty of writing the software but the tedium of driving the fifty miles from his home in Pacific Grove across the Santa Cruz mountains to use the Intel minicomputer in Silicon Valley. He could have used a remote teletype terminal at home, but the terminal was incredibly slow for inputting thousands of lines of data over a phone line; driving was faster.

Or he could develop software directly on the 8080 processor, bypassing the time-sharing system completely. Not only could he avoid the long drive, but developing directly on the microprocessor would also bypass any errors in the minicomputer 8080 emulator. The only problem was that the 8080 microcomputer Gary Kildall wanted to take home didn't exist.

What did exist was the Intellec-8, an Intel product that could be used (sort of) to program an 8080 processor. The Intellec-8 had a microprocessor, some memory, and a port for attaching a Teletype 33 terminal. There was no software and no method for storing data and programs outside of main memory.

The primary difference between the Intellec-8 and a microcomputer was external data storage and the software to control it. IBM had invented a new device, called a floppy disk, to replace punched cards for its minicomputers. The disks themselves could be removed from the drive mechanism, were eight inches in diameter, and held the equivalent of thousands of pages of data. Priced at around $500, the floppy disk drive was perfect for Kildall's external storage device. Kildall, who didn't have $500, convinced Shugart Associates, a floppy disk drive maker, to give him a worn-out floppy drive used in its 10,000-hour torture test. While his friend John Torode invented a controller to link the Intellec-8 and the floppy disk drive, Kildall used the 8080 emulator on the Intel time-sharing system to develop his operating system, called CP/M, or Control Program/Monitor.

If a computer acquires a personality, it does so from its operating system. Users interact with the operating system, which interacts with the computer. The operating system controls the flow of data between a computer and its long-term storage system. It also controls access to system memory and keeps those bits of data that are thrashing around the microprocessor from thrashing into each other. Operating systems usually store data in files, which have individual names and characteristics and can be called up as a program or the user requires them.

Gary Kildall developed CP/M on a DEC PDP-10 minicomputer running the TOPS-10 operating system. Not surprisingly, most CP/M commands and file naming conventions look and operate like their TOPS-10 counterparts. It wasn't pretty, but it did the job.

By the time he'd finished writing the operating system, Intel didn't want CP/M and had even lost interest in Kildall's PL/M language. The only customers for CP/M in 1975 were a maker of intelligent terminals and Lawrence Livermore Labs, which used CP/M to monitor programs on its Octopus network.

In 1976, Kildall was approached by Imsai, the second personal computer company with a stupid name. Imsai manufactured an early 8080-based microcomputer that competed with the Altair. In typical early microcomputer company fashion, Imsai had sold floppy disk drives to many of its customers, promising to send along an operating system eventually. With each of them now holding at least $1,000 worth of hardware that was only gathering dust, the customers wanted their operating system, and CP/M was the only operating system for Intel-based computers that was actually available.

By the time Imsai came along, Kildall and Torode had adapted CP/M to four different floppy disk controllers. There were probably 100 little companies talking about doing 8080-based computers, and neither man wanted to invest the endless hours of tedious coding required to adapt CP/M to each of these new platforms. So they split the parts of CP/M that interfaced with each new controller into a separate computer code module, called the Basic Input/Output System, or BIOS. With all the hardware-dependent parts of CP/M concentrated in the BIOS, it became a relatively easy job to adapt the operating system to many different Intel-based microcomputers by modifying just the BIOS.
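
That split is what programmers would now call a hardware abstraction layer: a small, machine-specific interface underneath a machine-independent core. A modern-flavored sketch of the idea follows; the class and method names are invented for illustration and are not actual CP/M source.

# Sketch of the CP/M-style split described above: the OS core talks only to a
# small BIOS interface, and each machine supplies its own implementation.
from abc import ABC, abstractmethod

class BIOS(ABC):
    """Hardware-dependent layer: one implementation per machine."""
    @abstractmethod
    def read_sector(self, track: int, sector: int) -> bytes: ...
    @abstractmethod
    def console_write(self, text: str) -> None: ...

class ImsaiBIOS(BIOS):
    """Hypothetical BIOS for one particular machine."""
    def read_sector(self, track, sector):
        return bytes(128)              # stand-in for talking to a real disk controller
    def console_write(self, text):
        print(text, end="")            # stand-in for driving a real console port

class OperatingSystem:
    """Hardware-independent core: identical on every machine it runs on."""
    def __init__(self, bios: BIOS):
        self.bios = bios
    def show_sector_size(self, track: int) -> None:
        data = self.bios.read_sector(track, sector=1)
        self.bios.console_write(f"{len(data)} bytes read from track {track}\n")

# Porting the OS to a new machine means writing a new BIOS subclass, nothing more.
OperatingSystem(ImsaiBIOS()).show_sector_size(track=0)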

With his CP/M and invention of the BIOS, Gary Kildall defined the microcomputer. Peek into any personal computer today, and you'll find a general-purpose operating system adapted to specific hardware through the use of a BIOS, which is now a specialized type of memory chip.

In the six years after Imsai offered the first CP/M computer, more than 500,000 CP/M computers were sold by dozens of makers. Programmers began to write CP/M applications, relying on the operating system's features to control the keyboard, screen, and data storage. This base of applications turned CP/M into a de facto standard among microcomputer operating systems, guaranteeing its long-term success. Kildall started a company called Intergalactic Digital Research (later, just Digital Research) to sell the software in volume to computer makers and direct to users for $70 per copy. He made millions of dollars, essentially without trying.

Before he knew it, Gary Kildall had plenty of money, fast cars, a couple of airplanes, and a business that made increasing demands on his time. His success, while not unwelcome, was unexpected, which also meant that it was unplanned for. Success brings with it a whole new set of problems, as Gary Kildall discovered. You can plan for failure, but how do you plan for success?

Every entrepreneur has an objective, which, once achieved, leads to a crisis. In Gary Kildall's case, the objective (just to write CP/M, not even to sell it) was very low, so the crisis came quickly. He was a code god, a programmer who literally saw lines of code fully formed in his mind and then committed them effortlessly to the keyboard in much the same way that Mozart wrote music. He was one with the machine; what did he need with seventy employees?

"Gary didn't give a shit about the business. He was more interested in getting laid," said Gordon Eubanks, a former student of Kildall who led development of computer languages at Digital Research. "So much went so well for so long that he couldn't imagine it would change. When it did, when change was forced upon him, Gary didn't know how to handle it."

"Gary and Dorothy [Kildall's wife and a Digital Research vice-president] had arrogance and cockiness but no passion for products. No one wanted to make the products great. Dan Bricklin [another PC software pioneer; read on] sent a document saying what should be fixed in CP/M, but it was ignored. Then I urged Gary to do a BASIC language to bundle with CP/M, but when we finally got him to do a language, he insisted on PL/I, a virtually unmarketable language."

Digital Research was slow in developing a language business to go with its operating systems. It was also slow in updating its core operating system and extending it into the new world of 16-bit microprocessors that came along after 1980. The company in those days was run like a little kingdom, ruled by Gary and Dorothy Kildall.

"In one board meeting," recalled a former Digital Research executive, "we were talking about whether to grant stock options to a woman employee. Dorothy said, 'No, she doesn't deserve options: she's not professional enough; her kids visit her at work after 5:00 p.m.' Two minutes later, Christy Kildall, their daughter, burst into the boardroom and dragged Gary off with her to the stable to ride horses, ending the meeting. Oh yeah, Dorothy knew about professionalism."

Let's say for a minute that Eubanks was correct, and Gary Kildall didn't give a shit about the business. Who said that he had to? CP/M was his invention; Digital Research was his company. The fact that it succeeded beyond anyone's expectations did not make those earlier expectations invalid. Gary Kildall's ambition was limited, something that is not supposed to be a factor in American business. If you hope for a thousand and get a million, you are still expected to want more, but he didn't.

It's easy for authors of business books to get rankled by characters like Gary Kildall who don't take good care of the empires they have built. But in fact, there are no absolute rules of behavior for companies like Digital Research. The business world is, like computers, created entirely by people. God didn't come down and say there will be a corporation and it will have a board of directors. We made that up. Gary Kildall made up Digital Research.

Eubanks, who came to Digital Research after a naval career spent aboard submarines, hated Kildall's apparent lack of discipline, not understanding that it was just a different kind of discipline. Kildall was into programming, not business.

"Programming is very much a religious experience for a lot of people," Kildall explained. "If you talk about programming to a group of programmers who use the same language, they can become almost evangelistic about the language. They form a tight-knit community, hold to certain beliefs, and follow certain rules in their programming. It's like a church with a programming language for a bible."

Gary Kildall's bible said that writing a BASIC compiler to go with CP/M might be a shrewd business move, but it would be a step backward technically. Kildall wanted to break new ground, and a BASIC had already been done by Microsoft.

"The unstated rule around Digital Research was that Microsoft did languages, while we did operating systems," Eubanks explained. "It was never stated emphatically, but I always thought that Gary assumed he had an agreement with Bill Gates about this separation and that as long as we didn't compete with Microsoft, they wouldn't compete with us."

Sure.


**********


The Altair 8800 may have been the first microcomputer, but it was not a commercial success. The problem was that assembly took from forty to an infinite number of hours, depending on the hobbyist's mechanical ability. When the kit was done, the microcomputer either worked or didn't. If it worked, the owner had a programmable computer with a BASIC interpreter, ready to run any software he felt like writing.

The first microcomputer that was a major commercial success was the Apple II. It succeeded because it was the first microcomputer that looked like a consumer electronic product. You could buy the Apple from a dealer who would fix it if it broke and would give you at least a little help in learning to operate the beast. The Apple II had a floppy disk drive for data storage, did not require a separate Teletype or video terminal, and offered color graphics in addition to text. Most important, you could buy software written by others that would run on the Apple and with which a novice could do real work.

The Apple II still defines what a low-end computer is like. Twenty-third-century archaeologists excavating some ancient ComputerLand stockroom will see no significant functional difference between an Apple II of 1978 and an IBM PS/2 of 1992. Both have processor, memory, storage, and video graphics. Sure, the PS/2 has a faster processor, more memory and storage, and higher-resolution graphics, but that only matters to us today. By the twenty-third century, both machines will seem equally primitive.

The Apple II was guided by three spirits. Steve Wozniak invented the earlier Apple I to show it off to his friends in the Homebrew Computer Club. Steve Jobs was Wozniak's younger sidekick who came up with the idea of building computers for sale and generally nagged Woz and others until the Apple II was working to his satisfaction. Mike Markkula was the semiretired Intel veteran (and one of Noyce's boys) who brought the money and status required for the other two to be taken at all seriously.

Wozniak made the Apple II a simple machine that used clever hardware tricks to get good performance at a smallish price (at least to produce; the retail price of a fully outfitted Apple II was around $3,000). He found a way to allow the microprocessor and the video display to share the same memory. His floppy disk controller, developed during a two-week period in December 1977, used less than a quarter of the number of integrated circuits required by other controllers at the time. The Apple's floppy disk controller made it clearly superior to machines appearing about the same time from Commodore and Radio Shack. More so than probably any other microcomputer, the Apple II was the invention of a single person; even Apple's original BASIC interpreter, which was always available in read-only memory, had been written by Woz.

Woz made the Apple II a color machine to prove that he could do it and so he could use the computer to play a color version of Breakout, a video game that he and Jobs had designed for Atari. Markkula, whose main contributions at Intel had been in finance, pushed development of the floppy disk drive so the computer could be used to run accounting programs and store resulting financial data for small business owners. Each man saw the Apple II as a new way of fulfilling an established need: to replace a video game for Woz and a mainframe for Markkula. This followed the trend that new media tend to imitate old media.

Radio began as vaudeville over the air, while early television was radio with pictures. For most users (though not for Woz) the microcomputer was a small mainframe, which explained why Apple's first application for the machine was an accounting package and the first application supplied by a third-party developer was a database, both perfect products for a mainframe substitute. But the Apple II wasn't a very good mainframe replacement. The fact is that new inventions often have to find uses of their own in order to find commercial success, and this was true for the Apple II, which became successful strictly as a spreadsheet machine, a function that none of its inventors visualized.

At $3,000 for a fully configured system, the Apple II did not have a big future as a home machine. Old-timers like to reminisce about the early days of Apple when the company's computers were affordable, but the truth is that they never were.

The Apple II found its eventual home in business, answering the prayers of all those middle managers who had not been able to gain access to the company's mainframe or who were tired of waiting the six weeks it took for the computer department to prepare a report, dragging the answers to simple business questions from corporate data. Instead, they quickly learned to use a spreadsheet program called VisiCalc, which was available at first only on the Apple II.

VisiCalc was a compelling application: an application so important that it alone justified the computer purchase. Such an application was the last element required to turn the microcomputer from a hobbyist's toy into a business machine. No matter how powerful and brilliantly designed, no computer can be successful without a compelling application. To the people who bought them, mainframes were really inventory machines or accounting machines, and minicomputers were office automation machines. The Apple II was a VisiCalc machine.

VisiCalc was a whole new thing, an application that had not appeared before on some other platform. There were no minicomputer or mainframe spreadsheet programs that could be downsized to run on a microcomputer. The microcomputer and the spreadsheet came along at the same time. They were made for each other.

VisiCalc came about because its inventor, Dan Bricklin, went to business school. And Bricklin went to business school because he thought that his career as a programmer was about to end; it was becoming so easy to write programs that Bricklin was convinced there would eventually be no need for programmers at all, and he would be out of a job. So in the fall of 1977, 26 years old and worried about being washed up, he entered the Harvard Business School looking toward a new career.

At Harvard, Bricklin had an advantage over other students. He could whip up BASIC programs on the Harvard time-sharing system that would perform financial calculations. The problem with Bricklin's programs was that they had to be written and rewritten for each new problem. He began to look for a more general way of doing these calculations in a format that would be flexible.

What Bricklin really wanted was not a microcomputer program at all but a specialized piece of hardware: a kind of very advanced calculator with a heads-up display similar to the weapons system controls on an F-14 fighter. Like Luke Skywalker jumping into the turret of the Millennium Falcon, Bricklin saw himself blasting out financials, locking onto profit and loss numbers that would appear suspended in space before him. It was to be a business tool cum video game, a Saturday Night Special for M.B.A.s, only the hardware technology didn't exist in those days to make it happen.

Back in the semireal world of the Harvard Business School, Bricklin's production professor described large blackboards that were used in some companies for production planning. These blackboards, often so long that they spanned several rooms, were segmented in a matrix of rows and columns. The production planners would fill each space with chalk scribbles relating to the time, materials, manpower, and money needed to manufacture a product. Each cell on the blackboard was located in both a column and a row, so each had a two-dimensional address. Some cells were related to others, so if the number of workers listed in cell C-3 was increased, it meant that the amount of total wages in cell D-5 had to be increased proportionally, as did the total number of items produced, listed in cell F-7. Changing the value in one cell required the recalculation of values in all other linked cells, which took a lot of erasing and a lot of recalculating and left the planners constantly worried that they had overlooked recalculating a linked value, making their overall conclusions incorrect.

Given that Bricklin's Luke Skywalker approach was out of the question, the blackboard metaphor made a good structure for Bricklin's financial calculator, with a video screen replacing the physical blackboard. Once data and formulas were introduced by the user into each cell, changing one variable would automatically cause all the other cells to be recalculated and changed too. No linked cells could be forgotten. The video screen would show a window on a spreadsheet that was actually held in computer memory. The virtual spreadsheet inside the box could be almost any size, putting on a desk what had once taken whole rooms filled with blackboards. Once the spreadsheet was set up, answering a what-if question like "How much more money will we make if we raise the price of each widget by a dime?" would take only seconds.
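
A toy sketch in Python makes the mechanism plain. This is not VisiCalc's actual design, and the wage and production figures below are invented; it borrows only the cell addresses from the blackboard story to show how a formula cell recalculates itself whenever the cells it depends on change.

    # Illustrative only: each cell holds a number or a formula over other cells.
    class Sheet:
        def __init__(self):
            self.values = {}     # cells holding plain numbers
            self.formulas = {}   # cells computed from other cells

        def set_value(self, cell, number):
            self.values[cell] = number

        def set_formula(self, cell, func):
            self.formulas[cell] = func   # func takes the sheet, returns a number

        def get(self, cell):
            if cell in self.formulas:             # recomputed on demand, so no
                return self.formulas[cell](self)  # linked cell is ever forgotten
            return self.values[cell]

    sheet = Sheet()
    sheet.set_value("C3", 10)                                     # workers
    sheet.set_value("B1", 400)                                    # weekly wage each
    sheet.set_formula("D5", lambda s: s.get("C3") * s.get("B1"))  # total wages
    sheet.set_formula("F7", lambda s: s.get("C3") * 25)           # items produced

    print(sheet.get("D5"), sheet.get("F7"))   # 4000 250
    sheet.set_value("C3", 12)                 # the what-if: add two workers
    print(sheet.get("D5"), sheet.get("F7"))   # 4800 300, recalculated automatically

No erasing, no overlooked linked value: change one cell and every dependent cell comes back up to date the next time it is read.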

His production professor loved the idea, as did Bricklin's accounting professor. Bricklin's finance professor, who had others to do his computing for him, said there were already financial analysis programs running on mainframes, so the world did not need Dan Bricklin's little program. Only the world did need Dan Bricklin's little program, which still didn't have a name.

It's not surprising that VisiCalc grew out of a business school experience because it was the business schools that were producing most of the future VisiCalc users. They were the thousands of M.B.A.s who were coming into the workplace trained in analytical business techniques and, even more important, in typing. They had the skills and the motivation but usually not the access to their company computer. They were the first generation of businesspeople who could do it all by themselves, given the proper tools.

Bricklin cobbled up a demonstration version of his idea over a weekend. It was written in BASIC, was slow, and had only enough rows and columns to fill a single screen, but it demonstrated many of the basic functions of the spreadsheet. For one thing, it just sat there. This is the genius of the spreadsheet; it's event driven. Unless the user changes a cell, nothing happens. This may not seem like much, but being event driven makes a spreadsheet totally responsive to the user; it puts the user in charge in a way that most other programs did not. VisiCalc was a spreadsheet language, and what the users were doing was rudimentary programming, without the anxiety of knowing that's what it was.

By the time Bricklin had his demonstration program running, it was early 1978 and the mass market for microcomputers, such as it was, was being vied for by the Apple II, Commodore PET, and the Radio Shack TRS-80. Since he had no experience with micros, and so no preference for any particular machine, Bricklin and Bob Frankston, his old friend from MIT and new partner, developed VisiCalc for the Apple II, strictly because that was the computer their would-be publisher loaned them in the fall of 1978. No technical merit was involved in the decision.

Dan Fylstra was the publisher. He had graduated from Harvard Business School a year or two before and was trying to make a living selling microcomputer chess programs from his home. Fylstra's Personal Software was the archetypal microcomputer application software company. Bill Gates at Microsoft and Gary Kildall at Digital Research were specializing in operating systems and languages, products that were lumped together under the label of systems software, and were mainly sold to hardware manufacturers rather than directly to users. But Fylstra was selling applications direct to retailers and end users, often one program at a time. With no clear example to follow, he had to make most of the mistakes himself, and did.

Since there was no obvious success story to emulate, no retail software company that had already stumbled across the rules for making money, Fylstra dusted off his Harvard case study technique and looked for similar industries whose rules could be adapted to the microcomputer software biz. About the closest example he could find was book publishing, where the author accepts responsibility for designing and implementing the product, and the publisher is responsible for manufacturing, distribution, marketing, and sales. Transferred to the microcomputer arena, this meant that Software Arts, the company Bricklin and Frankston formed, would develop VisiCalc and its subsequent versions, while Personal Software, Fylstra's company, would copy the floppy disks, print the manuals, place ads in computer publications, and distribute the product to retailers and the public. Software Arts would receive a royalty of 37.5 percent on copies of VisiCalc sold at retail and 50 percent for copies sold wholesale. "The numbers seemed fair at the time," Fylstra said.

Bricklin was still in school, so he and Frankston divided their efforts in a way that would become a standard for microcomputer programming projects. Bricklin designed the program, while Frankston wrote the actual code. Bricklin would say, "This is the way the program is supposed to look, these are the features, and this is the way it should function," but the actual design of the internal program was left up to Bob Frankston, who had been writing software since 1963 and was clearly up to the task. Frankston added a few features on his own, including one called "lookup," which could extract values from a table, so he could use VisiCalc to do his taxes.
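
For readers who have never met a spreadsheet lookup, here is a rough illustration of what such a function does; it is not Frankston's code, and the tax brackets are invented for the example. The idea is to scan a table of (threshold, value) rows and return the value belonging to the largest threshold that does not exceed the key, which is exactly the shape of a tax table.

    # Illustrative sketch of a table lookup, with made-up bracket numbers.
    def lookup(key, table):
        result = None
        for threshold, value in sorted(table):
            if threshold <= key:
                result = value          # keep the last row not exceeding the key
        return result

    tax_rate_table = [(0, 0.10), (10_000, 0.15), (40_000, 0.28)]
    print(lookup(25_000, tax_rate_table))   # 0.15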

Bob Frankston is a gentle man and a brilliant programmer who lives in a world that is just slightly out of sync with the world in which you and I live. (Okay, so it's out of sync with the world in which you live.) When I met him, Frankston was chief scientist at Lotus Development, the people who gave us the Lotus 1-2-3 spreadsheet. In a personal computer hardware or software company, being named chief scientist means that the boss doesn't know what to do with you. Chief scientists don't generally have to do anything; they're just smart people whom the company doesn't want to lose to a competitor. So they get a title and an office and are obliged to represent the glorious past at all company functions. At Apple Computer, they call them Apple Fellows, because you can't have more than one chief scientist.

Bob Frankston, a modified nerd (he combined the requisite flannel shirt with a full beard), seemed not to notice that his role of chief scientist was a sham, because to him it wasn't; it was the perfect opportunity to look inward and think deep thoughts without regard to their marketability.

"Why are you doing this as a book?" Frankston asked me over breakfast one morning in Newton, Massachusetts. By "this," he meant the book you have in your hands right now, the major literary work of my career and, I hope, the basis of an important American fortune. "Why not do it as a hypertext file that people could just browse through on their computers?"

I will not be browsed through. The essence of writing books is the author's right to tell the story in his own words and in the order he chooses. Hypertext, which allows an instant accounting of how many times the words Dynamic Random-Access Memory or fuck appear, completely eliminates what I perceive as my value-added, turns this exercise into something like the Yellow Pages, and totally eliminates the prospect that it will help fund my retirement.

"Oh," said Frankston, with eyebrows raised. "Okay."

Meanwhile, back in 1979, Bricklin and Frankston developed the first version of VisiCalc on an Apple II emulator running on a minicomputer, just as Microsoft BASIC and CP/M had been written. Money was tight, so Frankston worked at night, when computer time was cheaper and when the time-sharing system responded faster because there were fewer users.

They thought that the whole job would take a month, but it took close to a year to finish. During this time, Fylstra was showing prerelease versions of the product to the first few software retailers and to computer companies like Apple and Atari. Atari was interested but did not yet have a computer to sell. Apple?s reaction to the product was lukewarm.

VisiCalc hit the market in October 1979, selling for $100. The first 100 copies went to Marv Goldschmitt's computer store in Bedford, Massachusetts, where Dan Bricklin appeared regularly to give demonstrations to bewildered customers. Sales were slow. Nothing like this product had existed before, so it would be a mistake to blame the early microcomputer users for not realizing they were seeing the future when they stared at their first VisiCalc screen.

Nearly every software developer in those days believed that small businesspeople would be the main users of any financial products they'd develop. Markkula's beloved accounting system, for example, would be used by small retailers and manufacturers who could not afford access to a time-sharing system and preferred not to farm the job out to an accounting service. Bricklin's spreadsheet would be used by these same small businesspeople to prepare budgets and forecast business trends. Automation was supposed to come to the small business community through the microcomputer just as it had come to the large and medium businesses through mainframes and minicomputers. But it didn't work that way.

The problem with the small business market was that small businesses weren't, for the most part, very businesslike. Most small businesspeople didn't know what they were doing. Accounting was clearly beyond them.

At the time, sales to hobbyists and would-be computer game players were topping out, and small businesses weren't buying. Apple and most of its competitors were in real trouble. The personal computer revolution looked as if it might last only five years. But then VisiCalc sales began to kick in.

Among the many customers who watched VisiCalc demos at Marv Goldschmitt's computer store were a few businesspeople, rare members of both the set of computer enthusiasts and the economic establishment. Many of these people had bought Apple IIs, hoping to do real work until they attempted to come to terms with the computer's forty-column display and lack of lowercase letters. In VisiCalc, they found an application that did not care about lowercase letters, and since the program used a view through the screen on a larger, virtual spreadsheet, the forty-column limit was less of one. For $100, they took a chance, carried the program home, then eventually took both the program and the computer it ran on with them to work. The true market for the Apple II turned out to be big business, and it was through the efforts of enthusiast employees, not Apple marketers, that the Apple II invaded industry.

"The beautiful thing about the spreadsheet was that customers in big business were really smart and understood the benefits right away," said Trip Hawkins, who was in charge of small business strategy at Apple. "I visited Westinghouse in Pittsburgh. The company had decided that Apple II technology wasn't suitable, but 1,000 Apple IIs had somehow arrived in the corporate headquarters, bought with petty cash funds and popularized by the office intelligentsia."

Hawkins was among the first to realize that the spreadsheet was a new form of computer life and that VisiCalc, the only spreadsheet on the market and available at first only on the Apple II, would be Apple's tool for entering, maybe dominating, the microcomputer market for medium and large corporations. VisiCalc was a strategic asset and one that had to be tied up fast before Bricklin and Frankston moved it onto other platforms like the Radio Shack TRS-80.

"When I brought the first copies of VisiCalc into Apple, it was clear to me that this was an important application, vital to the success of the Apple II," Hawkins said. "We didn't want it to appear on the Radio Shack or on the IBM machine we knew was coming, so I took Dan Fylstra to lunch and talked about a buyout. The price we settled on would have been $1 million worth of Apple stock, which would have been worth much more later. But when I took the deal to Markkula for approval, he said, 'No, it's too expensive.'"

A million dollars was an important value point in the early microcomputer software business. Every programmer who bothered to think about money at all looked toward the time when he would sell out for a cool million. Apple could have used ownership of the program to dominate business microcomputing for years. The deal would have been good, too, for Dan Fylstra, who so recently had been selling chess programs out of his apartment. Except that Dan Fylstra didn't own VisiCalc; Dan Bricklin and Bob Frankston did. The deal came and went without the boys in Massachusetts even being told.


Source: http://www.cringely.com/accidental-empires-chapter-4?utm_source=rss&utm_medium=rss&utm_campaign=accidental-empires-chapter-4-amateur-hour
