Sunday, August 31, 2008

Minneapolis Sculpture Garden

The Spoonbridge and Cherry sculpture by Claes Oldenburg and Coosje Van Bruggen sits in the Minneapolis Sculpture Garden near the Walker Art Museum. (Ann Hermes/The Christian Science Monitor) Copyright © 2008 The Christian Science Monitor. All rights reserved.

How to get virtually UNLIMITED Online Space for Free? (preview)

KNOwledge Within Life Is priZeless. is a technology-oriented blog with news, information and articles about the Internet and computers

Tuesday, August 19, 2008
Posted by: Anurag Bansal at 8:20 PM

In one of my previous posts, I mentioned a couple of options for sharing files with family and friends. Out of that list, some of the options let you upload any kind of file, and your files remain on their servers forever.

I am listing some of the options I tried myself and would definitely recommend to anyone interested. For the complete review and the restrictions, read my previous post on the same topic.

Please click here to continue reading this article.

Thank you to Anurag Bansal for permission to post his research and for the valuable content on his blog!

Why Clouds

Mystical lessons within the clouds.

by Rabbi Boruch Leff

When was the last time you asked a question with childlike wonder, like "Why is the sky blue?" If it's been a while, let's try one now.

Why did God create a world with clouds?

Scientifically, we could answer that clouds consist of water that has evaporated into vapor and then condensed into tiny droplets. When these droplets combine and become heavy enough, they fall as rain. As meteorologist Jeff Pardo describes it, clouds "help regulate the earth's energy balance, by reflecting and scattering solar radiation or absorbing the earth's radiated infrared energy. Clouds maintain the earth's atmospheric stability because clouds form when air rises and cools. When a blob of air goes up into an area of less pressure, it cools. When it reaches its dew point temperature, the rising parcel is no longer unsaturated. Water begins to condense, and it then rains."

This is why the Talmud states that God never withholds clouds from the world -- they are required constantly for the world's existence. (Taanit 3b)

But there are deeper mystical lessons contained within the clouds. The verse states "God covers the heavens with clouds and prepares rain for the earth" (Psalms 147:8).

The onset of heavy, dark clouds may appear somewhat menacing, but it is really nothing of the sort. God brings the clouds and fills them with rainwater, bringing tremendous blessing to the world. The lesson is clear. God often sends us worries and troubles, but in the end we come to understand that the purpose of the ordeals was to carry us to great achievements.

In a deeper vein, Kabbalists suggest that there are times when God brings clouds to block the sunlight for ecological purposes. Similarly, there are times when God presents blockades to spiritual success for certain purposes. If a person stares directly into the sun, his eyesight is temporarily impaired, and prolonged exposure would lead to vision damage. This is why we cannot tolerate staring into the sun.

We experience this in the spiritual realm as well. There are times when jumping to great spiritual heights too quickly is damaging to our growth. One who takes on too much too soon can easily burn out and, in the end, regress.

Stable growth needs to happen gradually. Taking on too much, too fast, most often doesn't last. This is one of the reasons God redeemed the Jewish people from Egypt in stages; releasing oneself from an idolatrous Egyptian culture cannot be done overnight.

Spiritual growth requires patience and consistency. Remember, it doesn't matter how high up you are on the spiritual ladder, as long as you are moving up.

Perhaps this is why God blocks the sun with clouds. Ultimately the clouds' purpose is to produce the blessing of rain; in spiritual terms, the clouds remind us to aim for permanent, steady growth, one step at a time.

Clouds act as a type of barrier, letting us know that there are levels that are presently beyond us and that we shouldn't leap to heights we're not yet ready for. But when we do climb the ladder and finally reach the clouds, we see that they have no real substance to them. You can fly right through them! Clouds are a mirage; they are not real obstacles.

The message is clear. Once we start to grow spiritually and embark on the path toward heaven, we should not be intimidated by the obstacles, the clouds that lie before us. They are only an illusion. Just keep soaring and you'll pass right through them.
Site contents copyright © 1995 - 2008 Aish HaTorah

Saturday, August 30, 2008

TreeMark™ Tree Rating - A fresh perspective on green computing

Electronics & IT Grimoire:
Sharing some of the ingredients & spells from my Grimoire …

Where did the Computer Go? Computing in the Cloud.

Published on WSO2 Oxygen Tank

by Ayanthi Anandagoda

Ayanthi Anandagoda is a Senior Content Specialist at WSO2. ayanthi at wso2 dot com



© 2008 WSO2 Inc.

Students in Brazil get a new Classmate

By Ina Fried
Staff writer, CNET News
August 29, 2008, 4:00 a.m. PDT

Editors' note: This is part of a series exploring computing in Latin America.

CAMPINAS, Brazil--A math teacher gives a class of eighth-graders their assignment and tells them to get to work.

The students grab their bags and fan out across the campus, enjoying the sunny autumn day. Sitting in groups of three and four, some at tables and some on the ground, the students work on the day's lesson. None of the students are using books or writing on paper. Instead, in each student's hands is a small blue-and-white computer that acts as both textbook and notebook.

Click here to read all of the blogs in The Borders of Computing series.

The computers are Intel's Classmate PC, and each one of the students at the Bradesco Foundation school here has one to use each day. As the largest one-to-one computing project in Latin America, it's being closely watched. School officials say there is more at stake than the reputation of the Classmate PC, however.

"We have to tread carefully," said Vice Principal Tania Maria Gebin de Carvalhao. "You can't have a recall of students and say 'wait, we did it wrong, come back.'"

The stakes are also high for the technology companies involved. With so many headlines in the U.S. focused on Nicholas Negroponte's One Laptop Per Child project, Intel and Microsoft hope to show not only the power of giving laptops to students, but also that they too have a product in this area. (Microsoft, more recently, has started working with OLPC as well.)

One of the keys is knowing when to use computers and when not to use them. In chemistry, for example, it's important that students have the hands-on experience they get by mixing chemicals in test tubes.

Eighth-grade math students work at Intel Classmate PCs in an outdoor classroom at the Bradesco Foundation school in Campinas, Brazil.

"The lab is still good," Gebin de Carvalhao said, but the computers have also come in handy, such as if a teacher wants to demonstrate an explosive reaction.
"Sometimes for safety reasons, it's better not to do it in the lab."

Although it is traditional paintings and not PowerPoint illustrations that hang on her wall, art teacher Elaine Barreiros has also found the computers to be a valuable addition to her classroom.

On this day, she has about two-thirds of the class researching the dress of different ethnic groups while a third of the kids have set their laptops aside and are carving wax sculptures based on their research.

Elaine shrugs off the notion that computers might get in the way. Pointing to the current project, she notes that many of the students will never have the opportunity to travel even across all of Brazil. "This is the best resource we have," she said, pointing to a laptop. "They can travel the world."

In teaching geometry class, Paulo Cesar Mucci uses an electronic whiteboard to show how a protractor works, noting that the technology makes it possible to see every degree, something that wouldn't be the case if he had to hold up a protractor or draw one by hand.

Unlike in some other one-to-one programs, the Bradesco students don't get to take the laptops home each night.

There are two main reasons for this. First of all, when this group of students heads home in the afternoon, the laptops' day is just beginning. Students in Brazil only go to school for four hours a day, meaning the school is able to offer three shifts of classes: morning, afternoon, and night. As a result, the laptops can do triple duty, even with each student having his or her own laptop throughout the day.

Even if they had more laptops, they still wouldn't send them home, though. Administrators would be worried about the laptops making it back to school. It's not that they think the students would mistreat or misplace the laptops.

"They might get mugged," said the school's principal. Because of where the students live, "it's still not safe."
Copyright ©2008 CNET Networks, Inc., a CBS Company. All rights reserved.

Friday, August 29, 2008

Eyes turn to dawn of 'visual computing'

SAN JOSE, California (AFP) — Lifelike graphics are breaking free of elite computer games and spreading throughout society in what industry insiders proclaim is the dawning of a "visual computing era."

Astronauts, film makers and celebrities joined software savants, engineers and gamers in the heart of Silicon Valley this week for a first-ever NVision conference devoted to computer imagery advances changing the way people and machines interact.

"Visual computing is transforming the videogame industry; transforming the film industry, and has all kinds of potential for how we view real-time television," NVIDIA co-founder Jen-Hsun Huang told those gathered at the event.

"We solve some of the most challenging problems for more and more companies around the world. Let the era of visual computing begin."

Gamers dueled for three days in a cavernous room in the San Jose Convention Center while entrepreneurs showed how graphics breakthroughs are shining in other fields.

Car makers are exploring letting potential buyers not only customize automobiles with graphics software but go on virtual test drives.

Graphics processing underpins financial modeling and weather forecasting.

Israel-based Optitex demonstrated software that replicates fabrics so realistically that clothing designers can see what fashions will look and act like on people before garments are made.

Optitex's animation software is being eyed by Hollywood film makers.

Dassault Systemes puts 3D computer-assisted design to work virtually constructing passenger jets, buildings and more.

"Three-D should be a new way for us to dream and design the future of our world," the French company's chief executive Bernard Charles said at NVision.

"It will impact everything we do: education, science, talking to each other ... of course games."

He predicts that lifelike graphics combined with feedback from online communities will let people influence how products are designed, sold and even how "green" they are.

Charles maintains computer simulations will be so realistic that virtual activities will mirror physical experiences.

Simulators already play an important part in training for space shuttle missions, according to former US astronaut Eileen Collins, the first woman shuttle commander.

"When you fly the actual mission you feel like you are in a simulator," Collins said. "We really can't do our job without the good visual graphics that we get."

The world of visual computing is "inescapable," said Chris Malachowsky, a co-founder of NVIDIA, a California firm renowned for high-end graphics processing cards for computers.

"We are being presented with displays everywhere," Malachowsky told AFP. "It used to be about the computing part, but the emphasis is shifting. It is not so much about the computation but how it is presented and seen by people."

The rising tide of digital videos, photos, films and television shows on the Internet is lifting the status of graphics chips, cards, and software and strengthening a trend to "unflatten" displays with 3D imagery.

Malachowsky spoke of using visual computing power to develop new medicines or provide doctors with real-time 3D images of patients' organs.

"They will be able to recreate scan data so fast you could see your own heart beating," Malachowsky said.

"This is being subsidized by all these kids out there playing games."

Perceptive Pixel founder Jeff Han, referred to by some as "the father of touch-screen computing," maintains that graphics opens up user interface possibilities that could render the mouse obsolete.

Han demonstrated touch-screen technology that lets several people simultaneously manipulate applications and files on a single large monitor.

"It's not personal computing anymore," Han said. "It's visual computing."

Battlestar Galactica bombshell Tricia Helfer praised computer animation innovations that enable the science fiction television series to rivet viewers.

Helfer plays a part-machine, part-organic Cylon character called "Number Six" that has turned on its creators.

"It's a bit threatening," Helfer said of technology promising to one day make animated characters indistinguishable from real actors.

"But the advantages and uses of it are amazing, but it is something we are going to have to get used to."

Hosted by Google
Copyright © 2008 AFP. All rights reserved.

The Key Advantage of Cloud Computing is Portability

The key advantage of cloud computing isn't performance or scalability – it is portability

No Blueprint Yet For Private Clouds

Posted by Roger Smith, Aug 28, 2008 05:28 PM

Many people don't like the concept of "private clouds," including my colleague John Foley and Sam Johnston ("The case against 'private clouds' "), since by definition cloud computing involves letting people plug into shared IT services in data centers that aren't their own. As oxymorons go, though, private cloud computing doesn't strike me as particularly egregious: I would probably rank it halfway between 'green data center' and 'business intelligence' on my own (admittedly moronic) oxymoron scale.

I discussed the cloud computing ecosystem earlier this month with Sam Charrington, VP of product marketing and management for Appistry, a maker of middleware that helps applications run smoothly in a cloud environment, after his LinuxWorld Expo Cloud Computing session. Charrington's view of the future of cloud computing includes Google (NSDQ: GOOG)-like public clouds as platforms for applications; virtual private clouds, which are third-party clouds, or segments of the public cloud with additional features for security, compliance, etc. (for HIPAA medical-record compliance or SOX accounting standards, for example); as well as private or internal clouds, which are an extension of virtualization and are used primarily for their capital or operational efficiencies. His formulation makes sense to me, since it dovetails with my view of cloud computing as a natural evolution of the grid/utility computing model: the delivery of storage, computation and other computing resources as metered services, much as traditional public utilities deliver electricity.

A private cloud, by analogy, is computing capacity produced "off-the-grid," much as some homeowners produce electrical power with renewable energy sources such as rooftop solar arrays or windmills, and therefore have the option of using it themselves, selling it back to a centralized grid, or allocating it to anyone they choose. Ultimately, anyone with a data center will conceivably be able to provide cloud services, as long as those services conform to a set of cloud infrastructure standards, most of which have yet to be defined.

Sam Johnston ("Cloud standards: not so fast...") is one of many who say cloud standardization efforts are premature. Johnston points to the market-driven ecosystem that has sprung up overnight around Amazon (NSDQ: AMZN) Web Services as an example of what kind of cloud standards are needed, namely "simple, rugged, market tested interfaces defined by the innovators in each area (virtualization, storage, services, etc.)." I tend to agree that much of the cloud standardization effort at the moment seems to be putting the cart before the horse, although I'm intrigued by the possibility of leveraging some of the work that's been done on the new Open Virtualization Format (OVF), created by the Distributed Management Task Force (DMTF) standards organization. OVF is a platform-independent, efficient, extensible, open packaging and distribution format for virtual machines that supports virtualization packaging, distribution, installation, and management, all within an archive or tar file such as "myApp.ova," which can include a digital signature for security.

OVF can be compared roughly to the MP3 digital music format that is used to encapsulate music information. A vendor-neutral packaging format, it allows virtual machines, or sets of virtual machines, to be installed on any platform, including public, virtual private, and private clouds. OVF is based on the DMTF's Common Information Model (CIM), which would make a good starting point for a cloud API. (If you're not familiar with CIM, it's an open standard that defines how managed elements in an IT environment are represented as a common set of objects and the relationships between them. This is intended to allow consistent management of these elements, independent of their manufacturer or provider.) This seems like as good a blueprint as any for cloud standardization, although I would like some safeguards to assure the separation of public, virtual private, and private clouds. Off-grid private clouds should be able to be autonomous in much the same way that off-grid homes can generate electrical power on-site with renewable energy sources such as solar or wind; with a generator and adequate fuel reserves; or simply do without, as in Amish communities.
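To make the packaging idea concrete, here is a minimal sketch of how an .ova archive is just a tar file bundling a descriptor with its disk images. The file names and the stub descriptor below are hypothetical illustrations, not taken from the OVF specification.

```python
import io
import tarfile

def build_ova(ova_path, members):
    """Bundle a descriptor and disk images into one .ova tar archive.

    `members` maps archive file names to bytes content.
    """
    with tarfile.open(ova_path, "w") as tar:
        for name, data in members.items():
            info = tarfile.TarInfo(name=name)
            info.size = len(data)
            tar.addfile(info, io.BytesIO(data))

# Hypothetical contents: a stub descriptor plus a placeholder disk image.
descriptor = b"<Envelope><!-- VM metadata would go here --></Envelope>"
disk = b"\x00" * 1024  # stand-in for a real virtual disk
build_ova("myApp.ova", {"myApp.ovf": descriptor, "myApp-disk1.vmdk": disk})

with tarfile.open("myApp.ova") as tar:
    print(tar.getnames())  # ['myApp.ovf', 'myApp-disk1.vmdk']
```

A real descriptor would, of course, carry the CIM-based hardware description and optionally a manifest and signature alongside it in the same archive.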

Copyright © 2008 United Business Media LLC, All rights reserved.

Thursday, August 28, 2008

Bill Gates spills Microsoft cloud computing strategy

Storage Optimization
Posted in Uncategorized by storageoptimization on August 28, 2008

Image: PC Magazine

Good old billg has something to say in his “exit interview” about storage in the cloud in this week’s PC Magazine. In essence, his view is that computing and storage will move to the cloud at different rates, and that storage is the more logical thing to move first. Your local storage (presumably on your Windows PC, in Mr. Gates’ worldview) will be a cache of a subset of the master data held in the cloud.

Moving data into the cloud makes a lot of sense: it makes that data available to computers everywhere, and it centralizes management of data for backups, geo-replication, and hardware refresh, with economies of scale that an average user or company could not manage or afford.

I agree with Bill on a couple of these points. I think storage will move to the cloud faster than compute, both because people already understand the idea of storage networks and because their storage is already on a network somewhere; in the cloud, that network pipe is just a bit longer. Compute is something people are used to having close at hand, either on their desktop or in their own data center.

For storage in the cloud to really take off, though, I think storage optimization is an absolutely key ingredient. As regular readers of this blog know, when I say storage optimization I mean a combination of content-aware technologies that drastically reduce the size of data (content-aware compression, subfile deduplication, and logical compaction, to name a few). There are two places where storage optimization can help cloud storage take off. One is at the customer end of the pipe: if you can shrink your data before you send it to the cloud, you’ll use less bandwidth getting it there and getting it back, and since most “clouds” charge you per gigabyte, you’ll pay less too.
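The customer-end idea can be sketched in a few lines, using plain zlib as a stand-in for the content-aware compression described above (the function names are my own illustration, not a product API):

```python
import zlib

def shrink_for_upload(data: bytes) -> bytes:
    # Compress locally before sending: less bandwidth used, fewer gigabytes billed.
    return zlib.compress(data, 9)

def restore_after_download(blob: bytes) -> bytes:
    return zlib.decompress(blob)

# Highly redundant input, like much enterprise data.
document = b"quarterly report " * 10_000
blob = shrink_for_upload(document)
print(len(document), len(blob))  # the compressed blob is a tiny fraction of the original
assert restore_after_download(blob) == document
```

Real content-aware systems go well beyond generic compression, but the economics are the same: every byte removed before the upload is a byte you neither transfer nor pay to store.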

The other place is in the data center of the storage cloud provider. That’s going to be a very competitive marketplace, and the cloud vendors that can charge you the least amount per gigabyte – to store, to transfer, to replicate – are going to have the competitive advantage. So the cloud vendors that do the best job of integrating storage optimization into the cloud in a transparent way will have the edge. And the cloud is a great place to get that edge.

Think of deduplication, for example. If you deduplicate songs just in your own house, well, you may only have one copy of each song. Why would you have ten copies of a Britney Spears song? (I might ask why you would have any at all … but we’ll leave that for another time.) However, if 5 million people store their data at a cloud storage provider, how many copies of that hit song might end up there? Does the cloud provider need to store 5 million copies of the same thing? No. If they do, they are being very inefficient. A song is a simple example, but even with enterprise data, the more data you have, the more likely it is that you’ll find patterns, correlations, duplicates, or data relationships that can be exploited for better compression. So the cloud offers opportunities for efficiency that don’t exist at each little pool of local storage on your hard drive today.
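The song example boils down to a content-addressed store: hash each blob and keep one physical copy no matter how many users upload it. This toy version is my own illustration of the principle, not Ocarina’s technology:

```python
import hashlib

class DedupStore:
    """Content-addressed store: identical blobs are kept once,
    however many users upload them."""

    def __init__(self):
        self.blobs = {}    # digest -> bytes, each unique blob stored once
        self.catalog = {}  # (user, filename) -> digest

    def put(self, user, filename, data):
        digest = hashlib.sha256(data).hexdigest()
        self.blobs.setdefault(digest, data)      # no-op if already stored
        self.catalog[(user, filename)] = digest

    def get(self, user, filename):
        return self.blobs[self.catalog[(user, filename)]]

store = DedupStore()
song = b"...pretend this is 3 MB of the same hit single..."
for user in range(5):                            # five users upload the same file
    store.put(user, "hit_single.mp3", song)
print(len(store.catalog), len(store.blobs))      # 5 logical copies, 1 physical copy
```

Subfile deduplication applies the same trick at the level of chunks within files, which is where the enterprise-data wins come from.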

To me, storage optimization and the move of storage to the cloud make a perfect match.

Storage Optimization aims to provide an objective look at the fast-changing world of storage. The blog was started by Carter George, co-founder of Ocarina Networks, and provides regular commentary, including guest posts from industry leaders, customers and influencers, on how storage innovations are helping to shape the future of business.

'One Laptop' Falls Short Of Education Goals

by Cyrus Farivar

Morning Edition, August 27, 2008

Listen Now [3 min 14 sec]

One Laptop Per Child made an ambitious promise to children in the third world. The project has had trouble with its leadership, finances and competitors. Instead of leaving a legacy of education for third-world children, the One Laptop Per Child program has spurred an industry in low-cost laptops for consumers.

Transcript:
STEVE INSKEEP, host: Poor people in developing countries have been the focus of ambitious projects, like this one aimed at bridging the world's digital divide. A non-profit called One Laptop Per Child made a promise three years ago to provide $100 computers to millions of children. The group has achieved only a fraction of that goal. But as Cyrus Farivar reports, One Laptop Per Child has still made its mark on the global computer industry.

CYRUS FARIVAR: There's only...

NPR (National Public Radio) is an internationally acclaimed producer and distributor of noncommercial news, talk, and entertainment programming.
Copyright 2008 NPR

A holistic approach to green IT is essential

Businesses must look at the whole picture when implementing green strategies
Written by David Tebbutt

Computing, 24 Jul 2008

Green computing is an admirable objective – but because the strategy focuses on computing itself, it often fails to consider the wider environmental problems that face the world.

Even labelling the green computing issue as “environmental” does not really do the trick. The better approach is to start with “sustainable development” and work back from there.

According to the 1987 Brundtland Report, also known as Our Common Future, sustainable development “meets the needs of the present without compromising the ability of future generations to meet their own needs”.

The report took an international view of sustainability and applied it to employment, food, energy, water and sanitation. We sometimes forget that if our computer equipment is made in China, its water and ground pollution largely stays there, while the gaseous emissions are shared with the whole world.

The only truly environmental way of accounting for our choices is to look at the lifecycle impact of what we buy and throw away. That includes everything from raw materials through construction, packaging, delivery, use and disposal.

Anything less than this is just playing at being green.

Of course, such an approach is also inconvenient. It is very difficult to discover the carbon footprint of what we buy – it is much easier to find out how much energy something uses and base our calculations on that.

As such, you rarely hear IT vendors talking about embedded carbon and other pollution in the products they try to convince IT managers to buy. And it is why governments are so keen to focus on pricing carbon and setting targets.

Some 5,000 British companies will be obliged to sign up for the Carbon Reduction Commitment in 2010, which will make businesses measure their carbon emissions to see how they compare year on year. League tables will be published and some firms will be rewarded with bonuses, while others will be punished with fines.

The commitment will focus minds. But companies will analyse their emissions – including those they inherit with their energy supplies – and may decide to offshore polluting elements of their work. The result will be a nice, clean UK operation and no reprimands for embedded carbon. As a result, the company in question might be obeying the letter of the law – but hardly its spirit.

So, what will drive companies to do the right thing? Money and regulation are top of the list. Brand value, corporate social responsibility and public relations are all closely intertwined.

Many companies need to be seen to be environmentally sensitive and if you work for such a firm, life will be relatively easy. These businesses concentrate on working with management and examine every corner of the business to discover opportunities to be greener.

However, if money and regulations drive the firm, every green decision needs to be costed. If regulations are involved, penalties are usually not far away.

As such, the equation still boils down to money – and you are wasting your time trying to appeal to the firm’s better nature. Sooner or later, though, the company will come under customer or shareholder pressure to act greener, especially if it is part of a supply chain to more committed businesses.

Concentrate on the areas that will make a difference but require little or no investment: switch off desktops at night, print fewer documents, turn off power chargers when they are not in use, turn off lights if no one is around.

Extend the period between machine replacements and find ways to reuse retired equipment. The Met Office, for example, sends its end-of-life computers to other measuring stations around the world.

Your next stage should be to look at virtualisation, which can help cut the amount of equipment you need or allow you to take on more work without buying new PCs. You might also be able to optimise your datacentre cooling without massive upheaval. And going even further, large companies can benefit from consolidating multiple datacentres, which can reduce electricity use and other associated costs.
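A back-of-the-envelope calculation shows why consolidation pays. All of the figures below are illustrative assumptions, not numbers from the column:

```python
# Illustrative assumptions: 100 lightly loaded servers at 400 W each,
# consolidated ten-to-one onto virtualised hosts, running around the clock.
servers_before = 100
watts_per_server = 400
consolidation_ratio = 10
hours_per_year = 24 * 365

servers_after = servers_before // consolidation_ratio
kwh_saved = (servers_before - servers_after) * watts_per_server * hours_per_year / 1000
print(f"{kwh_saved:,.0f} kWh saved per year")  # 315,360 kWh saved per year
```

Even before counting the cooling load avoided, figures on this scale are why the money argument and the green argument often point the same way.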

But IT has a much bigger role to play in helping a company achieve good environmental performance.

Technology can help reduce energy use and other pollution in areas which are beyond IT’s remit.

Publishing reference materials online – catalogues, manuals, directories and so on – saves on print, transport and packaging. Videoconferencing can save time and money – and the stresses of work life can be diminished, too.

Home working can also help employees dodge the commute and, depending on how many people do it and for how many days, can also reduce office size and the associated costs.

In a recent survey, Freeform Dynamics discovered that only 28 per cent of IT departments actually know how much energy they use. And, no doubt, the figure would be even smaller if the question were asked with regard to the total IT estate.

If you belong to such a company, perhaps the most effective starting point would be to obtain a copy of its energy bills. That might lead to more granular metering, so that you can see where to apply new energy-saving measures.

David Tebbutt is programme director at analyst Freeform Dynamics. Read the blog at:

Computing provides insight for IT leaders.

Computing and are published in the UK by Incisive Media.

© Incisive Media Ltd. 2008

Akhter Computers Launches LoCO2 All-in-One Energy Star 4.0 PC

Another PC that joins the ranks of the recently released Tangent Evergreen 17 and Shuttle X27 energy-efficient PCs is the new Akhter LoCO2 PC. The LoCO2 claims to be the world's first Energy Star 4.0-approved all-in-one PC. It combines a 19-inch LCD panel, an Intel Core 2 Duo processor, and a hard disk drive in a single unit. The LoCO2 consumes 55 watts of energy when in use, and 3 watts when put in Sleep Mode.

Despite the integrated components, the computer maintains a thin profile, measuring merely 85mm in depth. Optional features include a touch panel and 802.11b/g WiFi connectivity. The Akhter LoCO2 all-in-one PC has a starting price of approximately $1,078, depending on the configuration.
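Using the quoted 55 W / 3 W figures, the annual consumption works out as below. The eight-hours-active, sixteen-hours-sleeping duty cycle is my assumption for illustration, not Akhter's:

```python
# 55 W active and 3 W sleep are as quoted; the 8h/16h daily split is assumed.
active_watts, sleep_watts = 55, 3
daily_wh = active_watts * 8 + sleep_watts * 16   # 488 Wh per day
annual_kwh = daily_wh * 365 / 1000
print(f"about {annual_kwh:.0f} kWh per year")    # about 178 kWh per year
```

For comparison, a conventional desktop and monitor drawing a few hundred watts on the same schedule would use several times as much.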

Copyright © 1996-2008 Ziff Davis Publishing Holdings Inc. All Rights Reserved.

Wednesday, August 27, 2008

Cloud Computing Terminology

Thinking Out Cloud
Cloud Computing, Grids, Everything-as-a-Service and more
Geva Perry

While the debate over the actual definition of cloud computing rages on, it seems that a whole new cloud computing vocabulary is rapidly emerging. I thought I'd list some of the new terms I'm seeing, with brief definitions, examples of usage, and references to discussions related to these terms. Hope this is useful.

Cloudburst: The term cloudburst is being used with two meanings, one negative and one positive:

  1. Cloudburst (negative): The failure of a cloud computing environment due to the inability to handle a spike in demand.
    Reference: "The only way to do cloud computing efficiently is to share the cloud - to establish a broad, multitenant grid (or a number of them) that balances the loads of many different companies. Otherwise, it'll be one cloudburst after another, and a whole lot of underutilized capital assets." Source: Nicholas Carr: Intuit's cloudburst frustrates customers.
  2. Cloudburst (positive): The dynamic deployment of a software application that runs on internal organizational compute resources to a public cloud to address a spike in demand.
    Reference: "ISV virtual appliances should underpin a new surge in cloud use followed by self-service mechanisms and enterprise connectors enabling organizations to 'cloudburst' to using cloud services." Source: The 451 Group: RightScale rolls its on-ramp toward other cloud systems (subscription required)
    Related uses: Cloudbursting. Reference: "In addition to direct sales to enterprises, going forward it hopes that extending out from private clouds to public ones – what we like to call 'cloudbursting' – will become a prevailing IT weather pattern and provide it with additional opportunities." Source: The 451 Group: Q-Layer has the wisdom to enable private clouds (subscription required)

Cloudstorming: The act of connecting multiple cloud computing environments.
Reference: "...Zimory will be covering off the key cloudy marketplaces and activities: public cloud, internal cloud, cloudbursting (grow-over from internal to public clouds) and cloudstorming (connecting multiple clouds)." Source: The 451 Group: A Cloud for All Seasons

Vertical Cloud: A cloud computing environment optimized for use in a particular vertical -- i.e., industry -- or application use case.
Reference: "The verticalization of the cloud would provide marketing benefits, as Friedman notes, while also providing a possible means of addressing issues of information security crucial to industries such as health care and financial services." Source: Nicholas Carr: The vertical cloud

Private Cloud: A cloud computing-like environment within the boundaries of an organization and typically for its exclusive usage.
Reference: "It is these companies that have dramatically leveraged their internal and originally Private Cloud Computing infrastructures to significant economic benefit. " Source: Kent Langley: Private Cloud Computing: A Few Thoughts

Internal Cloud: A cloud computing-like environment within the boundaries of an organization and typically available for exclusive use by said organization.
Reference: "With Cloud Computing becoming more and more popular, large corporations are likely to set up their own clouds and integrate them with external clouds, like Amazon EC2." Source: Markus Klems: Internal Cloud

Hybrid Cloud: A computing environment that combines both private (internal) and public (external) cloud computing environments. The combination may be on a continuous basis or in the form of a 'cloudburst'.
Reference: "Microsoft would, no doubt, agree. Their "software plus services" approach similarly advocates a hybrid cloud/desktop environment." Source: Kendall Whitehouse: Kevin Lynch: Clearing the AIR

Cloudware: A general term referring to a variety of software, typically at the infrastructure level, that enables building, deploying, running or managing applications in a cloud computing environment.
Reference: "Go to Google Maps, Yahoo Mail, or MySpace — most of Web 2.0, in other words — and you're using cloudware." Source: Wired: Geekipedia - Cloudware

External Cloud: A cloud computing environment that is external to the boundaries of the organization. Although it often is, an external cloud is not necessarily a public cloud. Some external clouds make their cloud infrastructure available to specific other organizations and not to the public at-large.
Reference: "If an enterprise were to run an app in an external Cloud and wants to connect that to their systems of record in their own datacenters, they might want to consider the same platform in their data centers." Source: Bert Armijo: Pain in the aaSemantics

Public Cloud: A cloud computing environment that is open for use to the general public, whether individuals, corporations or other types of organizations. Amazon Web Services is an example of a public cloud.
Reference: Gerrit Huizenga: Um, Just who is managing your public cloud?

Cloud Provider: An organization that makes a cloud computing environment available to others, such as an external or public cloud.
Reference: "Some workloads, such as application testing and training, are prime candidates for early deployment to a cloud provider due to their transient nature and high Total Cost of Ownership (TCO)." Source: John Janakiraman: Deploying Your Existing Applications to the Cloud

Cloud Enabler: A general term that refers to organizations (typically vendors) who are not cloud providers per se, but make available technology, such as cloudware, that enables cloud computing.

Cloud-Oriented Architecture (COA): An architecture for IT infrastructure and software applications that is optimized for use in cloud computing environments. The term is not yet in wide use, and as is the case for the term "cloud computing" itself, there is no common or generally accepted definition or specific description of a cloud-oriented architecture.
Reference: James Urquhart: The Principles of Cloud Oriented Architecture

Cloud Service Architecture (CSA): A term coined by Jeff Barr, chief evangelist at Amazon Web Services. The term describes an architecture in which applications and application components act as services on the cloud, which serve other applications within the same cloud environment.
Reference: Jeff Barr: The Emerging Cloud Service Architecture

Virtual Private Cloud (VPC): A term coined by Reuven Cohen, CEO and founder of Enomaly. The term describes a concept that is similar to, and derived from, the familiar concept of a Virtual Private Network (VPN), but applied to cloud computing. It is the notion of turning a public cloud into a virtual private cloud, particularly in terms of security and the ability to create a VPC across components that are both within the cloud and external to it.
Reference: "A VPC is a method for partitioning a public computing utility such as EC2 into quarantined virtual infrastructure. A VPC may encapsulate multiple local and remote resources to appear as a single homogeneous computing environment bridging the ability to securely utilize remote resources as part of an seamless global compute infrastructure." Source: Reuven Cohen: Life in the Cloud: Virtual Private Cloud

Cloud Portability: The ability to move applications (and often their associated data) across cloud computing environments from different cloud providers, as well as between private or internal clouds and public or external clouds.
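One common way to pursue portability is to write application code against a thin, provider-neutral interface rather than any one vendor's API, so the provider can be swapped without touching application logic. A sketch in Python — the interface and the stand-in backend are invented for illustration, not a real cloud SDK:

```python
from abc import ABC, abstractmethod

class CloudStorage(ABC):
    """Provider-neutral storage interface; concrete subclasses wrap vendor APIs."""
    @abstractmethod
    def put(self, key: str, data: bytes) -> None: ...
    @abstractmethod
    def get(self, key: str) -> bytes: ...

class InMemoryStorage(CloudStorage):
    """Stand-in backend; a real port would wrap S3, an internal SAN, etc."""
    def __init__(self):
        self._objects = {}
    def put(self, key, data):
        self._objects[key] = data
    def get(self, key):
        return self._objects[key]

# Application code depends only on CloudStorage, so moving between an
# internal cloud and a public one means swapping the backend, not the app.
def save_report(store: CloudStorage) -> bytes:
    store.put("report.txt", b"quarterly numbers")
    return store.get("report.txt")
```

The open question in 2008 is whether providers will converge on common interfaces at all, which is exactly what makes portability a term worth tracking.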

Cloudsourcing: As defined by Dion Hinchcliffe: "Leveraging services in the network cloud to provide external computing capabilities, often to replace more expensive local IT capabilities. Cloudsourcing can theoretically provide significant economic benefits along with some attendant trade-offs. These trade-offs can include security and performance. The term "cloud" represents a set of external services on a 3rd party network, usually the Internet. The services can represent raw computing, storage, messaging, or more structured capabilities such as vertical and horizontal business applications, even community. These services are delivered over the network, but generally behave as if they are local." Read an overview of cloudsourcing by Dion Hinchcliffe.


Who Provides What in the Cloud

John Edwards, InfoWorld

Wednesday, August 27, 2008 1:00 PM PDT

The news that AT&T has joined the rapidly growing ranks of cloud computing providers reinforces the argument that the latest IT outsourcing model is well on its way to becoming a classic disruptive technology.

By enabling datacenter operators to "publish" computing resources -- such as servers, storage, and network connectivity -- cloud computing provides a pay-by-consumption scalable service that's usually free of long-term contracts and is typically application- and OS-independent. The approach also eliminates the need to install any on-site hardware or software.

Currently dominated by Amazon and several small startups, cloud computing is increasingly attracting the interest of industry giants, including Google, IBM, and now AT&T.

"Everyone and their dog will be in cloud computing next year," predicts Rebecca Wettemann, vice president of research at Nucleus Research, a technology research firm.

Yet James Staten, an infrastructure and operations analyst at Forrester Research, warns that prospective adopters need to tread carefully in a market that he describes as both immature and evolving. Staten notes that service offerings and service levels vary widely between cloud vendors.

"Shop around," he advises. "We're already seeing big differences in cloud offerings."

To help cut through the confusion, here's a rundown of some major cloud providers -- both current and planned -- all offering resources that go beyond basic services such as SaaS (software as a service) applications and Web hosting:

3Tera: Appliance-Driven Virtual Servers

3Tera's AppLogic is a grid engine that has evolved over time into a full-fledged cloud computing environment. The company says its offering is designed to enable datacenters to replace expensive and hard-to-integrate IT infrastructure -- such as firewalls, load balancers, servers, and SANs -- with virtual appliances. Each appliance runs in its own virtual environment.

AppLogic combines servers into a scalable grid that's managed as a single system via a browser or secure shell. According to 3Tera, datacenters can add or remove servers on the fly, monitor hardware, manage user credentials, reboot servers, install software, build virtual appliances, back up the system, repair damaged storage volumes, inspect logs, and perform every other management task from a single point of control, all while the system is running.

Amazon: As-You-Need-Them Basic IT Resources

Amazon was an early cloud computing proponent, and the company now has one of the market's longest menus of services. Amazon's core cloud offering, the Elastic Compute Cloud (EC2), provides a virtualized cloud infrastructure that's designed to provide scalable compute, storage, and communication facilities.

Amazon's cloud computing arsenal also includes the Simple Storage Service (S3), a persistent storage system, as well as the Simple Database (SimpleDB), which provides a remotely accessible database, and the Simple Queuing Service (SQS), a message queue service that's also an agent for tying together distributed applications created by the EC2, S3, and SimpleDB combo.
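The role SQS plays — letting distributed components hand work to each other through a queue instead of calling each other directly — can be illustrated locally with Python's standard-library queue (SQS itself requires AWS credentials and network calls, so this is only a local analogue of the pattern, with hypothetical message contents):

```python
import queue

# A local stand-in for a cloud message queue such as SQS: the producer
# and consumer share no state except the queue itself, so they can be
# scaled, restarted, or relocated independently.
work_queue = queue.Queue()

def producer(messages):
    for m in messages:
        work_queue.put(m)  # roughly analogous to SQS SendMessage

def consumer():
    results = []
    while not work_queue.empty():
        results.append(work_queue.get())  # roughly analogous to ReceiveMessage
    return results

producer(["resize image 1", "resize image 2"])
print(consumer())
```

In the EC2/S3/SimpleDB combo, the payloads would typically be pointers to objects in S3 or records in SimpleDB rather than the data itself.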

AT&T: Scalable Hosting in a Managed Network

AT&T Synaptic Hosting aims to give datacenters the ability to manage applications, compute resources on servers, and stored data elastically, so they can scale up or down as needed. The hosted platform provides dynamic security and storage capabilities as well as a Web portal to manage capacity, conduct maintenance, and monitor network service and performance.

AT&T has long offered hosting services, but not ones that could scale up or down as needed. AT&T's resources and services run within its own network, rather than across datacenters linked via the public Internet, which the company claims provides more certainty over service levels.

Google: Resources for Small Businesses and Developers

Google already offers several cloud-based services, such as e-mail and storage, for consumers, as well as the App Engine development and provisioning platform for individual developers. The company's logical next step, given its vast infrastructure resources, would be a move into the enterprise cloud market.

"There's not that much difference between the enterprise cloud and the consumer cloud," Google CEO Eric Schmidt said last May during an appearance in Los Angeles with IBM chief Sam Palmisano, as the companies announced a joint cloud computing initiative.

Over the next year or so, Google and IBM plan to roll out a worldwide network of servers for a cloud computing infrastructure. The IBM-Google cloud runs on Linux-based machines using Xen virtualization and Apache Hadoop, an open source implementation of Google's MapReduce programming model and distributed file system. Provisioning is automatic, courtesy of IBM's Tivoli Provisioning Manager.
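Hadoop's programming model, MapReduce, pairs a map step that emits key/value pairs with a reduce step that aggregates them per key; a toy word count in plain Python (no Hadoop required) conveys the shape of the model:

```python
from collections import defaultdict

def map_phase(docs):
    """Emit (word, 1) pairs -- the mapper in MapReduce terms."""
    for doc in docs:
        for word in doc.split():
            yield word, 1

def reduce_phase(pairs):
    """Sum the counts for each key -- the reducer."""
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

docs = ["the cloud", "the grid and the cloud"]
print(reduce_phase(map_phase(docs)))
```

What Hadoop adds on top of this shape is the part that matters at cloud scale: partitioning the map and reduce work across many machines and handling their failures.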

IBM: A Platform for Your "Internal" Cloud

Aside from its Google venture, IBM is focusing its cloud strategy on "Blue Cloud," a series of cloud computing offerings that will enable computing across a distributed, globally accessible fabric of resources, rather than on local machines or remote server farms. Built on IBM's massive-scale computing initiatives, Blue Cloud aims to give datacenters the ability to establish their own cloud computing architecture to handle the enormous data-processing power required for video, social networking, and other Web 2.0 technologies.

Initially, the Blue Cloud technology must be deployed internally at each organization, essentially as the foundation for an "internal" cloud. The Blue Cloud platform, running on IBM BladeCenters with Power and x86 processors and Tivoli service management software, dynamically provisions and allocates resources as workloads fluctuate for an application. Blue Cloud is being billed as a more distributed computing architecture than typically found in most enterprise datacenters. It is based on Hadoop. Over time, IBM expects to offer Blue Cloud resources on demand, in the provisioned style of Amazon and AT&T.

IBM also provides hosting services for SaaS providers, including SAP and SuccessFactors.

Sun Microsystems: An On-Demand Grid, and Perhaps More

With its "the network is the computer" mantra, Sun provided much of the inspiration for the cloud computing movement. And its Sun Grid Engine was one of the first on-demand cloud offerings, providing access to compute and storage resources optimized for parallel-processing applications.

The company also has a research venture dubbed "Project Caroline" meant to provide a configurable pool of virtualized compute, storage, and networking resources to small and medium-size SaaS providers, so they don't need to develop their own infrastructure. There have been recent reports that Sun is planning to turn Project Caroline into a full-blown business, but there's been no official word from the company yet.

Terremark Worldwide: Resource Pool for On-Demand Servers

The Terremark Enterprise Cloud is designed to give datacenters an Internet-optimized computing infrastructure. Enterprise Cloud clients buy a dedicated resource pool of processing, memory, storage, and networking, from which they can deploy servers on demand. A Web portal allows servers to be dynamically provisioned from a pre-allocated pool of dedicated computing resources. Terremark promises that its cloud servers behave exactly like their physical counterparts, allowing applications to be run without modification.

XCalibre Communications: Self-Provisioned Virtual Servers

Described by some observers as Europe's answer to Amazon's EC2, Scotland-based XCalibre's FlexiScale provides self-provisioning of virtual dedicated servers via a control panel or API. Persistent storage is based on a fully virtualized high-end SAN/NAS back end.

Copyright 1998-2007, PC World Communications, Inc.

School officials suggest buying more online versions of books

Internet access may replace the purchase of texts, Martin says

By Lisa Boone-Wood Journal Reporter
Published: August 27, 2008

Winston-Salem/Forsyth County school officials told the school board last night that buying more online subscriptions for social-studies textbooks might be a necessary next step.

While evaluating a tight budget earlier this year, school officials decided to buy classroom sets of social-studies books for sixth- and seventh-graders, instead of buying books for each student to take home.

After realizing that not every book in the classroom set comes with access to an online version, school officials suggested buying an additional 82 online subscriptions, at a cost of about $5,000.

The additional subscriptions would be bought so more students can access the online textbooks outside of school.

Superintendent Don Martin said that every student has access to the books during the school day and can use online versions of the books and other learning tools if they have Internet access at home.

Students can also go to the more than 40 WinstonNet labs in local libraries and other locations to access the information, he said.

"It's kind of an experiment to see how that works," Martin said. "We will actually evaluate that at the end of the year. If it works well, we won't buy textbooks next year.

"I actually think the opportunity to not carry that big, old book back-and-forth and access the book online is interesting."

Martin told members of the school board last night that he has received several inquiries from parents about the online textbooks. "Everybody doesn't have a computer at home," he said, adding that another possibility in making the textbooks more readily available would be to use the old textbooks.

"We've had no one indicate that they want to go back to the old textbooks, but that is an option," he said.

■ Lisa Boone-Wood can be reached at 727-7232 or at
©2008 Media General Communications Holdings, LLC. A Media General company.

About CherryPal for Everyone (CP4Every1 or CPFE)

CP4Every1 is constantly crawling the web (on human hands and knees) to find unique information of value regarding green technology, cheap and reliable connectivity, personal, portable and sustainable industry developments, future and social/cultural transformative technology, political relevance and news that is NOT just another re-posting of the same press release pushed out by the industry.

Please note that all copyrights and links to original material are provided and respected. NO robots were used to post content.

Your comments are invited.




